Mar 18 06:46:53 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 18 06:46:54 crc restorecon[4753]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 18 06:46:54 crc restorecon[4753]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 
06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54
crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 
06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 
crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 18 06:46:54 crc restorecon[4753]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc 
restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 18 06:46:54 crc restorecon[4753]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 18 06:46:55 crc kubenswrapper[4917]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 06:46:55 crc kubenswrapper[4917]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 18 06:46:55 crc kubenswrapper[4917]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 06:46:55 crc kubenswrapper[4917]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 18 06:46:55 crc kubenswrapper[4917]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 18 06:46:55 crc kubenswrapper[4917]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.495306 4917 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504060 4917 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504092 4917 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504102 4917 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504111 4917 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504120 4917 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504128 4917 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504137 4917 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504146 4917 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504153 4917 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 
06:46:55.504162 4917 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504170 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504180 4917 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504190 4917 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504198 4917 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504220 4917 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504228 4917 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504236 4917 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504244 4917 feature_gate.go:330] unrecognized feature gate: Example Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504255 4917 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504266 4917 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504276 4917 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504285 4917 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504295 4917 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504304 4917 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504312 4917 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504320 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504328 4917 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504337 4917 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504345 4917 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504353 4917 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504361 4917 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504369 4917 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504377 4917 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504387 4917 feature_gate.go:330] 
unrecognized feature gate: MachineAPIMigration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504396 4917 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504403 4917 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504411 4917 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504419 4917 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504427 4917 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504435 4917 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504443 4917 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504450 4917 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504458 4917 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504467 4917 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504475 4917 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504483 4917 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504491 4917 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504499 4917 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 
06:46:55.504507 4917 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504514 4917 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504522 4917 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504530 4917 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504538 4917 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504548 4917 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504558 4917 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504567 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504577 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504614 4917 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504623 4917 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504632 4917 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504640 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504649 4917 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 06:46:55 crc kubenswrapper[4917]: 
W0318 06:46:55.504660 4917 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504670 4917 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504679 4917 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504687 4917 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504694 4917 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504702 4917 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504709 4917 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504717 4917 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.504725 4917 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505782 4917 flags.go:64] FLAG: --address="0.0.0.0" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505806 4917 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505823 4917 flags.go:64] FLAG: --anonymous-auth="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505835 4917 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505847 4917 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505856 4917 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 18 06:46:55 crc 
kubenswrapper[4917]: I0318 06:46:55.505868 4917 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505880 4917 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505889 4917 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505898 4917 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505907 4917 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505917 4917 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505926 4917 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505935 4917 flags.go:64] FLAG: --cgroup-root="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505944 4917 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505953 4917 flags.go:64] FLAG: --client-ca-file="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505962 4917 flags.go:64] FLAG: --cloud-config="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505971 4917 flags.go:64] FLAG: --cloud-provider="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505980 4917 flags.go:64] FLAG: --cluster-dns="[]" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505990 4917 flags.go:64] FLAG: --cluster-domain="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.505999 4917 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506008 4917 flags.go:64] FLAG: --config-dir="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506017 4917 flags.go:64] FLAG: 
--container-hints="/etc/cadvisor/container_hints.json" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506026 4917 flags.go:64] FLAG: --container-log-max-files="5" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506037 4917 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506046 4917 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506055 4917 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506064 4917 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506073 4917 flags.go:64] FLAG: --contention-profiling="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506081 4917 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506090 4917 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506100 4917 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506108 4917 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506119 4917 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506129 4917 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506138 4917 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506146 4917 flags.go:64] FLAG: --enable-load-reader="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506155 4917 flags.go:64] FLAG: --enable-server="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506164 4917 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 18 
06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506175 4917 flags.go:64] FLAG: --event-burst="100" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506185 4917 flags.go:64] FLAG: --event-qps="50" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506193 4917 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506202 4917 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506213 4917 flags.go:64] FLAG: --eviction-hard="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506225 4917 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506234 4917 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506243 4917 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506252 4917 flags.go:64] FLAG: --eviction-soft="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506261 4917 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506270 4917 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506279 4917 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506288 4917 flags.go:64] FLAG: --experimental-mounter-path="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506296 4917 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506305 4917 flags.go:64] FLAG: --fail-swap-on="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506314 4917 flags.go:64] FLAG: --feature-gates="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506325 4917 flags.go:64] FLAG: --file-check-frequency="20s" Mar 18 06:46:55 crc 
kubenswrapper[4917]: I0318 06:46:55.506334 4917 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506343 4917 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506352 4917 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506361 4917 flags.go:64] FLAG: --healthz-port="10248" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506370 4917 flags.go:64] FLAG: --help="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506379 4917 flags.go:64] FLAG: --hostname-override="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506389 4917 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506397 4917 flags.go:64] FLAG: --http-check-frequency="20s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506407 4917 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506416 4917 flags.go:64] FLAG: --image-credential-provider-config="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506425 4917 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506434 4917 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506443 4917 flags.go:64] FLAG: --image-service-endpoint="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506451 4917 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506460 4917 flags.go:64] FLAG: --kube-api-burst="100" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506469 4917 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506478 4917 flags.go:64] FLAG: --kube-api-qps="50" Mar 
18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506487 4917 flags.go:64] FLAG: --kube-reserved="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506496 4917 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506504 4917 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506514 4917 flags.go:64] FLAG: --kubelet-cgroups="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506523 4917 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506532 4917 flags.go:64] FLAG: --lock-file="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506548 4917 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506557 4917 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506566 4917 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506615 4917 flags.go:64] FLAG: --log-json-split-stream="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506625 4917 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506634 4917 flags.go:64] FLAG: --log-text-split-stream="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506642 4917 flags.go:64] FLAG: --logging-format="text" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506651 4917 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506661 4917 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506670 4917 flags.go:64] FLAG: --manifest-url="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506679 4917 flags.go:64] FLAG: --manifest-url-header="" Mar 18 
06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506691 4917 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506700 4917 flags.go:64] FLAG: --max-open-files="1000000" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506712 4917 flags.go:64] FLAG: --max-pods="110" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506721 4917 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506730 4917 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506739 4917 flags.go:64] FLAG: --memory-manager-policy="None" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506750 4917 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506759 4917 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506769 4917 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506778 4917 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506797 4917 flags.go:64] FLAG: --node-status-max-images="50" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506806 4917 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506815 4917 flags.go:64] FLAG: --oom-score-adj="-999" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506824 4917 flags.go:64] FLAG: --pod-cidr="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506833 4917 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 18 06:46:55 crc kubenswrapper[4917]: 
I0318 06:46:55.506845 4917 flags.go:64] FLAG: --pod-manifest-path="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506853 4917 flags.go:64] FLAG: --pod-max-pids="-1" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506863 4917 flags.go:64] FLAG: --pods-per-core="0" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506872 4917 flags.go:64] FLAG: --port="10250" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506881 4917 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506889 4917 flags.go:64] FLAG: --provider-id="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506898 4917 flags.go:64] FLAG: --qos-reserved="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506907 4917 flags.go:64] FLAG: --read-only-port="10255" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506916 4917 flags.go:64] FLAG: --register-node="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506926 4917 flags.go:64] FLAG: --register-schedulable="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506936 4917 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506951 4917 flags.go:64] FLAG: --registry-burst="10" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506959 4917 flags.go:64] FLAG: --registry-qps="5" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506968 4917 flags.go:64] FLAG: --reserved-cpus="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506977 4917 flags.go:64] FLAG: --reserved-memory="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506988 4917 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.506997 4917 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507006 4917 flags.go:64] FLAG: --rotate-certificates="false" Mar 18 
06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507015 4917 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507023 4917 flags.go:64] FLAG: --runonce="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507032 4917 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507041 4917 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507050 4917 flags.go:64] FLAG: --seccomp-default="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507059 4917 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507068 4917 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507077 4917 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507086 4917 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507095 4917 flags.go:64] FLAG: --storage-driver-password="root" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507104 4917 flags.go:64] FLAG: --storage-driver-secure="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507113 4917 flags.go:64] FLAG: --storage-driver-table="stats" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507121 4917 flags.go:64] FLAG: --storage-driver-user="root" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507130 4917 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507140 4917 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507149 4917 flags.go:64] FLAG: --system-cgroups="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507157 4917 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507171 4917 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507180 4917 flags.go:64] FLAG: --tls-cert-file="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507188 4917 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507199 4917 flags.go:64] FLAG: --tls-min-version="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507208 4917 flags.go:64] FLAG: --tls-private-key-file="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507217 4917 flags.go:64] FLAG: --topology-manager-policy="none" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507225 4917 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507234 4917 flags.go:64] FLAG: --topology-manager-scope="container" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507242 4917 flags.go:64] FLAG: --v="2" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507254 4917 flags.go:64] FLAG: --version="false" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507265 4917 flags.go:64] FLAG: --vmodule="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507275 4917 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.507285 4917 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507494 4917 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507504 4917 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507513 4917 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 06:46:55 crc 
kubenswrapper[4917]: W0318 06:46:55.507522 4917 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507531 4917 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507539 4917 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507549 4917 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507558 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507567 4917 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507575 4917 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507607 4917 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507615 4917 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507624 4917 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507631 4917 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507639 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507647 4917 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507655 4917 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507663 4917 feature_gate.go:330] unrecognized 
feature gate: AdminNetworkPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507671 4917 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507681 4917 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507691 4917 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507699 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507707 4917 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507715 4917 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507723 4917 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507730 4917 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507738 4917 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507746 4917 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507754 4917 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507761 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507769 4917 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507777 4917 feature_gate.go:330] 
unrecognized feature gate: PinnedImages Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507786 4917 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507794 4917 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507803 4917 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507811 4917 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507821 4917 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507831 4917 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507840 4917 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507849 4917 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507858 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507867 4917 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507874 4917 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507919 4917 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507932 4917 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507944 4917 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507954 4917 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507967 4917 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507979 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.507991 4917 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508001 4917 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508010 4917 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508018 4917 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508026 4917 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508035 4917 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508043 4917 feature_gate.go:330] unrecognized feature gate: Example Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508052 4917 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508059 4917 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 
06:46:55.508067 4917 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508076 4917 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508085 4917 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508093 4917 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508101 4917 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508109 4917 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508124 4917 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508132 4917 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508139 4917 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508147 4917 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508154 4917 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508162 4917 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.508171 4917 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.508194 4917 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true 
MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.524895 4917 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.524938 4917 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525058 4917 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525069 4917 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525078 4917 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525087 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525096 4917 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525104 4917 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525112 4917 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525120 4917 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525128 4917 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525135 4917 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 06:46:55 crc 
kubenswrapper[4917]: W0318 06:46:55.525143 4917 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525151 4917 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525159 4917 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525166 4917 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525174 4917 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525181 4917 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525189 4917 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525197 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525204 4917 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525212 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525219 4917 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525227 4917 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525235 4917 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525242 4917 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 
06:46:55.525250 4917 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525258 4917 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525265 4917 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525276 4917 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525286 4917 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525295 4917 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525305 4917 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525315 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525324 4917 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525333 4917 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525343 4917 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525352 4917 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525360 4917 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525369 4917 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525378 4917 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525386 4917 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525395 4917 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525404 4917 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525413 4917 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525420 4917 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525428 4917 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525436 4917 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525444 4917 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525452 4917 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525459 4917 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525467 4917 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525475 4917 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525482 4917 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525490 4917 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525498 4917 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525508 4917 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525518 4917 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525527 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525535 4917 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525544 4917 feature_gate.go:330] unrecognized feature gate: Example Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525552 4917 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525560 4917 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525567 4917 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525576 4917 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525617 4917 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525627 4917 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525638 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525647 4917 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525655 4917 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525663 4917 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525671 4917 feature_gate.go:330] unrecognized feature 
gate: MixedCPUsAllocation Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525681 4917 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.525693 4917 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525913 4917 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525924 4917 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525933 4917 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525942 4917 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525950 4917 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525958 4917 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525967 4917 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525975 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.525984 4917 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 18 06:46:55 crc 
kubenswrapper[4917]: W0318 06:46:55.525992 4917 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526000 4917 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526008 4917 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526016 4917 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526024 4917 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526032 4917 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526040 4917 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526048 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526056 4917 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526094 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526105 4917 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526113 4917 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526122 4917 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526131 4917 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526139 4917 feature_gate.go:330] unrecognized 
feature gate: OnClusterBuild Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526147 4917 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526155 4917 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526164 4917 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526172 4917 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526180 4917 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526188 4917 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526196 4917 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526205 4917 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526213 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526221 4917 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526230 4917 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526239 4917 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526247 4917 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526255 4917 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 18 
06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526263 4917 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526270 4917 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526280 4917 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526291 4917 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526301 4917 feature_gate.go:330] unrecognized feature gate: Example Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526309 4917 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526317 4917 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526325 4917 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526333 4917 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526341 4917 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526348 4917 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526356 4917 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526364 4917 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526371 4917 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 18 06:46:55 crc 
kubenswrapper[4917]: W0318 06:46:55.526379 4917 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526387 4917 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526394 4917 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526404 4917 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526413 4917 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526422 4917 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526430 4917 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526438 4917 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526446 4917 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526454 4917 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526461 4917 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526472 4917 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526481 4917 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526489 4917 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526497 4917 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526506 4917 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526513 4917 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526521 4917 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.526530 4917 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.526541 4917 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.527616 4917 server.go:940] "Client rotation is on, will bootstrap in background" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.532146 4917 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.538364 4917 bootstrap.go:101] "Use the 
bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.538541 4917 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.540530 4917 server.go:997] "Starting client certificate rotation" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.540639 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.540901 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.567862 4917 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.570365 4917 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.570908 4917 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.592178 4917 log.go:25] "Validated CRI v1 runtime API" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.634927 4917 log.go:25] "Validated CRI v1 image API" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.638140 4917 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.643134 
4917 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-18-06-42-24-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.643190 4917 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.670891 4917 manager.go:217] Machine: {Timestamp:2026-03-18 06:46:55.66724526 +0000 UTC m=+0.608400044 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e77627ae-240f-4740-8125-ab1e275f53d4 BootID:a8eadbf6-d127-4d55-8d87-8302d8daa3be Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 
Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:4c:0d:6d Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:4c:0d:6d Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:7b:07:32 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:72:ad:94 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:2a:be:41 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:27:59:ed Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:fa:8d:59 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:6e:42:62:1f:70:37 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:22:49:07:e8:9d:20 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] 
Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.671274 
4917 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.671450 4917 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.673235 4917 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.673535 4917 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.673615 4917 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"mem
ory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.674114 4917 topology_manager.go:138] "Creating topology manager with none policy" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.674134 4917 container_manager_linux.go:303] "Creating device plugin manager" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.674915 4917 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.674974 4917 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.675269 4917 state_mem.go:36] "Initialized new in-memory state store" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.675412 4917 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.679106 4917 kubelet.go:418] "Attempting to sync node with API server" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.679144 4917 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.679181 4917 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.679202 4917 kubelet.go:324] "Adding apiserver pod source" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.679264 4917 
apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.684703 4917 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.684761 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.684870 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.684905 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.685028 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.686120 4917 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.687856 4917 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689752 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689796 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689811 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689826 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689849 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689863 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689877 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689901 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689916 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689931 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689981 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.689997 4917 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.691043 4917 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.691860 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.691907 4917 server.go:1280] "Started kubelet" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.692949 4917 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 18 06:46:55 crc systemd[1]: Started Kubernetes Kubelet. Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.693516 4917 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.698820 4917 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.699471 4917 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.699512 4917 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.700459 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.700692 4917 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.700717 4917 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.701007 4917 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.702075 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="200ms" Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.702328 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.702680 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.701340 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.184:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ddca1ca11f151 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,LastTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.705262 4917 server.go:460] "Adding debug handlers to kubelet server" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.713082 4917 factory.go:153] Registering CRI-O 
factory Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.713139 4917 factory.go:221] Registration of the crio container factory successfully Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.713243 4917 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.713259 4917 factory.go:55] Registering systemd factory Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.713272 4917 factory.go:221] Registration of the systemd container factory successfully Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.713317 4917 factory.go:103] Registering Raw factory Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.713339 4917 manager.go:1196] Started watching for new ooms in manager Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.716798 4917 manager.go:319] Starting recovery of all containers Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.730928 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731004 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731027 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" 
seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731113 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731137 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731156 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731175 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731197 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731241 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731260 4917 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731282 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731306 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731325 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731350 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731376 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731397 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731416 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731436 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731457 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731476 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731494 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731514 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731533 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731554 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731573 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731637 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731668 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731689 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731745 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731767 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731787 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731809 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731830 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731849 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 18 06:46:55 crc 
kubenswrapper[4917]: I0318 06:46:55.731870 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731890 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731910 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731931 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731954 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731976 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.731995 4917 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732015 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732036 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732056 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732121 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732144 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732164 4917 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732183 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732204 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732222 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732241 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732261 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732287 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732310 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732332 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732351 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732372 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732394 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732415 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732435 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732454 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732474 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732493 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732513 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732532 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" 
seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732554 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732574 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732633 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732656 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732680 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732700 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 
06:46:55.732719 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732738 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732757 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732775 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732794 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732815 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732873 4917 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732893 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732915 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732934 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732952 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732971 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.732990 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733010 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733029 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733050 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733068 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733088 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733107 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733126 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733156 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733176 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733195 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733214 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733233 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" 
volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733251 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733272 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733293 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733312 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733331 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733350 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733370 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733390 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733416 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733437 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733456 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733476 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" 
seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733498 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733518 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733543 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733564 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733615 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733643 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733667 4917 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733727 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733746 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733777 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733797 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733817 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733837 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733856 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733877 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733898 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733917 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733935 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733953 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733972 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.733991 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734009 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734030 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734049 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734068 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734086 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734106 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734125 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734145 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734162 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734180 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734199 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734218 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734238 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734255 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734275 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734293 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734317 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734335 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734353 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734373 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734392 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734417 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734435 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734453 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734473 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734493 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734512 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734530 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734550 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734568 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734623 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734650 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734672 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734691 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" 
seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734710 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734733 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734752 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734773 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734799 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734819 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 
06:46:55.734838 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734856 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734876 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734896 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734915 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734934 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734954 4917 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734974 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.734993 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735011 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735030 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735049 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735068 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735088 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735106 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735125 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735143 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735163 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735181 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735199 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735217 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735237 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735255 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735273 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735293 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 18 06:46:55 crc 
kubenswrapper[4917]: I0318 06:46:55.735311 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735333 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735351 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735371 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735391 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735409 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735428 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735445 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735464 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735483 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.735501 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.739529 4917 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.739647 4917 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.739668 4917 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.739681 4917 reconstruct.go:97] "Volume reconstruction finished" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.739690 4917 reconciler.go:26] "Reconciler: start to sync state" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.757917 4917 manager.go:324] Recovery completed Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.765776 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.767046 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.767079 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.767088 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.768458 4917 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.771259 4917 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.771334 4917 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.771386 4917 kubelet.go:2335] "Starting kubelet main sync loop" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.771632 4917 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.771660 4917 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.771695 4917 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.771731 4917 state_mem.go:36] "Initialized new in-memory state store" Mar 18 06:46:55 crc kubenswrapper[4917]: W0318 06:46:55.774453 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.774518 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.792086 4917 policy_none.go:49] "None policy: Start" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.793510 4917 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.793559 4917 state_mem.go:35] "Initializing new in-memory state store" Mar 18 06:46:55 crc 
kubenswrapper[4917]: E0318 06:46:55.802929 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.851252 4917 manager.go:334] "Starting Device Plugin manager" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.851315 4917 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.851336 4917 server.go:79] "Starting device plugin registration server" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.851990 4917 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.852019 4917 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.852240 4917 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.852478 4917 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.852513 4917 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.859199 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.872755 4917 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.872865 4917 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.874476 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.874567 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.874624 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.874984 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.875027 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.875073 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.876422 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.876454 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.876468 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.876901 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.876938 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc 
kubenswrapper[4917]: I0318 06:46:55.876990 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.877259 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.877628 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.877661 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.878963 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.878985 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.878997 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.879138 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.879641 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.879675 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.879962 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.880021 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.880038 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.880833 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.880910 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.880928 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.883208 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.883280 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.883299 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.883627 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: 
I0318 06:46:55.884269 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.884317 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.887703 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.887742 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.887745 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.887777 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.887794 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.887755 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.888069 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.888101 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.889459 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.889504 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.889529 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.903568 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="400ms" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944205 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944280 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944323 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944361 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944418 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944463 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944615 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944705 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944748 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944785 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944823 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944867 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944914 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.944950 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.945115 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.953039 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.954994 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.955108 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.955134 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:55 crc kubenswrapper[4917]: I0318 06:46:55.955195 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:46:55 crc kubenswrapper[4917]: E0318 06:46:55.956132 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.184:6443: connect: connection refused" node="crc" Mar 18 06:46:56 crc kubenswrapper[4917]: E0318 06:46:56.005816 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.184:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189ddca1ca11f151 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,LastTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047080 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047165 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047210 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047247 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047282 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047319 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047359 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047351 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047430 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047450 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047503 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047521 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047460 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047625 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047643 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047712 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047731 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047754 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047767 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047797 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047854 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047894 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047895 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047925 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047892 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047947 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.047995 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.048043 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.048094 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.048246 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.156438 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.158813 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.158872 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.158891 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.158942 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:46:56 crc kubenswrapper[4917]: E0318 06:46:56.159680 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" node="crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.207468 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.223756 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.233767 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.255685 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.262713 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:56 crc kubenswrapper[4917]: W0318 06:46:56.266026 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0fdb574e471f2ab0a7efd20ead7c73e26494723fc4f1a030edc1269667451144 WatchSource:0}: Error finding container 0fdb574e471f2ab0a7efd20ead7c73e26494723fc4f1a030edc1269667451144: Status 404 returned error can't find the container with id 0fdb574e471f2ab0a7efd20ead7c73e26494723fc4f1a030edc1269667451144 Mar 18 06:46:56 crc kubenswrapper[4917]: W0318 06:46:56.269788 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-615ca5b7300d34e2d7d3c85ae87dd777b058527db1e0a91f4d8372d76fc85080 WatchSource:0}: Error finding container 615ca5b7300d34e2d7d3c85ae87dd777b058527db1e0a91f4d8372d76fc85080: Status 404 returned error can't find the container with id 615ca5b7300d34e2d7d3c85ae87dd777b058527db1e0a91f4d8372d76fc85080 Mar 18 06:46:56 crc kubenswrapper[4917]: W0318 06:46:56.286655 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-448c4955145605aee1b3bc4fb0cbe1700bea70ef08dec4c6bc8901b06002508e WatchSource:0}: Error finding container 448c4955145605aee1b3bc4fb0cbe1700bea70ef08dec4c6bc8901b06002508e: Status 404 returned error can't find the container with id 448c4955145605aee1b3bc4fb0cbe1700bea70ef08dec4c6bc8901b06002508e Mar 18 06:46:56 crc kubenswrapper[4917]: W0318 06:46:56.290143 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2cab416a13d870382f6abe38fd96178764b4642e368016f90b88e39d1b47cf02 
WatchSource:0}: Error finding container 2cab416a13d870382f6abe38fd96178764b4642e368016f90b88e39d1b47cf02: Status 404 returned error can't find the container with id 2cab416a13d870382f6abe38fd96178764b4642e368016f90b88e39d1b47cf02 Mar 18 06:46:56 crc kubenswrapper[4917]: E0318 06:46:56.305304 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="800ms" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.560541 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.562934 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.563020 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.563040 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.563082 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:46:56 crc kubenswrapper[4917]: E0318 06:46:56.563782 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" node="crc" Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.693282 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 
06:46:56.778682 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"448c4955145605aee1b3bc4fb0cbe1700bea70ef08dec4c6bc8901b06002508e"} Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.782543 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8382137f388cbb6dfc73bf8b904f04c38b754b9cc3c25f04645ea52d1cd618ce"} Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.785374 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"615ca5b7300d34e2d7d3c85ae87dd777b058527db1e0a91f4d8372d76fc85080"} Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.786921 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0fdb574e471f2ab0a7efd20ead7c73e26494723fc4f1a030edc1269667451144"} Mar 18 06:46:56 crc kubenswrapper[4917]: I0318 06:46:56.790437 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2cab416a13d870382f6abe38fd96178764b4642e368016f90b88e39d1b47cf02"} Mar 18 06:46:56 crc kubenswrapper[4917]: W0318 06:46:56.853942 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:56 crc kubenswrapper[4917]: E0318 06:46:56.854031 4917 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:56 crc kubenswrapper[4917]: W0318 06:46:56.857798 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:56 crc kubenswrapper[4917]: E0318 06:46:56.857865 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:56 crc kubenswrapper[4917]: W0318 06:46:56.944718 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:56 crc kubenswrapper[4917]: E0318 06:46:56.944827 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:57 crc kubenswrapper[4917]: E0318 06:46:57.106436 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="1.6s" Mar 18 06:46:57 crc kubenswrapper[4917]: W0318 06:46:57.225869 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:57 crc kubenswrapper[4917]: E0318 06:46:57.226026 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.364427 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.367064 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.367428 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.367447 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.367487 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:46:57 crc kubenswrapper[4917]: E0318 06:46:57.368149 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" 
node="crc" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.629414 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 06:46:57 crc kubenswrapper[4917]: E0318 06:46:57.631669 4917 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.693745 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.795817 4917 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="e3fca13bc50b57804b493858634558ba5fff525e42664e73ffe5a2315abf5128" exitCode=0 Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.795874 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"e3fca13bc50b57804b493858634558ba5fff525e42664e73ffe5a2315abf5128"} Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.795997 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.797624 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.797655 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.797666 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.799320 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bd2123bfd7cab76d8585dec47334b5461aa674a8deaa2d8262b3168e63aa2158"} Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.799355 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3178418872d0dccec0135fd63950389720710368991e50a202dc241dbef7a1df"} Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.799371 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"77bb48c37e48b86055e70fba417ddf2721298d8809a29fad69a8eaaeb1af7298"} Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.802256 4917 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f8c81a32495a7c65c95ef15c439a77b444a1102cf3261b4731a07e810b856fd4" exitCode=0 Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.802335 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f8c81a32495a7c65c95ef15c439a77b444a1102cf3261b4731a07e810b856fd4"} Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.802450 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 
06:46:57.805481 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475" exitCode=0 Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.805567 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475"} Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.805838 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.807177 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.807207 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.807218 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.807437 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.807476 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.807494 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.808779 4917 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4d415076ca3f1e480b38b5742e442445d84efd8621de1b35cad1abe8521a17ee" exitCode=0 Mar 18 06:46:57 crc 
kubenswrapper[4917]: I0318 06:46:57.808859 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4d415076ca3f1e480b38b5742e442445d84efd8621de1b35cad1abe8521a17ee"} Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.809065 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.810504 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.810541 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.810635 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.811676 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.812784 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.812813 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:57 crc kubenswrapper[4917]: I0318 06:46:57.812824 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.692636 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:58 crc kubenswrapper[4917]: E0318 
06:46:58.707461 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="3.2s" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.813008 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.813050 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.813060 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.813071 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.815248 4917 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3431dcc6e0ee78fb1846121ec2f3a0792cc1cc5b90e0ad666e1781d672af8dfc" exitCode=0 Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.815293 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3431dcc6e0ee78fb1846121ec2f3a0792cc1cc5b90e0ad666e1781d672af8dfc"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.815397 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.817321 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.817341 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.817356 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.819910 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"4244c70a8653603a5ac748496f9c091474cb526c01055a2473eacb8a8aefb370"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.819982 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.820943 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.820966 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.820995 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.826363 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4a4f414721a104c5549443127943027ceb52954d2be9c16687d4bd6a351abe27"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.826534 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.828362 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.828393 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.828403 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.832979 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"02e1008b1effe01b6c8dcc4010d34b44815171f2a647343654f3e2ef43447062"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.833019 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2b31a25e3585d31d87e588ddad17ff7d16ad1875a195bcbfb7ae70406b09a935"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.833032 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"492d87b64ae90706eda986d5872d8fbde80ad40ff1a180b52fdded85c52d3fe6"} Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.833093 4917 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.835975 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.836007 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.836018 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.968345 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.969354 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.969380 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.969389 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:58 crc kubenswrapper[4917]: I0318 06:46:58.969410 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:46:58 crc kubenswrapper[4917]: E0318 06:46:58.969890 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.184:6443: connect: connection refused" node="crc" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.090219 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:59 crc kubenswrapper[4917]: W0318 06:46:59.100823 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:46:59 crc kubenswrapper[4917]: E0318 06:46:59.100934 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.184:6443: connect: connection refused" logger="UnhandledError" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.450353 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.700735 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.838754 4917 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="9d0ea1100ac98818aa1371415d2fc3cb26904a538e13d9345545c750373df197" exitCode=0 Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.838867 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"9d0ea1100ac98818aa1371415d2fc3cb26904a538e13d9345545c750373df197"} Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.838883 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.840055 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.840102 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.840122 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.842689 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"39e1d8dc6ae51c82c1a38f9508020cefd5ab93a836a298881a0d305cf5b75b54"} Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.842775 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.842845 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.842887 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.843008 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844203 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844242 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844252 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844876 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844899 4917 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844909 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844950 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844878 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.844997 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.845012 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.845022 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:46:59 crc kubenswrapper[4917]: I0318 06:46:59.845026 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.639896 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.850943 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.850488 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f9f29d1251350a556ed6b42dd8d585e78be5dff49e0407694e95a901aa58647c"} Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.851146 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"90155a5323220891abc6bdafe0d9ac61942e977a6e1173949cb3c682f3d03364"} Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.851214 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a3b311952d160166985f9f00b17f049e95842f8a01e1e8c9046805c00083baf6"} Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.851655 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.855141 4917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.855234 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.855307 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.855336 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.855349 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.855690 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.855715 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.855779 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 
06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.857185 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.857214 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:00 crc kubenswrapper[4917]: I0318 06:47:00.857226 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.861322 4917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.861916 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.862023 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.861751 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7ea597d60cdf0ae6e5c9954f8653fe9420f84e10b2420c9010f0c8ae2add2ff0"} Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.862122 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f7183844dcd53190cb8f4393b54f382cc5a88a22172066fda558859a313b4eaa"} Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.863633 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.863739 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.863765 4917 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.863995 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.864161 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.864301 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:01 crc kubenswrapper[4917]: I0318 06:47:01.913204 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.116271 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.116522 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.118307 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.118358 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.118375 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.125771 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.171059 4917 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.172543 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.172633 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.172655 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.172683 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.864818 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.864979 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.866685 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.866742 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.866764 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.867181 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.867241 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:02 crc kubenswrapper[4917]: I0318 06:47:02.867267 4917 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.060115 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.060444 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.062438 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.062505 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.062529 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.193083 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.873471 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.875441 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.875505 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:04 crc kubenswrapper[4917]: I0318 06:47:04.875524 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:05 crc kubenswrapper[4917]: E0318 06:47:05.859335 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 
06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.296918 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.297432 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.299223 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.299781 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.300033 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.511179 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.511806 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.514075 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.514152 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.514172 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.518194 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.879088 4917 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.881185 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.881246 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:06 crc kubenswrapper[4917]: I0318 06:47:06.881271 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:08 crc kubenswrapper[4917]: I0318 06:47:08.897463 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 18 06:47:08 crc kubenswrapper[4917]: I0318 06:47:08.897808 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:08 crc kubenswrapper[4917]: I0318 06:47:08.899509 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:08 crc kubenswrapper[4917]: I0318 06:47:08.899575 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:08 crc kubenswrapper[4917]: I0318 06:47:08.899636 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:09 crc kubenswrapper[4917]: W0318 06:47:09.281431 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 06:47:09 crc kubenswrapper[4917]: I0318 06:47:09.281566 4917 trace.go:236] Trace[1674250245]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 06:46:59.279) (total time: 10002ms): Mar 18 06:47:09 crc 
kubenswrapper[4917]: Trace[1674250245]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:47:09.281) Mar 18 06:47:09 crc kubenswrapper[4917]: Trace[1674250245]: [10.002112895s] [10.002112895s] END Mar 18 06:47:09 crc kubenswrapper[4917]: E0318 06:47:09.281639 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 06:47:09 crc kubenswrapper[4917]: I0318 06:47:09.512111 4917 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 06:47:09 crc kubenswrapper[4917]: I0318 06:47:09.512191 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 06:47:09 crc kubenswrapper[4917]: W0318 06:47:09.594960 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 06:47:09 crc kubenswrapper[4917]: I0318 06:47:09.595041 4917 trace.go:236] Trace[1896280905]: "Reflector ListAndWatch" 
name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 06:46:59.592) (total time: 10002ms): Mar 18 06:47:09 crc kubenswrapper[4917]: Trace[1896280905]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (06:47:09.594) Mar 18 06:47:09 crc kubenswrapper[4917]: Trace[1896280905]: [10.002205592s] [10.002205592s] END Mar 18 06:47:09 crc kubenswrapper[4917]: E0318 06:47:09.595063 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 06:47:09 crc kubenswrapper[4917]: I0318 06:47:09.694136 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 18 06:47:09 crc kubenswrapper[4917]: W0318 06:47:09.969815 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 18 06:47:09 crc kubenswrapper[4917]: I0318 06:47:09.969930 4917 trace.go:236] Trace[193045799]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (18-Mar-2026 06:46:59.968) (total time: 10001ms): Mar 18 06:47:09 crc kubenswrapper[4917]: Trace[193045799]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:47:09.969) Mar 18 06:47:09 crc kubenswrapper[4917]: Trace[193045799]: [10.001433943s] [10.001433943s] END Mar 18 
06:47:09 crc kubenswrapper[4917]: E0318 06:47:09.969962 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 18 06:47:10 crc kubenswrapper[4917]: I0318 06:47:10.975038 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:10Z is after 2026-02-23T05:33:13Z Mar 18 06:47:10 crc kubenswrapper[4917]: E0318 06:47:10.986411 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:10Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 18 06:47:10 crc kubenswrapper[4917]: E0318 06:47:10.987508 4917 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:10 crc kubenswrapper[4917]: E0318 06:47:10.989394 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-18T06:47:10Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 06:47:10 crc kubenswrapper[4917]: W0318 06:47:10.990748 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:10Z is after 2026-02-23T05:33:13Z Mar 18 06:47:10 crc kubenswrapper[4917]: E0318 06:47:10.990830 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:10Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:10 crc kubenswrapper[4917]: E0318 06:47:10.997250 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:10Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ddca1ca11f151 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,LastTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 
06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.000314 4917 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.000373 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.014596 4917 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.014779 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.028931 4917 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54524->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.030458 4917 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54524->192.168.126.11:17697: read: connection reset by peer" Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.700075 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:11Z is after 2026-02-23T05:33:13Z Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.897385 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.901670 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="39e1d8dc6ae51c82c1a38f9508020cefd5ab93a836a298881a0d305cf5b75b54" exitCode=255 Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.901716 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"39e1d8dc6ae51c82c1a38f9508020cefd5ab93a836a298881a0d305cf5b75b54"} Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.901911 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.902992 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.903072 4917 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.903098 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:11 crc kubenswrapper[4917]: I0318 06:47:11.904055 4917 scope.go:117] "RemoveContainer" containerID="39e1d8dc6ae51c82c1a38f9508020cefd5ab93a836a298881a0d305cf5b75b54" Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.697403 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:12Z is after 2026-02-23T05:33:13Z Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.906709 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.907709 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.909901 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f" exitCode=255 Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.909942 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f"} Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.909996 4917 scope.go:117] "RemoveContainer" 
containerID="39e1d8dc6ae51c82c1a38f9508020cefd5ab93a836a298881a0d305cf5b75b54" Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.910199 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.911342 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.911385 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.911406 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:12 crc kubenswrapper[4917]: I0318 06:47:12.912233 4917 scope.go:117] "RemoveContainer" containerID="0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f" Mar 18 06:47:12 crc kubenswrapper[4917]: E0318 06:47:12.912781 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:47:13 crc kubenswrapper[4917]: I0318 06:47:13.697434 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:13Z is after 2026-02-23T05:33:13Z Mar 18 06:47:13 crc kubenswrapper[4917]: I0318 06:47:13.923706 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.060572 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.060952 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.063478 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.063677 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.063727 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.065000 4917 scope.go:117] "RemoveContainer" containerID="0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f" Mar 18 06:47:14 crc kubenswrapper[4917]: E0318 06:47:14.065396 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.205792 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:47:14 crc kubenswrapper[4917]: W0318 06:47:14.553973 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:14Z is after 2026-02-23T05:33:13Z Mar 18 06:47:14 crc kubenswrapper[4917]: E0318 06:47:14.554080 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:14Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.697400 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:14Z is after 2026-02-23T05:33:13Z Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.930426 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.931855 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.932001 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.932025 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.933990 4917 scope.go:117] "RemoveContainer" 
containerID="0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f" Mar 18 06:47:14 crc kubenswrapper[4917]: E0318 06:47:14.934809 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:47:14 crc kubenswrapper[4917]: I0318 06:47:14.941159 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:47:15 crc kubenswrapper[4917]: W0318 06:47:15.312504 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:15Z is after 2026-02-23T05:33:13Z Mar 18 06:47:15 crc kubenswrapper[4917]: E0318 06:47:15.312694 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:15 crc kubenswrapper[4917]: W0318 06:47:15.481792 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-18T06:47:15Z is after 2026-02-23T05:33:13Z Mar 18 06:47:15 crc kubenswrapper[4917]: E0318 06:47:15.481896 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:15Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:15 crc kubenswrapper[4917]: I0318 06:47:15.697982 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:15Z is after 2026-02-23T05:33:13Z Mar 18 06:47:15 crc kubenswrapper[4917]: E0318 06:47:15.859580 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 06:47:15 crc kubenswrapper[4917]: I0318 06:47:15.933025 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:15 crc kubenswrapper[4917]: I0318 06:47:15.934303 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:15 crc kubenswrapper[4917]: I0318 06:47:15.934386 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:15 crc kubenswrapper[4917]: I0318 06:47:15.934405 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:15 crc kubenswrapper[4917]: I0318 06:47:15.936093 4917 scope.go:117] "RemoveContainer" 
containerID="0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f" Mar 18 06:47:15 crc kubenswrapper[4917]: E0318 06:47:15.937121 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:47:16 crc kubenswrapper[4917]: I0318 06:47:16.697883 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:16Z is after 2026-02-23T05:33:13Z Mar 18 06:47:17 crc kubenswrapper[4917]: I0318 06:47:17.390559 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:17 crc kubenswrapper[4917]: I0318 06:47:17.392305 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:17 crc kubenswrapper[4917]: I0318 06:47:17.392371 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:17 crc kubenswrapper[4917]: I0318 06:47:17.392390 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:17 crc kubenswrapper[4917]: I0318 06:47:17.392437 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:47:17 crc kubenswrapper[4917]: E0318 06:47:17.393108 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:17Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 06:47:17 crc kubenswrapper[4917]: E0318 06:47:17.397656 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:17Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 06:47:17 crc kubenswrapper[4917]: I0318 06:47:17.698510 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:17Z is after 2026-02-23T05:33:13Z Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.697839 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:18Z is after 2026-02-23T05:33:13Z Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.935174 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.935436 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.937140 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 
06:47:18.937216 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.937238 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.956075 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.956362 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.957912 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.957981 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:18 crc kubenswrapper[4917]: I0318 06:47:18.958004 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:19 crc kubenswrapper[4917]: I0318 06:47:19.228708 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 06:47:19 crc kubenswrapper[4917]: E0318 06:47:19.234640 4917 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:19 crc kubenswrapper[4917]: I0318 06:47:19.511668 4917 patch_prober.go:28] interesting pod/kube-controller-manager-crc 
container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 18 06:47:19 crc kubenswrapper[4917]: I0318 06:47:19.511938 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 18 06:47:19 crc kubenswrapper[4917]: I0318 06:47:19.698462 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:19Z is after 2026-02-23T05:33:13Z Mar 18 06:47:20 crc kubenswrapper[4917]: I0318 06:47:20.697804 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:20Z is after 2026-02-23T05:33:13Z Mar 18 06:47:20 crc kubenswrapper[4917]: I0318 06:47:20.938840 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:47:20 crc kubenswrapper[4917]: I0318 06:47:20.939121 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:20 crc kubenswrapper[4917]: I0318 06:47:20.940869 4917 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:20 crc kubenswrapper[4917]: I0318 06:47:20.941103 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:20 crc kubenswrapper[4917]: I0318 06:47:20.941319 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:20 crc kubenswrapper[4917]: I0318 06:47:20.942339 4917 scope.go:117] "RemoveContainer" containerID="0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f" Mar 18 06:47:20 crc kubenswrapper[4917]: E0318 06:47:20.942888 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:47:21 crc kubenswrapper[4917]: E0318 06:47:21.002911 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:21Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ddca1ca11f151 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,LastTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:47:21 crc kubenswrapper[4917]: W0318 06:47:21.431040 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:21Z is after 2026-02-23T05:33:13Z Mar 18 06:47:21 crc kubenswrapper[4917]: E0318 06:47:21.431163 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:21 crc kubenswrapper[4917]: I0318 06:47:21.697825 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:21Z is after 2026-02-23T05:33:13Z Mar 18 06:47:22 crc kubenswrapper[4917]: I0318 06:47:22.697963 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:22Z is after 2026-02-23T05:33:13Z Mar 18 06:47:22 crc kubenswrapper[4917]: W0318 06:47:22.996708 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:22Z is after 2026-02-23T05:33:13Z Mar 18 06:47:22 crc kubenswrapper[4917]: E0318 06:47:22.996847 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:22Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:23 crc kubenswrapper[4917]: W0318 06:47:23.429104 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:23Z is after 2026-02-23T05:33:13Z Mar 18 06:47:23 crc kubenswrapper[4917]: E0318 06:47:23.429191 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 18 06:47:23 crc kubenswrapper[4917]: I0318 06:47:23.695772 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T06:47:23Z is after 2026-02-23T05:33:13Z Mar 18 06:47:24 crc kubenswrapper[4917]: I0318 06:47:24.398224 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:24 crc kubenswrapper[4917]: E0318 06:47:24.400102 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:24Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 18 06:47:24 crc kubenswrapper[4917]: I0318 06:47:24.401038 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:24 crc kubenswrapper[4917]: I0318 06:47:24.401101 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:24 crc kubenswrapper[4917]: I0318 06:47:24.401127 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:24 crc kubenswrapper[4917]: I0318 06:47:24.401169 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:47:24 crc kubenswrapper[4917]: E0318 06:47:24.406452 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:24Z is after 2026-02-23T05:33:13Z" node="crc" Mar 18 06:47:24 crc kubenswrapper[4917]: I0318 06:47:24.696819 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-18T06:47:24Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:25 crc kubenswrapper[4917]: W0318 06:47:25.372142 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:25Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:25 crc kubenswrapper[4917]: E0318 06:47:25.372261 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 06:47:25 crc kubenswrapper[4917]: I0318 06:47:25.697133 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:25Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:25 crc kubenswrapper[4917]: E0318 06:47:25.859724 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 06:47:26 crc kubenswrapper[4917]: I0318 06:47:26.697801 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:26Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:27 crc kubenswrapper[4917]: I0318 06:47:27.696416 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:27Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.611227 4917 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:42912->192.168.126.11:10357: read: connection reset by peer" start-of-body=
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.611316 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:42912->192.168.126.11:10357: read: connection reset by peer"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.611407 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.611667 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.614083 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.614140 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.614157 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.614912 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3178418872d0dccec0135fd63950389720710368991e50a202dc241dbef7a1df"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.615172 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://3178418872d0dccec0135fd63950389720710368991e50a202dc241dbef7a1df" gracePeriod=30
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.697708 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:28Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.977485 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.978434 4917 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3178418872d0dccec0135fd63950389720710368991e50a202dc241dbef7a1df" exitCode=255
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.978518 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3178418872d0dccec0135fd63950389720710368991e50a202dc241dbef7a1df"}
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.978818 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.978944 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1d9c7fdc4b08c0e5b279133dd8462ad4f40a159f90b935f0868bec4463cd9ca4"}
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.983449 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.983510 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:28 crc kubenswrapper[4917]: I0318 06:47:28.983528 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:29 crc kubenswrapper[4917]: I0318 06:47:29.090917 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 06:47:29 crc kubenswrapper[4917]: I0318 06:47:29.697977 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:29Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:29 crc kubenswrapper[4917]: I0318 06:47:29.982056 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:29 crc kubenswrapper[4917]: I0318 06:47:29.983747 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:29 crc kubenswrapper[4917]: I0318 06:47:29.983810 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:29 crc kubenswrapper[4917]: I0318 06:47:29.983830 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:30 crc kubenswrapper[4917]: I0318 06:47:30.697342 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:30Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:30 crc kubenswrapper[4917]: I0318 06:47:30.984250 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:30 crc kubenswrapper[4917]: I0318 06:47:30.985838 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:30 crc kubenswrapper[4917]: I0318 06:47:30.985906 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:30 crc kubenswrapper[4917]: I0318 06:47:30.985924 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:31 crc kubenswrapper[4917]: E0318 06:47:31.009786 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:31Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ddca1ca11f151 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,LastTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 06:47:31 crc kubenswrapper[4917]: E0318 06:47:31.406438 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:31Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 18 06:47:31 crc kubenswrapper[4917]: I0318 06:47:31.406549 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:31 crc kubenswrapper[4917]: I0318 06:47:31.408175 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:31 crc kubenswrapper[4917]: I0318 06:47:31.408235 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:31 crc kubenswrapper[4917]: I0318 06:47:31.408264 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:31 crc kubenswrapper[4917]: I0318 06:47:31.408304 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 06:47:31 crc kubenswrapper[4917]: E0318 06:47:31.413064 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:31Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 06:47:31 crc kubenswrapper[4917]: I0318 06:47:31.697934 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:31Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:32 crc kubenswrapper[4917]: I0318 06:47:32.697934 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:32Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:33 crc kubenswrapper[4917]: I0318 06:47:33.697284 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:33Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:34 crc kubenswrapper[4917]: I0318 06:47:34.697178 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:34Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:35 crc kubenswrapper[4917]: I0318 06:47:35.699948 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:35Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:35 crc kubenswrapper[4917]: I0318 06:47:35.772152 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:35 crc kubenswrapper[4917]: I0318 06:47:35.774076 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:35 crc kubenswrapper[4917]: I0318 06:47:35.774141 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:35 crc kubenswrapper[4917]: I0318 06:47:35.774161 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:35 crc kubenswrapper[4917]: I0318 06:47:35.775103 4917 scope.go:117] "RemoveContainer" containerID="0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f"
Mar 18 06:47:35 crc kubenswrapper[4917]: E0318 06:47:35.860357 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 06:47:36 crc kubenswrapper[4917]: I0318 06:47:36.490983 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 18 06:47:36 crc kubenswrapper[4917]: E0318 06:47:36.494510 4917 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 06:47:36 crc kubenswrapper[4917]: E0318 06:47:36.495807 4917 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 18 06:47:36 crc kubenswrapper[4917]: I0318 06:47:36.510837 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 18 06:47:36 crc kubenswrapper[4917]: I0318 06:47:36.511162 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:36 crc kubenswrapper[4917]: I0318 06:47:36.513620 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:36 crc kubenswrapper[4917]: I0318 06:47:36.513679 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:36 crc kubenswrapper[4917]: I0318 06:47:36.513699 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:36 crc kubenswrapper[4917]: I0318 06:47:36.698185 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:36Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.003901 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.004657 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.007176 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6d7e1319710b63bf7d5f6c819f10f9e38c369e1c94cb0f91ebbccf03b557ee5d" exitCode=255
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.007233 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6d7e1319710b63bf7d5f6c819f10f9e38c369e1c94cb0f91ebbccf03b557ee5d"}
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.007280 4917 scope.go:117] "RemoveContainer" containerID="0a1b563f3aa2f826c33195bd9eb30a36b0ff6dbc52852627a62777e25bf3df2f"
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.007578 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.009079 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.009125 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.009182 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.009924 4917 scope.go:117] "RemoveContainer" containerID="6d7e1319710b63bf7d5f6c819f10f9e38c369e1c94cb0f91ebbccf03b557ee5d"
Mar 18 06:47:37 crc kubenswrapper[4917]: E0318 06:47:37.010418 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 06:47:37 crc kubenswrapper[4917]: I0318 06:47:37.698257 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:37Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:38 crc kubenswrapper[4917]: I0318 06:47:38.011675 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 18 06:47:38 crc kubenswrapper[4917]: E0318 06:47:38.411817 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:38Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 18 06:47:38 crc kubenswrapper[4917]: I0318 06:47:38.414037 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:38 crc kubenswrapper[4917]: I0318 06:47:38.415455 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:38 crc kubenswrapper[4917]: I0318 06:47:38.415498 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:38 crc kubenswrapper[4917]: I0318 06:47:38.415509 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:38 crc kubenswrapper[4917]: I0318 06:47:38.415540 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 06:47:38 crc kubenswrapper[4917]: E0318 06:47:38.420125 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:38Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 06:47:38 crc kubenswrapper[4917]: I0318 06:47:38.697473 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:38Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:39 crc kubenswrapper[4917]: I0318 06:47:39.511225 4917 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 06:47:39 crc kubenswrapper[4917]: I0318 06:47:39.511342 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 06:47:39 crc kubenswrapper[4917]: I0318 06:47:39.697326 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:39Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:40 crc kubenswrapper[4917]: W0318 06:47:40.604125 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:40Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:40 crc kubenswrapper[4917]: E0318 06:47:40.604239 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 06:47:40 crc kubenswrapper[4917]: I0318 06:47:40.699406 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:40Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:40 crc kubenswrapper[4917]: I0318 06:47:40.938623 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 06:47:40 crc kubenswrapper[4917]: I0318 06:47:40.938875 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:40 crc kubenswrapper[4917]: I0318 06:47:40.940937 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:40 crc kubenswrapper[4917]: I0318 06:47:40.941021 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:40 crc kubenswrapper[4917]: I0318 06:47:40.941045 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:40 crc kubenswrapper[4917]: I0318 06:47:40.942119 4917 scope.go:117] "RemoveContainer" containerID="6d7e1319710b63bf7d5f6c819f10f9e38c369e1c94cb0f91ebbccf03b557ee5d"
Mar 18 06:47:40 crc kubenswrapper[4917]: E0318 06:47:40.942506 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 06:47:41 crc kubenswrapper[4917]: E0318 06:47:41.015190 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:41Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ddca1ca11f151 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,LastTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 18 06:47:41 crc kubenswrapper[4917]: I0318 06:47:41.699938 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:41Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:42 crc kubenswrapper[4917]: I0318 06:47:42.696809 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:42Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:43 crc kubenswrapper[4917]: I0318 06:47:43.697314 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:43Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:44 crc kubenswrapper[4917]: I0318 06:47:44.060432 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 18 06:47:44 crc kubenswrapper[4917]: I0318 06:47:44.060701 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:44 crc kubenswrapper[4917]: I0318 06:47:44.062085 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:44 crc kubenswrapper[4917]: I0318 06:47:44.062150 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:44 crc kubenswrapper[4917]: I0318 06:47:44.062163 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:44 crc kubenswrapper[4917]: I0318 06:47:44.062819 4917 scope.go:117] "RemoveContainer" containerID="6d7e1319710b63bf7d5f6c819f10f9e38c369e1c94cb0f91ebbccf03b557ee5d"
Mar 18 06:47:44 crc kubenswrapper[4917]: E0318 06:47:44.063014 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 18 06:47:44 crc kubenswrapper[4917]: W0318 06:47:44.240737 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:44Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:44 crc kubenswrapper[4917]: E0318 06:47:44.240817 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 06:47:44 crc kubenswrapper[4917]: I0318 06:47:44.697071 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:44Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:45 crc kubenswrapper[4917]: E0318 06:47:45.418544 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:45Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 18 06:47:45 crc kubenswrapper[4917]: I0318 06:47:45.420768 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:45 crc kubenswrapper[4917]: I0318 06:47:45.422389 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:45 crc kubenswrapper[4917]: I0318 06:47:45.422443 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:45 crc kubenswrapper[4917]: I0318 06:47:45.422460 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:45 crc kubenswrapper[4917]: I0318 06:47:45.422491 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 18 06:47:45 crc kubenswrapper[4917]: E0318 06:47:45.426162 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:45Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 18 06:47:45 crc kubenswrapper[4917]: I0318 06:47:45.697822 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:45Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:45 crc kubenswrapper[4917]: E0318 06:47:45.861325 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 18 06:47:46 crc kubenswrapper[4917]: W0318 06:47:46.090284 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:46Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:46 crc kubenswrapper[4917]: E0318 06:47:46.090394 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 06:47:46 crc kubenswrapper[4917]: I0318 06:47:46.697529 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:46Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:47 crc kubenswrapper[4917]: I0318 06:47:47.697802 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:47Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:48 crc kubenswrapper[4917]: W0318 06:47:48.352251 4917 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:48Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:48 crc kubenswrapper[4917]: E0318 06:47:48.352395 4917 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:48Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 18 06:47:48 crc kubenswrapper[4917]: I0318 06:47:48.697810 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:48Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:49 crc kubenswrapper[4917]: I0318 06:47:49.458930 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 18 06:47:49 crc kubenswrapper[4917]: I0318 06:47:49.459153 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 18 06:47:49 crc kubenswrapper[4917]: I0318 06:47:49.460913 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:47:49 crc kubenswrapper[4917]: I0318 06:47:49.461010 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:47:49 crc kubenswrapper[4917]: I0318 06:47:49.461029 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:47:49 crc kubenswrapper[4917]: I0318 06:47:49.511548 4917 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 18 06:47:49 crc kubenswrapper[4917]: I0318 06:47:49.511719 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 18 06:47:49 crc kubenswrapper[4917]: I0318 06:47:49.697036 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:49Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:50 crc kubenswrapper[4917]: I0318 06:47:50.700070 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:47:50Z is after 2026-02-23T05:33:13Z
Mar 18 06:47:51 crc kubenswrapper[4917]: E0318 06:47:51.020326 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired
or is not yet valid: current time 2026-03-18T06:47:51Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189ddca1ca11f151 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,LastTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:47:51 crc kubenswrapper[4917]: I0318 06:47:51.699267 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:47:52 crc kubenswrapper[4917]: E0318 06:47:52.425358 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 06:47:52 crc kubenswrapper[4917]: I0318 06:47:52.426363 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:52 crc kubenswrapper[4917]: I0318 06:47:52.428158 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:52 crc kubenswrapper[4917]: I0318 06:47:52.428227 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:52 crc kubenswrapper[4917]: I0318 06:47:52.428250 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 
18 06:47:52 crc kubenswrapper[4917]: I0318 06:47:52.428297 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:47:52 crc kubenswrapper[4917]: E0318 06:47:52.434568 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 06:47:52 crc kubenswrapper[4917]: I0318 06:47:52.699797 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:47:53 crc kubenswrapper[4917]: I0318 06:47:53.694854 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:47:54 crc kubenswrapper[4917]: I0318 06:47:54.698961 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:47:55 crc kubenswrapper[4917]: I0318 06:47:55.700263 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:47:55 crc kubenswrapper[4917]: E0318 06:47:55.863343 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 06:47:56 crc kubenswrapper[4917]: I0318 06:47:56.700769 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is 
forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:47:57 crc kubenswrapper[4917]: I0318 06:47:57.701409 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:47:57 crc kubenswrapper[4917]: I0318 06:47:57.772578 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:57 crc kubenswrapper[4917]: I0318 06:47:57.774268 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:57 crc kubenswrapper[4917]: I0318 06:47:57.774338 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:57 crc kubenswrapper[4917]: I0318 06:47:57.774357 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:57 crc kubenswrapper[4917]: I0318 06:47:57.775222 4917 scope.go:117] "RemoveContainer" containerID="6d7e1319710b63bf7d5f6c819f10f9e38c369e1c94cb0f91ebbccf03b557ee5d" Mar 18 06:47:58 crc kubenswrapper[4917]: I0318 06:47:58.083192 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 06:47:58 crc kubenswrapper[4917]: I0318 06:47:58.088132 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95"} Mar 18 06:47:58 crc kubenswrapper[4917]: I0318 06:47:58.088357 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 18 06:47:58 crc kubenswrapper[4917]: I0318 06:47:58.089942 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:58 crc kubenswrapper[4917]: I0318 06:47:58.090074 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:58 crc kubenswrapper[4917]: I0318 06:47:58.090104 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:58 crc kubenswrapper[4917]: I0318 06:47:58.699021 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.093805 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.094627 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.097397 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" exitCode=255 Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.097466 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95"} Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.097524 4917 scope.go:117] "RemoveContainer" 
containerID="6d7e1319710b63bf7d5f6c819f10f9e38c369e1c94cb0f91ebbccf03b557ee5d" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.097779 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.109918 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.110018 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.110034 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.111040 4917 scope.go:117] "RemoveContainer" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" Mar 18 06:47:59 crc kubenswrapper[4917]: E0318 06:47:59.111382 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:47:59 crc kubenswrapper[4917]: E0318 06:47:59.434676 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.435575 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.437426 4917 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.437488 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.437507 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.437549 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:47:59 crc kubenswrapper[4917]: E0318 06:47:59.444513 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.505189 4917 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41650->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.506511 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:41650->192.168.126.11:10357: read: connection reset by peer" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.506935 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.507277 4917 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.509071 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.509290 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.509478 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.510301 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"1d9c7fdc4b08c0e5b279133dd8462ad4f40a159f90b935f0868bec4463cd9ca4"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.510688 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1d9c7fdc4b08c0e5b279133dd8462ad4f40a159f90b935f0868bec4463cd9ca4" gracePeriod=30 Mar 18 06:47:59 crc kubenswrapper[4917]: I0318 06:47:59.704152 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.101779 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 
06:48:00.103068 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.103393 4917 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1d9c7fdc4b08c0e5b279133dd8462ad4f40a159f90b935f0868bec4463cd9ca4" exitCode=255 Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.103467 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1d9c7fdc4b08c0e5b279133dd8462ad4f40a159f90b935f0868bec4463cd9ca4"} Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.103505 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"09f8f3eda196deb2b483da94fd0175b06497094584816dfec524159b89e33131"} Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.103528 4917 scope.go:117] "RemoveContainer" containerID="3178418872d0dccec0135fd63950389720710368991e50a202dc241dbef7a1df" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.103681 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.105116 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.105144 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.105155 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 
06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.107756 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.697212 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.938898 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.939057 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.940205 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.940233 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.940242 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:00 crc kubenswrapper[4917]: I0318 06:48:00.940697 4917 scope.go:117] "RemoveContainer" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" Mar 18 06:48:00 crc kubenswrapper[4917]: E0318 06:48:00.940879 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.028809 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ca11f151 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,LastTimestamp:2026-03-18 06:46:55.691862353 +0000 UTC m=+0.633017107,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.035253 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8d904f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC m=+0.708227551,LastTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC m=+0.708227551,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.042749 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8dc02a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,LastTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.050137 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8de118 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767093528 +0000 UTC m=+0.708248242,LastTimestamp:2026-03-18 06:46:55.767093528 +0000 UTC m=+0.708248242,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.058655 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1d4b56969 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.870347625 +0000 UTC m=+0.811502349,LastTimestamp:2026-03-18 06:46:55.870347625 +0000 UTC m=+0.811502349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.070111 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8d904f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8d904f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC m=+0.708227551,LastTimestamp:2026-03-18 06:46:55.874544681 +0000 UTC m=+0.815699425,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.078693 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8dc02a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8dc02a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,LastTimestamp:2026-03-18 06:46:55.874616082 +0000 UTC m=+0.815770826,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.085488 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8de118\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8de118 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767093528 +0000 UTC m=+0.708248242,LastTimestamp:2026-03-18 06:46:55.874635713 +0000 UTC m=+0.815790467,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.092173 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8d904f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8d904f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC 
m=+0.708227551,LastTimestamp:2026-03-18 06:46:55.876444754 +0000 UTC m=+0.817599478,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.098798 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8dc02a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8dc02a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,LastTimestamp:2026-03-18 06:46:55.876463634 +0000 UTC m=+0.817618358,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.105572 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8de118\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8de118 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767093528 +0000 UTC m=+0.708248242,LastTimestamp:2026-03-18 06:46:55.876475995 +0000 UTC m=+0.817630719,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.113076 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8d904f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8d904f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC m=+0.708227551,LastTimestamp:2026-03-18 06:46:55.876929425 +0000 UTC m=+0.818084179,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: I0318 06:48:01.117653 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.120580 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8dc02a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8dc02a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,LastTimestamp:2026-03-18 
06:46:55.876983926 +0000 UTC m=+0.818138680,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.127625 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8de118\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8de118 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767093528 +0000 UTC m=+0.708248242,LastTimestamp:2026-03-18 06:46:55.877001666 +0000 UTC m=+0.818156410,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.138545 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8d904f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8d904f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC m=+0.708227551,LastTimestamp:2026-03-18 06:46:55.878977982 +0000 UTC m=+0.820132706,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.144621 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8dc02a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8dc02a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,LastTimestamp:2026-03-18 06:46:55.878992633 +0000 UTC m=+0.820147357,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.152413 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8de118\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8de118 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767093528 +0000 UTC m=+0.708248242,LastTimestamp:2026-03-18 06:46:55.879004353 +0000 UTC m=+0.820159087,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.159975 4917 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8d904f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8d904f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC m=+0.708227551,LastTimestamp:2026-03-18 06:46:55.879999956 +0000 UTC m=+0.821154690,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.167015 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8dc02a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8dc02a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,LastTimestamp:2026-03-18 06:46:55.880032967 +0000 UTC m=+0.821187691,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.173389 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8de118\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8de118 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767093528 +0000 UTC m=+0.708248242,LastTimestamp:2026-03-18 06:46:55.880045557 +0000 UTC m=+0.821200281,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.184238 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8d904f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8d904f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC m=+0.708227551,LastTimestamp:2026-03-18 06:46:55.880885406 +0000 UTC m=+0.822040130,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.188244 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8dc02a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8dc02a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,LastTimestamp:2026-03-18 06:46:55.880922676 +0000 UTC m=+0.822077400,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.194842 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8de118\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8de118 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767093528 +0000 UTC m=+0.708248242,LastTimestamp:2026-03-18 06:46:55.880935857 +0000 UTC m=+0.822090581,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.198895 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8d904f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8d904f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767072847 +0000 UTC m=+0.708227551,LastTimestamp:2026-03-18 06:46:55.88323827 +0000 UTC m=+0.824393024,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.203498 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189ddca1ce8dc02a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189ddca1ce8dc02a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:55.767085098 +0000 UTC m=+0.708239812,LastTimestamp:2026-03-18 06:46:55.883291731 +0000 UTC m=+0.824446485,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.211786 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca1ecee29ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.276720078 +0000 UTC m=+1.217874782,LastTimestamp:2026-03-18 06:46:56.276720078 +0000 UTC m=+1.217874782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.218906 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ddca1ecf39ea2 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.277077666 +0000 UTC m=+1.218232400,LastTimestamp:2026-03-18 06:46:56.277077666 +0000 UTC m=+1.218232400,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.223119 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca1ed3e72f9 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.281981689 +0000 UTC m=+1.223136423,LastTimestamp:2026-03-18 06:46:56.281981689 +0000 UTC m=+1.223136423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.229674 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca1ee1471a2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.29600605 +0000 UTC m=+1.237160804,LastTimestamp:2026-03-18 06:46:56.29600605 +0000 UTC m=+1.237160804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc 
kubenswrapper[4917]: E0318 06:48:01.236214 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca1ee242f87 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.297037703 +0000 UTC m=+1.238192447,LastTimestamp:2026-03-18 06:46:56.297037703 +0000 UTC m=+1.238192447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.241630 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ddca213bdc9e6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.927861222 +0000 UTC 
m=+1.869015946,LastTimestamp:2026-03-18 06:46:56.927861222 +0000 UTC m=+1.869015946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.245687 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca213cf8916 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.929024278 +0000 UTC m=+1.870179002,LastTimestamp:2026-03-18 06:46:56.929024278 +0000 UTC m=+1.870179002,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.249059 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca213f7dcaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.931667119 +0000 UTC m=+1.872821843,LastTimestamp:2026-03-18 06:46:56.931667119 +0000 UTC m=+1.872821843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.257633 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca2142f85d6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.935314902 +0000 UTC m=+1.876469626,LastTimestamp:2026-03-18 06:46:56.935314902 +0000 UTC m=+1.876469626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.261371 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca214c8aa4a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.945351242 +0000 UTC m=+1.886505966,LastTimestamp:2026-03-18 06:46:56.945351242 +0000 UTC m=+1.886505966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.266086 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca214ddb4c8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.946730184 +0000 UTC m=+1.887884908,LastTimestamp:2026-03-18 06:46:56.946730184 +0000 UTC m=+1.887884908,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.274545 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca214ef0db4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.94786706 +0000 UTC m=+1.889021784,LastTimestamp:2026-03-18 06:46:56.94786706 +0000 UTC m=+1.889021784,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.283667 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca2150ffe5c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.95002582 +0000 UTC m=+1.891180544,LastTimestamp:2026-03-18 06:46:56.95002582 +0000 UTC m=+1.891180544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.291557 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ddca2152d6b9e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.951954334 +0000 UTC m=+1.893109058,LastTimestamp:2026-03-18 06:46:56.951954334 +0000 UTC m=+1.893109058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.298256 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca2152de876 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.951986294 +0000 UTC m=+1.893141018,LastTimestamp:2026-03-18 06:46:56.951986294 +0000 UTC m=+1.893141018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.308295 4917 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca21600078a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.96575681 +0000 UTC m=+1.906911534,LastTimestamp:2026-03-18 06:46:56.96575681 +0000 UTC m=+1.906911534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.323464 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca22a23480c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.303611404 +0000 UTC m=+2.244766128,LastTimestamp:2026-03-18 06:46:57.303611404 +0000 UTC m=+2.244766128,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.331143 4917 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca22b0c276e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.318872942 +0000 UTC m=+2.260027666,LastTimestamp:2026-03-18 06:46:57.318872942 +0000 UTC m=+2.260027666,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.336919 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca22b24fb04 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.320499972 
+0000 UTC m=+2.261654726,LastTimestamp:2026-03-18 06:46:57.320499972 +0000 UTC m=+2.261654726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.341819 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca239731a0d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.560500749 +0000 UTC m=+2.501655473,LastTimestamp:2026-03-18 06:46:57.560500749 +0000 UTC m=+2.501655473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.348886 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca23a608b39 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.576061753 +0000 UTC m=+2.517216477,LastTimestamp:2026-03-18 06:46:57.576061753 +0000 UTC m=+2.517216477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.354234 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca23a7978bf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.577695423 +0000 UTC m=+2.518850177,LastTimestamp:2026-03-18 06:46:57.577695423 +0000 UTC m=+2.518850177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.359882 4917 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ddca247b7334c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.799844684 +0000 UTC m=+2.740999438,LastTimestamp:2026-03-18 06:46:57.799844684 +0000 UTC m=+2.740999438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.365329 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca248657abf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.811266239 +0000 UTC m=+2.752420973,LastTimestamp:2026-03-18 06:46:57.811266239 +0000 UTC m=+2.752420973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.371185 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca24868ac2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.811475502 +0000 UTC m=+2.752630226,LastTimestamp:2026-03-18 06:46:57.811475502 +0000 UTC m=+2.752630226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.376045 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca24873d351 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.812206417 +0000 UTC m=+2.753361171,LastTimestamp:2026-03-18 06:46:57.812206417 +0000 UTC m=+2.753361171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.382738 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca248ae66fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.816045308 +0000 UTC m=+2.757200052,LastTimestamp:2026-03-18 06:46:57.816045308 +0000 UTC m=+2.757200052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.387009 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca249ccda2d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.834818093 +0000 UTC m=+2.775972817,LastTimestamp:2026-03-18 06:46:57.834818093 +0000 UTC m=+2.775972817,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.390703 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ddca257ebffab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.071740331 +0000 UTC m=+3.012895035,LastTimestamp:2026-03-18 06:46:58.071740331 +0000 UTC m=+3.012895035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.395118 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca257fb7c47 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.072755271 +0000 UTC m=+3.013909985,LastTimestamp:2026-03-18 06:46:58.072755271 +0000 UTC m=+3.013909985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.398947 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca258118c32 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.074201138 +0000 UTC m=+3.015355842,LastTimestamp:2026-03-18 06:46:58.074201138 +0000 UTC 
m=+3.015355842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.402769 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2582d33f0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.076013552 +0000 UTC m=+3.017168266,LastTimestamp:2026-03-18 06:46:58.076013552 +0000 UTC m=+3.017168266,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.408713 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca2589bb042 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.083254338 +0000 UTC m=+3.024409052,LastTimestamp:2026-03-18 
06:46:58.083254338 +0000 UTC m=+3.024409052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.413149 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca258bd8e62 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.08547389 +0000 UTC m=+3.026628614,LastTimestamp:2026-03-18 06:46:58.08547389 +0000 UTC m=+3.026628614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.416970 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189ddca258e3f47e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.087990398 +0000 UTC m=+3.029145102,LastTimestamp:2026-03-18 06:46:58.087990398 +0000 UTC m=+3.029145102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.421084 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2596d0aab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.096974507 +0000 UTC m=+3.038129221,LastTimestamp:2026-03-18 06:46:58.096974507 +0000 UTC m=+3.038129221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.424942 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca259974219 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.099741209 +0000 UTC m=+3.040895923,LastTimestamp:2026-03-18 06:46:58.099741209 +0000 UTC m=+3.040895923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.429170 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca259b16c6e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.101455982 +0000 UTC m=+3.042610696,LastTimestamp:2026-03-18 06:46:58.101455982 +0000 UTC m=+3.042610696,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.433030 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca265ae89cd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.302593485 +0000 UTC m=+3.243748199,LastTimestamp:2026-03-18 06:46:58.302593485 +0000 UTC m=+3.243748199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.436629 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca265b01bd7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.302696407 +0000 UTC m=+3.243851131,LastTimestamp:2026-03-18 06:46:58.302696407 +0000 UTC m=+3.243851131,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.441671 4917 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca26666a0be openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.314657982 +0000 UTC m=+3.255812706,LastTimestamp:2026-03-18 06:46:58.314657982 +0000 UTC m=+3.255812706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.445865 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca26676d42b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.315719723 +0000 UTC m=+3.256874477,LastTimestamp:2026-03-18 06:46:58.315719723 +0000 UTC 
m=+3.256874477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.450867 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca2667dab6a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.316168042 +0000 UTC m=+3.257322756,LastTimestamp:2026-03-18 06:46:58.316168042 +0000 UTC m=+3.257322756,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.455787 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca26689ce58 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.316963416 +0000 UTC m=+3.258118130,LastTimestamp:2026-03-18 06:46:58.316963416 +0000 UTC m=+3.258118130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.462975 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca273138bc5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.527316933 +0000 UTC m=+3.468471647,LastTimestamp:2026-03-18 06:46:58.527316933 +0000 UTC m=+3.468471647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.468296 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca273857a43 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.534783555 +0000 UTC m=+3.475938269,LastTimestamp:2026-03-18 06:46:58.534783555 +0000 UTC m=+3.475938269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.473157 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca274e35ea7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.557714087 +0000 UTC m=+3.498868801,LastTimestamp:2026-03-18 06:46:58.557714087 +0000 UTC m=+3.498868801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.478081 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca274f93dbe openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.559147454 +0000 UTC m=+3.500302158,LastTimestamp:2026-03-18 06:46:58.559147454 +0000 UTC m=+3.500302158,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.482920 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189ddca2753aac55 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.563435605 +0000 UTC m=+3.504590319,LastTimestamp:2026-03-18 06:46:58.563435605 +0000 UTC m=+3.504590319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc 
kubenswrapper[4917]: E0318 06:48:01.488239 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca2820077a4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.777724836 +0000 UTC m=+3.718879550,LastTimestamp:2026-03-18 06:46:58.777724836 +0000 UTC m=+3.718879550,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.493341 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca2838ab59c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.803561884 +0000 UTC m=+3.744716608,LastTimestamp:2026-03-18 06:46:58.803561884 +0000 UTC m=+3.744716608,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.497296 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca2839ea8f9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.804869369 +0000 UTC m=+3.746024083,LastTimestamp:2026-03-18 06:46:58.804869369 +0000 UTC m=+3.746024083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.501937 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca28475267c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.818926204 +0000 UTC m=+3.760080918,LastTimestamp:2026-03-18 06:46:58.818926204 +0000 UTC m=+3.760080918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.507116 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca28ff22ddc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:59.011669468 +0000 UTC m=+3.952824192,LastTimestamp:2026-03-18 06:46:59.011669468 +0000 UTC m=+3.952824192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.511167 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca290ee4f38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:59.02819308 +0000 UTC m=+3.969347844,LastTimestamp:2026-03-18 06:46:59.02819308 +0000 UTC m=+3.969347844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.515578 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca2911965e5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:59.031016933 +0000 UTC m=+3.972171657,LastTimestamp:2026-03-18 06:46:59.031016933 +0000 UTC m=+3.972171657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.522067 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca291ee5ea7 openshift-etcd 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:59.044974247 +0000 UTC m=+3.986129001,LastTimestamp:2026-03-18 06:46:59.044974247 +0000 UTC m=+3.986129001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.527231 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2c16e079e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:59.841869726 +0000 UTC m=+4.783024460,LastTimestamp:2026-03-18 06:46:59.841869726 +0000 UTC m=+4.783024460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.533788 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ddca2cfb7a3fb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.081574907 +0000 UTC m=+5.022729631,LastTimestamp:2026-03-18 06:47:00.081574907 +0000 UTC m=+5.022729631,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.538574 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2d070d604 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.093711876 +0000 UTC m=+5.034866610,LastTimestamp:2026-03-18 06:47:00.093711876 +0000 UTC m=+5.034866610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.545343 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2d08a42e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.095378147 +0000 UTC m=+5.036532891,LastTimestamp:2026-03-18 06:47:00.095378147 +0000 UTC m=+5.036532891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.551866 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2e118834d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.373136205 +0000 UTC m=+5.314290929,LastTimestamp:2026-03-18 06:47:00.373136205 +0000 UTC m=+5.314290929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.556124 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2e222209f openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.390543519 +0000 UTC m=+5.331698243,LastTimestamp:2026-03-18 06:47:00.390543519 +0000 UTC m=+5.331698243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.561300 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2e2377fb9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.391944121 +0000 UTC m=+5.333098845,LastTimestamp:2026-03-18 06:47:00.391944121 +0000 UTC m=+5.333098845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.565474 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ddca2f2c17cd2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.669422802 +0000 UTC m=+5.610577546,LastTimestamp:2026-03-18 06:47:00.669422802 +0000 UTC m=+5.610577546,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.569828 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2f3c8211d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.686635293 +0000 UTC m=+5.627790037,LastTimestamp:2026-03-18 06:47:00.686635293 +0000 UTC m=+5.627790037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.573844 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca2f3df3429 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.688147497 +0000 UTC m=+5.629302251,LastTimestamp:2026-03-18 06:47:00.688147497 +0000 UTC m=+5.629302251,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.578370 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca3031033b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.943016886 +0000 UTC m=+5.884171610,LastTimestamp:2026-03-18 06:47:00.943016886 +0000 UTC m=+5.884171610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.582712 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189ddca303b5db3b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.953873211 +0000 UTC m=+5.895027935,LastTimestamp:2026-03-18 06:47:00.953873211 +0000 UTC m=+5.895027935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.588865 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca303c71749 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:00.955002697 +0000 UTC m=+5.896157421,LastTimestamp:2026-03-18 06:47:00.955002697 +0000 UTC m=+5.896157421,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.593123 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca30f544639 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:01.148804665 +0000 UTC m=+6.089959389,LastTimestamp:2026-03-18 06:47:01.148804665 +0000 UTC m=+6.089959389,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.599329 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189ddca310541055 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:01.165568085 +0000 UTC m=+6.106722799,LastTimestamp:2026-03-18 06:47:01.165568085 +0000 UTC m=+6.106722799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.607182 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 06:48:01 crc 
kubenswrapper[4917]: &Event{ObjectMeta:{kube-controller-manager-crc.189ddca501d31baf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 06:48:01 crc kubenswrapper[4917]: body: Mar 18 06:48:01 crc kubenswrapper[4917]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:09.512170415 +0000 UTC m=+14.453325159,LastTimestamp:2026-03-18 06:47:09.512170415 +0000 UTC m=+14.453325159,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:48:01 crc kubenswrapper[4917]: > Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.611094 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca501d400b7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:09.512229047 +0000 UTC m=+14.453383801,LastTimestamp:2026-03-18 06:47:09.512229047 +0000 UTC m=+14.453383801,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.616568 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 06:48:01 crc kubenswrapper[4917]: &Event{ObjectMeta:{kube-apiserver-crc.189ddca55a870b17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 06:48:01 crc kubenswrapper[4917]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 06:48:01 crc kubenswrapper[4917]: Mar 18 06:48:01 crc kubenswrapper[4917]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:11.000357655 +0000 UTC m=+15.941512379,LastTimestamp:2026-03-18 06:47:11.000357655 +0000 UTC m=+15.941512379,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:48:01 crc kubenswrapper[4917]: > Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.620865 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca55a87a348 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:11.000396616 +0000 UTC m=+15.941551340,LastTimestamp:2026-03-18 06:47:11.000396616 +0000 UTC m=+15.941551340,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.625821 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ddca55a870b17\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 06:48:01 crc kubenswrapper[4917]: &Event{ObjectMeta:{kube-apiserver-crc.189ddca55a870b17 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 18 06:48:01 crc kubenswrapper[4917]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 18 06:48:01 crc kubenswrapper[4917]: Mar 18 06:48:01 crc kubenswrapper[4917]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:11.000357655 +0000 UTC m=+15.941512379,LastTimestamp:2026-03-18 06:47:11.014764311 +0000 UTC m=+15.955919055,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:48:01 crc kubenswrapper[4917]: > Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.630056 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ddca55a87a348\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca55a87a348 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:11.000396616 +0000 UTC m=+15.941551340,LastTimestamp:2026-03-18 06:47:11.014883724 +0000 UTC m=+15.956038458,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.634767 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 18 06:48:01 crc kubenswrapper[4917]: &Event{ObjectMeta:{kube-apiserver-crc.189ddca55c5212d0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:54524->192.168.126.11:17697: read: connection reset by peer Mar 18 06:48:01 crc kubenswrapper[4917]: body: Mar 18 06:48:01 crc kubenswrapper[4917]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:11.030440656 +0000 UTC m=+15.971595400,LastTimestamp:2026-03-18 06:47:11.030440656 +0000 UTC m=+15.971595400,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:48:01 crc kubenswrapper[4917]: > Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.639251 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca55c53dedd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54524->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:11.030558429 +0000 UTC m=+15.971713163,LastTimestamp:2026-03-18 06:47:11.030558429 +0000 UTC m=+15.971713163,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.643506 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189ddca2839ea8f9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189ddca2839ea8f9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:58.804869369 +0000 UTC m=+3.746024083,LastTimestamp:2026-03-18 06:47:11.905630969 +0000 UTC m=+16.846785723,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.650297 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ddca501d31baf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 06:48:01 crc kubenswrapper[4917]: &Event{ObjectMeta:{kube-controller-manager-crc.189ddca501d31baf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 06:48:01 crc kubenswrapper[4917]: body: Mar 18 06:48:01 crc kubenswrapper[4917]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:09.512170415 +0000 UTC m=+14.453325159,LastTimestamp:2026-03-18 06:47:19.511823421 +0000 UTC m=+24.452978175,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:48:01 crc kubenswrapper[4917]: > Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.655902 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ddca501d400b7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca501d400b7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:09.512229047 +0000 UTC m=+14.453383801,LastTimestamp:2026-03-18 06:47:19.512009095 
+0000 UTC m=+24.453163839,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.666236 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 06:48:01 crc kubenswrapper[4917]: &Event{ObjectMeta:{kube-controller-manager-crc.189ddca97438a146 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:42912->192.168.126.11:10357: read: connection reset by peer Mar 18 06:48:01 crc kubenswrapper[4917]: body: Mar 18 06:48:01 crc kubenswrapper[4917]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:28.611295558 +0000 UTC m=+33.552450302,LastTimestamp:2026-03-18 06:47:28.611295558 +0000 UTC m=+33.552450302,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:48:01 crc kubenswrapper[4917]: > Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.671240 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca97439a51f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:42912->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:28.611362079 +0000 UTC m=+33.552516823,LastTimestamp:2026-03-18 06:47:28.611362079 +0000 UTC m=+33.552516823,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.681038 4917 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca974736323 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:28.615146275 +0000 UTC m=+33.556301049,LastTimestamp:2026-03-18 06:47:28.615146275 +0000 UTC m=+33.556301049,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.696077 4917 event.go:359] "Server rejected event (will not retry!)" 
err="events \"kube-controller-manager-crc.189ddca214ef0db4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca214ef0db4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:56.94786706 +0000 UTC m=+1.889021784,LastTimestamp:2026-03-18 06:47:28.640686503 +0000 UTC m=+33.581841257,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: I0318 06:48:01.697756 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.701623 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ddca22a23480c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca22a23480c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.303611404 +0000 UTC m=+2.244766128,LastTimestamp:2026-03-18 06:47:28.89765402 +0000 UTC m=+33.838808774,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.709575 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ddca22b0c276e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca22b0c276e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:46:57.318872942 +0000 UTC m=+2.260027666,LastTimestamp:2026-03-18 06:47:28.911656767 +0000 UTC m=+33.852811511,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.719284 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ddca501d31baf\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 06:48:01 crc kubenswrapper[4917]: &Event{ObjectMeta:{kube-controller-manager-crc.189ddca501d31baf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 06:48:01 crc kubenswrapper[4917]: body: Mar 18 06:48:01 crc kubenswrapper[4917]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:09.512170415 +0000 UTC m=+14.453325159,LastTimestamp:2026-03-18 06:47:39.511312984 +0000 UTC m=+44.452467738,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:48:01 crc kubenswrapper[4917]: > Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.724808 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ddca501d400b7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189ddca501d400b7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:09.512229047 +0000 UTC m=+14.453383801,LastTimestamp:2026-03-18 06:47:39.511398566 +0000 UTC m=+44.452553310,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 18 06:48:01 crc kubenswrapper[4917]: E0318 06:48:01.731047 4917 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189ddca501d31baf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 18 06:48:01 crc kubenswrapper[4917]: &Event{ObjectMeta:{kube-controller-manager-crc.189ddca501d31baf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 18 06:48:01 crc kubenswrapper[4917]: body: Mar 18 06:48:01 crc kubenswrapper[4917]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:47:09.512170415 +0000 UTC m=+14.453325159,LastTimestamp:2026-03-18 06:47:49.511673743 +0000 UTC m=+54.452828497,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:48:01 crc kubenswrapper[4917]: > Mar 18 06:48:02 crc kubenswrapper[4917]: I0318 06:48:02.696435 4917 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:03 crc kubenswrapper[4917]: I0318 06:48:03.697244 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:04 crc kubenswrapper[4917]: I0318 06:48:04.065988 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:48:04 crc kubenswrapper[4917]: I0318 06:48:04.066161 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:04 crc kubenswrapper[4917]: I0318 06:48:04.067320 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:04 crc kubenswrapper[4917]: I0318 06:48:04.067398 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:04 crc kubenswrapper[4917]: I0318 06:48:04.067419 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:04 crc kubenswrapper[4917]: I0318 06:48:04.068351 4917 scope.go:117] "RemoveContainer" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" Mar 18 06:48:04 crc kubenswrapper[4917]: E0318 06:48:04.068678 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:48:04 crc kubenswrapper[4917]: I0318 06:48:04.696644 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:05 crc kubenswrapper[4917]: I0318 06:48:05.696354 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:05 crc kubenswrapper[4917]: E0318 06:48:05.863636 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 06:48:06 crc kubenswrapper[4917]: E0318 06:48:06.442362 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.445335 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.446740 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.446805 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.446824 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.447296 4917 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 18 06:48:06 crc kubenswrapper[4917]: E0318 06:48:06.453620 4917 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.510512 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.510821 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.512299 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.512347 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.512360 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.516691 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:48:06 crc kubenswrapper[4917]: I0318 06:48:06.698570 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:07 crc kubenswrapper[4917]: I0318 06:48:07.133157 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:07 crc kubenswrapper[4917]: I0318 06:48:07.133260 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:48:07 crc kubenswrapper[4917]: I0318 06:48:07.134497 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:07 crc kubenswrapper[4917]: I0318 06:48:07.134574 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:07 crc kubenswrapper[4917]: I0318 06:48:07.134648 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:07 crc kubenswrapper[4917]: I0318 06:48:07.699114 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:08 crc kubenswrapper[4917]: I0318 06:48:08.138430 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:08 crc kubenswrapper[4917]: I0318 06:48:08.139922 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:08 crc kubenswrapper[4917]: I0318 06:48:08.140301 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:08 crc kubenswrapper[4917]: I0318 06:48:08.140503 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:08 crc kubenswrapper[4917]: I0318 06:48:08.498680 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 18 06:48:08 crc kubenswrapper[4917]: I0318 06:48:08.517799 4917 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 18 06:48:08 crc kubenswrapper[4917]: I0318 06:48:08.699105 
4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:09 crc kubenswrapper[4917]: I0318 06:48:09.095213 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:48:09 crc kubenswrapper[4917]: I0318 06:48:09.140041 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:09 crc kubenswrapper[4917]: I0318 06:48:09.141190 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:09 crc kubenswrapper[4917]: I0318 06:48:09.141216 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:09 crc kubenswrapper[4917]: I0318 06:48:09.141227 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:09 crc kubenswrapper[4917]: I0318 06:48:09.699711 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:10 crc kubenswrapper[4917]: I0318 06:48:10.699222 4917 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 18 06:48:10 crc kubenswrapper[4917]: I0318 06:48:10.746313 4917 csr.go:261] certificate signing request csr-cjkkg is approved, waiting to be issued Mar 18 06:48:10 crc kubenswrapper[4917]: I0318 06:48:10.757137 4917 csr.go:257] certificate signing request 
csr-cjkkg is issued Mar 18 06:48:10 crc kubenswrapper[4917]: I0318 06:48:10.764047 4917 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 18 06:48:11 crc kubenswrapper[4917]: I0318 06:48:11.540214 4917 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 18 06:48:11 crc kubenswrapper[4917]: I0318 06:48:11.759159 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-26 16:20:45.392361028 +0000 UTC Mar 18 06:48:11 crc kubenswrapper[4917]: I0318 06:48:11.759234 4917 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6081h32m33.63313286s for next certificate rotation Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.454628 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.456460 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.456497 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.456513 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.456664 4917 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.466960 4917 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.467304 4917 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.467348 4917 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.471857 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.471905 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.471923 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.471949 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.471968 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:13Z","lastTransitionTime":"2026-03-18T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.492505 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8eadbf6-d127-4d55-8d87-8302d8daa3be\\\",\\\"systemUUID\\\":\\\"e77627ae-240f-4740-8125-ab1e275f53d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.503928 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.504009 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.504062 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.504088 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:13 crc kubenswrapper[4917]: I0318 06:48:13.504105 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:13Z","lastTransitionTime":"2026-03-18T06:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.578523 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8eadbf6-d127-4d55-8d87-8302d8daa3be\\\",\\\"systemUUID\\\":\\\"e77627ae-240f-4740-8125-ab1e275f53d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.578802 4917 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.578866 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.679987 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.780961 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.881999 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:13 crc kubenswrapper[4917]: E0318 06:48:13.982426 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.082555 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.182992 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.283426 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.384258 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.485255 4917 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.586274 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.687260 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.788189 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.889112 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:14 crc kubenswrapper[4917]: E0318 06:48:14.989524 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.090184 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.190938 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.291781 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.392835 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.493715 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.594244 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc 
kubenswrapper[4917]: E0318 06:48:15.694779 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.795146 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.863962 4917 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.896304 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:15 crc kubenswrapper[4917]: E0318 06:48:15.997231 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.098138 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.198540 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.299261 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.400306 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.501231 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.601362 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.701521 4917 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.801788 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: I0318 06:48:16.892914 4917 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 06:48:16 crc kubenswrapper[4917]: E0318 06:48:16.902687 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:16 crc kubenswrapper[4917]: I0318 06:48:16.965676 4917 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.003670 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.104170 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.204854 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.305903 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.406987 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.507443 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.607788 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.708141 4917 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.808466 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:17 crc kubenswrapper[4917]: E0318 06:48:17.908792 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.009652 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.110810 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.211151 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.312315 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.413406 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.514278 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.614402 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.714622 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: I0318 06:48:18.771980 4917 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 18 06:48:18 crc 
kubenswrapper[4917]: I0318 06:48:18.773649 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:18 crc kubenswrapper[4917]: I0318 06:48:18.773705 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:18 crc kubenswrapper[4917]: I0318 06:48:18.773722 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:18 crc kubenswrapper[4917]: I0318 06:48:18.774535 4917 scope.go:117] "RemoveContainer" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.774845 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.815034 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:18 crc kubenswrapper[4917]: E0318 06:48:18.915545 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.016659 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.117741 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.218446 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 
06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.319315 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.420357 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.521327 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.622451 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.722870 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.823316 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:19 crc kubenswrapper[4917]: E0318 06:48:19.923728 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.024513 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.125226 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.226434 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.327167 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.428071 4917 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.528558 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.629653 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.730128 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.830647 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:20 crc kubenswrapper[4917]: E0318 06:48:20.931693 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.032780 4917 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.093187 4917 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.136054 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.136099 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.136115 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.136140 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.136157 4917 setters.go:603] "Node became 
not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.239077 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.239147 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.239172 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.239201 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.239220 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.342637 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.342708 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.342726 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.342751 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.342768 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.446350 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.446410 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.446427 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.446452 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.446469 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.549926 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.549984 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.550000 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.550023 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.550041 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.652403 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.652701 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.652883 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.653033 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.653200 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.723173 4917 apiserver.go:52] "Watching apiserver" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.729416 4917 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.729840 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.730337 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.730707 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.730650 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.730675 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.730512 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.730706 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.731848 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.732185 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.732248 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.734452 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.735909 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.736149 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.736274 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.736423 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.736754 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.737050 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.739772 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.739982 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.757087 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc 
kubenswrapper[4917]: I0318 06:48:21.757146 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.757165 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.757189 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.757208 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.784106 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.802330 4917 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.809052 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.826841 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.841356 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.841652 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.841989 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842174 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842320 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842474 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842661 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842832 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843008 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843157 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843303 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843707 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843912 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844073 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842100 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844219 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844220 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843696 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844341 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844380 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844416 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844452 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844489 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844521 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844558 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844621 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844657 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844691 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842220 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842659 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.842855 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843089 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843187 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843431 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.843464 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844170 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844298 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844386 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844546 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844929 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844723 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845025 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845046 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845060 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845118 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845155 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845192 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845208 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845222 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845253 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845287 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845318 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845349 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845378 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845409 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845414 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845442 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845474 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845652 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845690 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845729 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845761 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845792 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845823 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845859 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845895 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845926 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845961 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845997 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846029 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 06:48:21 crc 
kubenswrapper[4917]: I0318 06:48:21.846061 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846092 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846124 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846155 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846183 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846213 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846245 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846275 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846309 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846349 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846398 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 
crc kubenswrapper[4917]: I0318 06:48:21.846440 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846483 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846515 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846561 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846622 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846654 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod 
\"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846686 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846716 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846747 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846783 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846825 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846864 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846904 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846943 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846982 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847021 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847058 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847092 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847128 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847160 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847194 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847225 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 
06:48:21.847263 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847296 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847330 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847361 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847393 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847427 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847459 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847492 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847525 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847558 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847613 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847647 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847681 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847715 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847746 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847780 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847813 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847847 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847906 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847943 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847976 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848010 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 06:48:21 crc 
kubenswrapper[4917]: I0318 06:48:21.848042 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848074 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848106 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848138 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848169 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848202 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848237 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848271 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848304 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848336 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848367 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 06:48:21 crc 
kubenswrapper[4917]: I0318 06:48:21.848401 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848437 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848470 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848504 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848538 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848569 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848901 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848998 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.849046 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845653 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845553 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844705 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.845831 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.844663 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846333 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846331 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846798 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846847 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.846953 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847442 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.847420 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848433 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.848764 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.849087 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.849189 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:22.349149825 +0000 UTC m=+87.290304639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.849533 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.849901 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.850455 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.850823 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.851054 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.851146 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.851214 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.851458 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.851473 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.852444 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.852464 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.852775 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.853116 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.853140 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.853297 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.853526 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.853891 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.854120 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.854176 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.854225 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.854273 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.854312 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.854328 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.854396 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.854445 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.855011 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.855130 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.855254 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.855450 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.849161 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.855743 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.855783 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.855837 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.855657 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856214 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856270 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856309 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856346 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856382 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856417 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856453 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856489 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856525 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856560 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856625 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856667 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856702 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856715 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856739 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856834 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.856938 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857000 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857083 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857103 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857137 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857422 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857478 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857492 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857336 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857740 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.858125 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.858337 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.858487 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.858415 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.858814 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.859221 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.858987 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.859357 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.859567 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.859567 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.859926 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.857795 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860434 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860355 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860463 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860479 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860524 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860558 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860627 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860688 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860739 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860789 
4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860840 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860893 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860943 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861058 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861120 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861172 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861225 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861277 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861331 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861379 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 18 06:48:21 crc 
kubenswrapper[4917]: I0318 06:48:21.861430 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861485 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861720 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861784 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861836 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861891 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861940 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861993 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.860966 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861053 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861268 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.863294 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.861950 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.862070 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.862379 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.862936 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.862982 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.863208 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.863794 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.863838 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.863900 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864220 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864283 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864341 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864395 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864450 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864499 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864547 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864629 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864683 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864730 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864768 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864782 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.864885 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865008 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865070 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865094 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865129 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865185 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865193 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865241 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865296 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865355 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865430 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865481 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 
06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865528 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865542 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865671 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865734 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865784 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod 
\"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865833 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865875 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865893 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865919 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865937 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865838 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.865844 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866383 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866427 4917 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866470 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866509 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866566 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866642 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 06:48:21 
crc kubenswrapper[4917]: I0318 06:48:21.866701 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866827 4917 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866864 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866894 4917 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866925 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866926 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866956 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.866987 4917 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867016 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867043 4917 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867073 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867095 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867118 4917 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 
06:48:21.867140 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867161 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867182 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867201 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867220 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867241 4917 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867260 4917 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867280 4917 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867301 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867329 4917 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867348 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867366 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867385 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867405 4917 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867424 4917 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node 
\"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867445 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867465 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867484 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867508 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867535 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867558 4917 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867576 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 
06:48:21.867630 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867658 4917 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867688 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867711 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867732 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867751 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867772 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867790 4917 
reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867808 4917 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867828 4917 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867847 4917 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867896 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867916 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867939 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867962 4917 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867982 4917 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868004 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868022 4917 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868041 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868060 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868080 4917 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868099 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868119 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868140 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868158 4917 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868176 4917 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868195 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868213 4917 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867203 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867217 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867530 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.867971 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868095 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868410 4917 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868465 4917 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868499 4917 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868527 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868556 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868621 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868651 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 
06:48:21.868677 4917 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868703 4917 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868413 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868533 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.868579 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.869168 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.869460 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.869542 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.869701 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.869937 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.870009 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.870162 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.870190 4917 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.870277 4917 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.870366 4917 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.870439 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:22.370411336 +0000 UTC m=+87.311566150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.871213 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.871326 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.871459 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.871704 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.871796 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.872095 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.872194 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.872149 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.872284 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.872568 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.873057 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.873351 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.873567 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.874023 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.874246 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.874319 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.874336 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.874667 4917 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.874973 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:22.374936708 +0000 UTC m=+87.316091472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875102 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875146 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875186 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875225 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875260 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875295 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath 
\"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875329 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875363 4917 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875396 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875428 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875458 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875491 4917 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875521 4917 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875553 4917 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875624 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875654 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875684 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875715 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.875746 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.877105 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.877696 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.877981 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.877835 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.878281 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.878351 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.878379 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.878904 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.879763 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.881541 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.881828 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.883169 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.883277 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.883489 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.883789 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.883696 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.884018 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.884441 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.892248 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.894830 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.897779 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.898244 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.898855 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.901271 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.901429 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.901544 4917 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.903666 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.904640 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:22.4019275 +0000 UTC m=+87.343082234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.904872 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.904896 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:21 crc kubenswrapper[4917]: E0318 06:48:21.904910 4917 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:21 crc 
kubenswrapper[4917]: E0318 06:48:21.904960 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:22.404943808 +0000 UTC m=+87.346098532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.907190 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.909509 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.909655 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.910340 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.910719 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.910754 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.911051 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.911333 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.911665 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.911878 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.912036 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.912078 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.912232 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.912377 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.912424 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.912677 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.913110 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.913230 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.914099 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.914332 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.914720 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.914945 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.915132 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.915429 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.915485 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.915560 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.916303 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.916354 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.916791 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.916985 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.917186 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.917727 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.918006 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.918239 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.918644 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.918793 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.919616 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.919721 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.920068 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.920132 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.920825 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.921399 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.921424 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.921640 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.931889 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.936124 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.953184 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.968090 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.968126 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.968138 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.968155 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.968168 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:21Z","lastTransitionTime":"2026-03-18T06:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976701 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976740 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976792 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976807 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976819 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976841 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 
06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976853 4917 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976863 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976874 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976885 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976895 4917 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976906 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976917 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976928 4917 reconciler_common.go:293] "Volume detached for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976941 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976952 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976963 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976974 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.976986 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977004 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977016 4917 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977026 4917 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977037 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977049 4917 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977060 4917 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977051 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977071 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977149 4917 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977176 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977196 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977113 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977213 4917 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977252 4917 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977267 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977279 4917 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977290 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977302 4917 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977313 4917 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977324 4917 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977334 4917 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977345 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977356 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977368 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977380 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977393 4917 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977405 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977415 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977426 4917 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977436 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977447 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977459 4917 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977469 4917 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977481 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977492 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977503 4917 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977513 4917 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977525 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977535 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977547 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977557 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977568 4917 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977609 4917 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977626 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977642 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977659 4917 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977674 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977689 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977702 4917 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977713 4917 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977724 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977735 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977748 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977759 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977770 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977781 4917 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977792 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977803 4917 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node 
\"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977813 4917 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977824 4917 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977835 4917 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977846 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977858 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977869 4917 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977880 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc 
kubenswrapper[4917]: I0318 06:48:21.977890 4917 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977901 4917 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977913 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977924 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977934 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977945 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977955 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977966 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.977994 4917 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978006 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978017 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978028 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978039 4917 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978050 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978061 4917 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" 
DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978071 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978082 4917 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978093 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978104 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978115 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978127 4917 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978138 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 
06:48:21.978149 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:21 crc kubenswrapper[4917]: I0318 06:48:21.978159 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.059950 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.071611 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.071670 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.071683 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.071700 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.071711 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.076810 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 18 06:48:22 crc kubenswrapper[4917]: W0318 06:48:22.095027 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-a305f89a18bfe750f513a01286f74a8847960965190b5772d21c85befa0b0d9b WatchSource:0}: Error finding container a305f89a18bfe750f513a01286f74a8847960965190b5772d21c85befa0b0d9b: Status 404 returned error can't find the container with id a305f89a18bfe750f513a01286f74a8847960965190b5772d21c85befa0b0d9b Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.097980 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 18 06:48:22 crc kubenswrapper[4917]: W0318 06:48:22.120538 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-99240aab7e5a836411ee42cdd74975eb3ec0941ba2079cf59baf1a78929efe3f WatchSource:0}: Error finding container 99240aab7e5a836411ee42cdd74975eb3ec0941ba2079cf59baf1a78929efe3f: Status 404 returned error can't find the container with id 99240aab7e5a836411ee42cdd74975eb3ec0941ba2079cf59baf1a78929efe3f Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.177095 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.177470 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.177484 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.177504 4917 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.177517 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.184297 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"9f28cecd1955de448a0960b28370bc1c9517d817c3f4e98be12767ed3f11cdb2"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.186085 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"99240aab7e5a836411ee42cdd74975eb3ec0941ba2079cf59baf1a78929efe3f"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.187648 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a305f89a18bfe750f513a01286f74a8847960965190b5772d21c85befa0b0d9b"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.286025 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.286070 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.286106 4917 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.286145 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.286158 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.382672 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:23.382631582 +0000 UTC m=+88.323786336 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.382832 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.383654 4917 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.383872 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:23.383795818 +0000 UTC m=+88.324950572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.383233 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.384814 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.385014 4917 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.385125 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:23.385098018 +0000 UTC m=+88.326252782 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.390466 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.390557 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.390673 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.390724 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.390751 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.485661 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.485761 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.485929 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.485954 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.485967 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.486037 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.486059 4917 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.486137 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:23.486111115 +0000 UTC m=+88.427265859 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.485975 4917 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:22 crc kubenswrapper[4917]: E0318 06:48:22.486268 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:23.486240748 +0000 UTC m=+88.427395502 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.495693 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.495743 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.495760 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.495782 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.495800 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.598509 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.598563 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.598575 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.598612 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.598626 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.701649 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.701685 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.701696 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.701712 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.701724 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.804706 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.804790 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.804810 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.804835 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.804852 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.907576 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.907660 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.907682 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.907713 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:22 crc kubenswrapper[4917]: I0318 06:48:22.907738 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:22Z","lastTransitionTime":"2026-03-18T06:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.011144 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.011204 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.011221 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.011245 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.011262 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.113950 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.114014 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.114031 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.114054 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.114071 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.193375 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ceab6fd3cdeb1f6e0fcb41aff5896810ce8783a7bd85b35e808b07dddecc6665"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.193444 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b54150f601916c66b04499dff11b2bd37b3736806665cba94987f79940e091d8"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.195288 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4286ed6c3e93648c2e235cd37743224ca68138b274e25812ddacd4123b71d8b2"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.213793 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.218433 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.218483 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc 
kubenswrapper[4917]: I0318 06:48:23.218500 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.218522 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.218541 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.232578 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.251753 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.268761 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceab6fd3cdeb1f6e0fcb41aff5896810ce8783a7bd85b35e808b07dddecc6665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T06:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54150f601916c66b04499dff11b2bd37b3736806665cba94987f79940e091d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T06:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.291014 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.309453 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.322676 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.322743 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.322764 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.322793 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.322815 4917 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.330999 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ceab6fd3cdeb1f6e0fcb41aff5896810ce8783a7bd85b35e808b07dddecc6665\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T06:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b54150f601916c66b04499dff11b2bd37b3736806665cba94987f79940e091d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T06:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.347413 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.361639 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.382568 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4286ed6c3e93648c2e235cd37743224ca68138b274e25812ddacd4123b71d8b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-18T06:48:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.394874 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.395009 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.395046 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.395216 4917 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.395284 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 06:48:25.395263396 +0000 UTC m=+90.336418150 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.395790 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:25.395773418 +0000 UTC m=+90.336928172 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.395856 4917 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.395900 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:25.39588583 +0000 UTC m=+90.337040584 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.400570 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.418290 4917 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.425820 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.425859 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.425876 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.425897 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.425914 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.495730 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.495805 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.495936 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:23 crc kubenswrapper[4917]: 
E0318 06:48:23.495957 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.495971 4917 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.496022 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:25.496005666 +0000 UTC m=+90.437160380 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.496380 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.496403 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.496414 4917 projected.go:194] Error preparing data for projected volume 
kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.496445 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:25.496435077 +0000 UTC m=+90.437589791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.528693 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.528791 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.528805 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.528822 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.528834 4917 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.632326 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.632388 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.632406 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.632429 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.632445 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.734861 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.734927 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.734944 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.734967 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.734982 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.772350 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.772483 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.772558 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.772822 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.772897 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.772957 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.777116 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.778054 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.779005 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.779839 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.781854 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.782560 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.783409 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.784743 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.785574 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.786923 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.787578 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.789141 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.789837 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.792142 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.792921 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.793644 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.795720 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.796075 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.796623 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.797602 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.798090 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.799030 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.799437 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.800506 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.800945 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.802008 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.802635 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.803098 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.804100 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.804525 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.805372 4917 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.805470 4917 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.807158 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.808480 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.808909 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.810437 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.811048 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.811904 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.812507 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.813477 4917 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.813974 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.814880 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.815830 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.816388 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.816843 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.817777 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.818636 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.819316 4917 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.819925 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.820741 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.821174 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.822051 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.822601 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.823043 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.837723 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.837810 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.837830 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.837853 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.837866 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.940930 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.941000 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.941021 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.941049 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.941068 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.944077 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.944151 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.944166 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.944526 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.944569 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.959295 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8eadbf6-d127-4d55-8d87-8302d8daa3be\\\",\\\"systemUUID\\\":\\\"e77627ae-240f-4740-8125-ab1e275f53d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.963647 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.963690 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.963705 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.963723 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.963735 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.975995 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8eadbf6-d127-4d55-8d87-8302d8daa3be\\\",\\\"systemUUID\\\":\\\"e77627ae-240f-4740-8125-ab1e275f53d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.979487 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.979529 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.979546 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.979569 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.979611 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 06:48:23 crc kubenswrapper[4917]: E0318 06:48:23.992807 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8eadbf6-d127-4d55-8d87-8302d8daa3be\\\",\\\"systemUUID\\\":\\\"e77627ae-240f-4740-8125-ab1e275f53d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:23Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.996624 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.996657 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.996669 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.996687 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:23 crc kubenswrapper[4917]: I0318 06:48:23.996702 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:23Z","lastTransitionTime":"2026-03-18T06:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 18 06:48:24 crc kubenswrapper[4917]: E0318 06:48:24.009998 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8eadbf6-d127-4d55-8d87-8302d8daa3be\\\",\\\"systemUUID\\\":\\\"e77627ae-240f-4740-8125-ab1e275f53d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:24Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.014364 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.014427 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.014447 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.014476 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.014495 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: E0318 06:48:24.034375 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:48:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-18T06:48:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a8eadbf6-d127-4d55-8d87-8302d8daa3be\\\",\\\"systemUUID\\\":\\\"e77627ae-240f-4740-8125-ab1e275f53d4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-18T06:48:24Z is after 2025-08-24T17:21:41Z" Mar 18 06:48:24 crc kubenswrapper[4917]: E0318 06:48:24.034694 4917 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.044672 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.044737 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.044753 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.044784 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.044799 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.147360 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.147399 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.147407 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.147425 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.147437 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.249946 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.249983 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.249992 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.250006 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.250016 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.351842 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.351911 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.351930 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.351954 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.351972 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.454460 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.454526 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.454544 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.454574 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.454642 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.557701 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.557765 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.557783 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.557808 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.557887 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.661308 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.661364 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.661381 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.661404 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.661422 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.764139 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.764257 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.764280 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.764323 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.764346 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.867915 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.867972 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.867986 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.868007 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.868021 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.970711 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.970771 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.970790 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.970814 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:24 crc kubenswrapper[4917]: I0318 06:48:24.970831 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:24Z","lastTransitionTime":"2026-03-18T06:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.073994 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.074062 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.074081 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.074108 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.074126 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.177334 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.177377 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.177387 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.177403 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.177412 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.280036 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.280092 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.280106 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.280124 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.280136 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.384882 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.384977 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.384992 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.385018 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.385033 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.412650 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.412875 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 06:48:29.41283578 +0000 UTC m=+94.353990524 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.412956 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.413006 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.413115 4917 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.413208 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-18 06:48:29.413185229 +0000 UTC m=+94.354339983 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.413219 4917 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.413344 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:29.413314142 +0000 UTC m=+94.354468866 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.488548 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.488641 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.488660 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.488686 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.488705 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.513441 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.513513 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.513721 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.513748 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.513767 4917 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.513831 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:29.513809806 +0000 UTC m=+94.454964560 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.514320 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.514351 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.514367 4917 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.514411 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:29.51439589 +0000 UTC m=+94.455550644 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.592102 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.592218 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.592237 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.592695 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.592754 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.695741 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.695790 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.695804 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.695825 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.695841 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.772692 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.772773 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.772894 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.773391 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.773502 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 06:48:25 crc kubenswrapper[4917]: E0318 06:48:25.773625 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.783265 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.798246 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.798312 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.798333 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.798358 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.798376 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.907338 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.907460 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.907535 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.907575 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.907637 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:25Z","lastTransitionTime":"2026-03-18T06:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:25 crc kubenswrapper[4917]: I0318 06:48:25.912468 4917 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.010413 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.010465 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.010482 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.010506 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.010524 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.114505 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.114549 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.114561 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.114602 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.114613 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.214856 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"aa54fbd7cb7854b7f36359529eb9b94c3a9d9ada21cb01a920fc366996644174"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.216945 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.216979 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.216988 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.217002 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.217014 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.251840 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.251812074 podStartE2EDuration="1.251812074s" podCreationTimestamp="2026-03-18 06:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:26.251793003 +0000 UTC m=+91.192947717" watchObservedRunningTime="2026-03-18 06:48:26.251812074 +0000 UTC m=+91.192966818" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.320267 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.320312 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.320325 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.320343 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.320355 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.422858 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.422928 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.422944 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.422968 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.422984 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.526713 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.526767 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.526784 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.526805 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.526825 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.628833 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.628894 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.628911 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.628935 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.628953 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.731639 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.731683 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.731693 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.731708 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.731720 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.834442 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.834512 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.834530 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.834559 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.834577 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.937010 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.937043 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.937052 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.937064 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:26 crc kubenswrapper[4917]: I0318 06:48:26.937073 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:26Z","lastTransitionTime":"2026-03-18T06:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.040149 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.040190 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.040202 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.040219 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.040232 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.143029 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.143077 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.143091 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.143111 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.143124 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.245666 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.245705 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.245717 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.245735 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.245747 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.348361 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.348404 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.348417 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.348434 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.348456 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.460672 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.460710 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.460721 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.460737 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.460750 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.563752 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.563819 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.563836 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.563863 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.563882 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.666743 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.666800 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.666818 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.666844 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.666861 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.768735 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.768790 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.768808 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.768830 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.768846 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.772562 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.772702 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.772880 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:27 crc kubenswrapper[4917]: E0318 06:48:27.772956 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 06:48:27 crc kubenswrapper[4917]: E0318 06:48:27.773133 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 06:48:27 crc kubenswrapper[4917]: E0318 06:48:27.773239 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.871722 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.871760 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.871768 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.871782 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.871793 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.974141 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.974228 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.974250 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.974274 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:27 crc kubenswrapper[4917]: I0318 06:48:27.974292 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:27Z","lastTransitionTime":"2026-03-18T06:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.077787 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.077856 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.077878 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.077906 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.077930 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.180288 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.180330 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.180338 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.180352 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.180361 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.282679 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.282715 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.282725 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.282757 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.282767 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.385114 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.385165 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.385177 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.385194 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.385206 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.488761 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.488829 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.488848 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.488873 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.488889 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.591847 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.591920 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.591942 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.591969 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.591989 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.695489 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.695549 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.695566 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.695616 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.695634 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.798728 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.798784 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.798802 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.798826 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.798842 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.901681 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.901730 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.901752 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.901778 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:28 crc kubenswrapper[4917]: I0318 06:48:28.901799 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:28Z","lastTransitionTime":"2026-03-18T06:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.004405 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.004444 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.004457 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.004473 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.004485 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.108170 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.108208 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.108217 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.108231 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.108241 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.211116 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.211193 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.211210 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.211240 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.211257 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.314391 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.314457 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.314474 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.314500 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.314518 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.417455 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.417569 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.417604 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.417622 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.417634 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.454075 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.454234 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.454273 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:37.454249761 +0000 UTC m=+102.395404545 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.454332 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.454386 4917 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.454457 4917 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.454481 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:37.454454376 +0000 UTC m=+102.395609130 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.454718 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:37.454497637 +0000 UTC m=+102.395652491 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.520866 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.520930 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.520951 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.520980 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.521001 4917 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.555008 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.555086 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.555200 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.555235 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.555252 4917 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.555322 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:37.555301909 +0000 UTC m=+102.496456643 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.555360 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.555400 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.555414 4917 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.555478 4917 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:37.555460292 +0000 UTC m=+102.496615016 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.623797 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.623847 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.623863 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.623881 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.623893 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.726495 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.726551 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.726568 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.726641 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.726660 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.772474 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.772488 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.772617 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.772723 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.772741 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 06:48:29 crc kubenswrapper[4917]: E0318 06:48:29.772956 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.828374 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.828415 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.828426 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.828441 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.828451 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.930860 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.930893 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.930905 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.930919 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:29 crc kubenswrapper[4917]: I0318 06:48:29.930928 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:29Z","lastTransitionTime":"2026-03-18T06:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.033743 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.033780 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.033789 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.033805 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.033815 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.136530 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.136572 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.136618 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.136633 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.136642 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.239680 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.239740 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.239758 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.239781 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.239799 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.343061 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.343147 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.343166 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.343823 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.343907 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.447416 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.447484 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.447505 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.447529 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.447546 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.550730 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.550790 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.550807 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.550831 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.550851 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.654066 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.654154 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.654164 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.654179 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.654190 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.756791 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.756869 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.756882 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.756901 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.756912 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.788214 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.788474 4917 scope.go:117] "RemoveContainer" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" Mar 18 06:48:30 crc kubenswrapper[4917]: E0318 06:48:30.788888 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.860407 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.860476 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.860493 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.860521 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.860541 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.964480 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.964566 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.964745 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.964872 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:30 crc kubenswrapper[4917]: I0318 06:48:30.964901 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:30Z","lastTransitionTime":"2026-03-18T06:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.068184 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.068429 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.068597 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.068689 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.068776 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.172432 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.172852 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.173200 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.173410 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.173634 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.227496 4917 scope.go:117] "RemoveContainer" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" Mar 18 06:48:31 crc kubenswrapper[4917]: E0318 06:48:31.228168 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.277075 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.277113 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.277125 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.277141 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.277155 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.380160 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.380523 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.380785 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.380937 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.381125 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.484843 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.489125 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.489181 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.489208 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.489225 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.591790 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.591852 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.591869 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.591895 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.591912 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.694952 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.694996 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.695009 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.695024 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.695036 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.774455 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:31 crc kubenswrapper[4917]: E0318 06:48:31.774603 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.774456 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:31 crc kubenswrapper[4917]: E0318 06:48:31.774940 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.775128 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:31 crc kubenswrapper[4917]: E0318 06:48:31.775479 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.797568 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.798006 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.798154 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.798304 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.798436 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.901846 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.901937 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.901956 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.901983 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:31 crc kubenswrapper[4917]: I0318 06:48:31.902000 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:31Z","lastTransitionTime":"2026-03-18T06:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.004750 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.004801 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.004817 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.004839 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.004855 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.108039 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.108656 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.108947 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.109050 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.109174 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.212766 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.212817 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.212830 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.212851 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.212863 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.316198 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.316282 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.316304 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.316340 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.316360 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.419746 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.420306 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.420480 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.420701 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.420847 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.524467 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.524567 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.524631 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.524670 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.524691 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.582852 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7wk2h"] Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.583403 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7wk2h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.585445 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.586055 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.586345 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.598002 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-g5sf8"] Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.598889 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.601008 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.601278 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.601887 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.603993 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.605096 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xp5xk"] Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.605907 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.606970 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.608574 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.608704 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.609092 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.609360 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.610236 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.618111 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pp4xr"] Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.618617 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.620400 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.622282 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.630005 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.630236 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.630254 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.630280 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.630299 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.637009 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f8k4j"] Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.647973 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.653489 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.654276 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.654955 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.655454 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.655624 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.655766 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.655980 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683041 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-slash\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683084 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-log-socket\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683110 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-config\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683136 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-os-release\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683159 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d57962b8-d12d-4a13-b930-ab79563daeb9-cni-binary-copy\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683249 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-kubelet\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683292 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-systemd\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683317 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-etc-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683338 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-bin\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683358 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-env-overrides\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683380 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 
06:48:32.683421 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-netns\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683500 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683578 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4aaf9352-3715-40a0-876a-09f4f27a41c2-cni-binary-copy\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683627 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-systemd-units\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683676 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sssp7\" (UniqueName: \"kubernetes.io/projected/d57962b8-d12d-4a13-b930-ab79563daeb9-kube-api-access-sssp7\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 
06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683730 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-hostroot\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683753 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58xt6\" (UniqueName: \"kubernetes.io/projected/4aaf9352-3715-40a0-876a-09f4f27a41c2-kube-api-access-58xt6\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683773 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-ovn\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683794 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-ovn-kubernetes\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683818 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-system-cni-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: 
I0318 06:48:32.683841 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-node-log\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683861 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-system-cni-dir\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683883 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-cnibin\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683902 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-cni-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683923 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d57962b8-d12d-4a13-b930-ab79563daeb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc 
kubenswrapper[4917]: I0318 06:48:32.683957 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-var-lib-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683979 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-socket-dir-parent\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.683999 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-cni-bin\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684020 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-kubelet\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684045 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a395f4a0-65e4-47aa-81cb-5de11c5c30f2-hosts-file\") pod \"node-resolver-7wk2h\" (UID: \"a395f4a0-65e4-47aa-81cb-5de11c5c30f2\") " pod="openshift-dns/node-resolver-7wk2h" Mar 18 06:48:32 crc 
kubenswrapper[4917]: I0318 06:48:32.684066 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-cnibin\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684087 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-etc-kubernetes\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684107 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc04c58e-83bd-4c0c-b58a-da7dea820272-rootfs\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684126 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-k8s-cni-cncf-io\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684146 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-netns\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc 
kubenswrapper[4917]: I0318 06:48:32.684165 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-conf-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684194 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684220 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-os-release\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684239 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22ms\" (UniqueName: \"kubernetes.io/projected/a395f4a0-65e4-47aa-81cb-5de11c5c30f2-kube-api-access-s22ms\") pod \"node-resolver-7wk2h\" (UID: \"a395f4a0-65e4-47aa-81cb-5de11c5c30f2\") " pod="openshift-dns/node-resolver-7wk2h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684259 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-daemon-config\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" 
Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684279 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-multus-certs\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684303 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2tv7\" (UniqueName: \"kubernetes.io/projected/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-kube-api-access-d2tv7\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684330 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc04c58e-83bd-4c0c-b58a-da7dea820272-proxy-tls\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684354 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9lx\" (UniqueName: \"kubernetes.io/projected/cc04c58e-83bd-4c0c-b58a-da7dea820272-kube-api-access-jd9lx\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684373 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc04c58e-83bd-4c0c-b58a-da7dea820272-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684397 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-script-lib\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684417 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovn-node-metrics-cert\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684449 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-cni-multus\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.684467 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-netd\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.733437 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 
06:48:32.733471 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.733481 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.733495 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.733506 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.785820 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-hostroot\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.785881 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58xt6\" (UniqueName: \"kubernetes.io/projected/4aaf9352-3715-40a0-876a-09f4f27a41c2-kube-api-access-58xt6\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.785919 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-systemd-units\") pod \"ovnkube-node-f8k4j\" (UID: 
\"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.785951 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sssp7\" (UniqueName: \"kubernetes.io/projected/d57962b8-d12d-4a13-b930-ab79563daeb9-kube-api-access-sssp7\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.785963 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-hostroot\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786080 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-system-cni-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.785982 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-system-cni-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786142 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-ovn\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 
06:48:32.786181 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-ovn-kubernetes\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786220 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-cnibin\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786249 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-node-log\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786278 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-system-cni-dir\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786312 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-cni-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786344 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d57962b8-d12d-4a13-b930-ab79563daeb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786377 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-var-lib-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786414 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-systemd-units\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786431 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-socket-dir-parent\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786463 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-cni-bin\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786499 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-kubelet\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786531 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a395f4a0-65e4-47aa-81cb-5de11c5c30f2-hosts-file\") pod \"node-resolver-7wk2h\" (UID: \"a395f4a0-65e4-47aa-81cb-5de11c5c30f2\") " pod="openshift-dns/node-resolver-7wk2h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786563 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-etc-kubernetes\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786617 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc04c58e-83bd-4c0c-b58a-da7dea820272-rootfs\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786649 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-cnibin\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786682 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-os-release\") pod \"multus-pp4xr\" (UID: 
\"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786713 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-k8s-cni-cncf-io\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786738 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-var-lib-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786743 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-netns\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786780 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-netns\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786799 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-conf-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786833 
4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786846 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-socket-dir-parent\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786870 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s22ms\" (UniqueName: \"kubernetes.io/projected/a395f4a0-65e4-47aa-81cb-5de11c5c30f2-kube-api-access-s22ms\") pod \"node-resolver-7wk2h\" (UID: \"a395f4a0-65e4-47aa-81cb-5de11c5c30f2\") " pod="openshift-dns/node-resolver-7wk2h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786891 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-cni-bin\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786903 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-daemon-config\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786935 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-multus-certs\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786965 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2tv7\" (UniqueName: \"kubernetes.io/projected/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-kube-api-access-d2tv7\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787028 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc04c58e-83bd-4c0c-b58a-da7dea820272-proxy-tls\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787062 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd9lx\" (UniqueName: \"kubernetes.io/projected/cc04c58e-83bd-4c0c-b58a-da7dea820272-kube-api-access-jd9lx\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787093 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/cc04c58e-83bd-4c0c-b58a-da7dea820272-rootfs\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787097 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-script-lib\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787156 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc04c58e-83bd-4c0c-b58a-da7dea820272-mcd-auth-proxy-config\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787194 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-cni-multus\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787227 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-netd\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787259 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovn-node-metrics-cert\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787317 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-kubelet\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787348 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-slash\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787378 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-log-socket\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787425 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-config\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787457 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-os-release\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787487 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/d57962b8-d12d-4a13-b930-ab79563daeb9-cni-binary-copy\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787524 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-netns\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787554 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-systemd\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787616 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-etc-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787651 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-bin\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787684 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-env-overrides\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787714 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787763 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4aaf9352-3715-40a0-876a-09f4f27a41c2-cni-binary-copy\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787795 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787897 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787945 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-cnibin\") pod 
\"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788019 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-os-release\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788063 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-k8s-cni-cncf-io\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788277 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-script-lib\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788352 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-ovn\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788396 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-ovn-kubernetes\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788455 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-cnibin\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788495 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-node-log\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788539 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-system-cni-dir\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788648 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-cni-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.788830 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-kubelet\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786937 4917 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-kubelet\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789052 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc04c58e-83bd-4c0c-b58a-da7dea820272-mcd-auth-proxy-config\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789057 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-netns\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789093 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-bin\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789113 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-systemd\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789170 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-etc-openvswitch\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789242 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-netd\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.790782 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-config\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789693 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d57962b8-d12d-4a13-b930-ab79563daeb9-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789831 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-env-overrides\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789888 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d57962b8-d12d-4a13-b930-ab79563daeb9-cni-binary-copy\") 
pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.787031 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-etc-kubernetes\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.790024 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4aaf9352-3715-40a0-876a-09f4f27a41c2-cni-binary-copy\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.790069 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.790105 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-slash\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.790104 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-log-socket\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.786999 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a395f4a0-65e4-47aa-81cb-5de11c5c30f2-hosts-file\") pod \"node-resolver-7wk2h\" (UID: \"a395f4a0-65e4-47aa-81cb-5de11c5c30f2\") " pod="openshift-dns/node-resolver-7wk2h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.790177 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.790203 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-run-multus-certs\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.789277 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-host-var-lib-cni-multus\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.790774 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-conf-dir\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.791191 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4aaf9352-3715-40a0-876a-09f4f27a41c2-multus-daemon-config\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.791507 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d57962b8-d12d-4a13-b930-ab79563daeb9-os-release\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.797081 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc04c58e-83bd-4c0c-b58a-da7dea820272-proxy-tls\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.800220 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovn-node-metrics-cert\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.801418 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sssp7\" (UniqueName: \"kubernetes.io/projected/d57962b8-d12d-4a13-b930-ab79563daeb9-kube-api-access-sssp7\") pod \"multus-additional-cni-plugins-g5sf8\" (UID: \"d57962b8-d12d-4a13-b930-ab79563daeb9\") " pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.805769 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-58xt6\" (UniqueName: \"kubernetes.io/projected/4aaf9352-3715-40a0-876a-09f4f27a41c2-kube-api-access-58xt6\") pod \"multus-pp4xr\" (UID: \"4aaf9352-3715-40a0-876a-09f4f27a41c2\") " pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.815201 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-zf77h"] Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.815960 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.817265 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd9lx\" (UniqueName: \"kubernetes.io/projected/cc04c58e-83bd-4c0c-b58a-da7dea820272-kube-api-access-jd9lx\") pod \"machine-config-daemon-xp5xk\" (UID: \"cc04c58e-83bd-4c0c-b58a-da7dea820272\") " pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.817464 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2tv7\" (UniqueName: \"kubernetes.io/projected/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-kube-api-access-d2tv7\") pod \"ovnkube-node-f8k4j\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.819796 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.820691 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.821888 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 06:48:32 
crc kubenswrapper[4917]: I0318 06:48:32.821940 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.823085 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22ms\" (UniqueName: \"kubernetes.io/projected/a395f4a0-65e4-47aa-81cb-5de11c5c30f2-kube-api-access-s22ms\") pod \"node-resolver-7wk2h\" (UID: \"a395f4a0-65e4-47aa-81cb-5de11c5c30f2\") " pod="openshift-dns/node-resolver-7wk2h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.841553 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.841621 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.841634 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.841677 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.841696 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.888848 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77kvz\" (UniqueName: \"kubernetes.io/projected/74c8afa8-e378-494a-9c03-5810fc81744e-kube-api-access-77kvz\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.888924 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74c8afa8-e378-494a-9c03-5810fc81744e-host\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.888964 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/74c8afa8-e378-494a-9c03-5810fc81744e-serviceca\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.898278 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7wk2h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.917434 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" Mar 18 06:48:32 crc kubenswrapper[4917]: W0318 06:48:32.925283 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda395f4a0_65e4_47aa_81cb_5de11c5c30f2.slice/crio-f1d5b4d5f5aebd00e45e94faeb163b4be68786df11cfda398022e50068e3ba9e WatchSource:0}: Error finding container f1d5b4d5f5aebd00e45e94faeb163b4be68786df11cfda398022e50068e3ba9e: Status 404 returned error can't find the container with id f1d5b4d5f5aebd00e45e94faeb163b4be68786df11cfda398022e50068e3ba9e Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.927866 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:48:32 crc kubenswrapper[4917]: W0318 06:48:32.930615 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd57962b8_d12d_4a13_b930_ab79563daeb9.slice/crio-263d832e571713bbc031fe212b61964b1e11349baa2f1200c7b270d5eafc46a6 WatchSource:0}: Error finding container 263d832e571713bbc031fe212b61964b1e11349baa2f1200c7b270d5eafc46a6: Status 404 returned error can't find the container with id 263d832e571713bbc031fe212b61964b1e11349baa2f1200c7b270d5eafc46a6 Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.942469 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pp4xr" Mar 18 06:48:32 crc kubenswrapper[4917]: W0318 06:48:32.942975 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc04c58e_83bd_4c0c_b58a_da7dea820272.slice/crio-207b2e581bb28f9be0c5b3c80a771c2d82d23b7723a7570638a7678298c53b5d WatchSource:0}: Error finding container 207b2e581bb28f9be0c5b3c80a771c2d82d23b7723a7570638a7678298c53b5d: Status 404 returned error can't find the container with id 207b2e581bb28f9be0c5b3c80a771c2d82d23b7723a7570638a7678298c53b5d Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.943878 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.943981 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.944054 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.944132 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.944198 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:32Z","lastTransitionTime":"2026-03-18T06:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.968273 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.989728 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77kvz\" (UniqueName: \"kubernetes.io/projected/74c8afa8-e378-494a-9c03-5810fc81744e-kube-api-access-77kvz\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.989821 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74c8afa8-e378-494a-9c03-5810fc81744e-host\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.989859 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/74c8afa8-e378-494a-9c03-5810fc81744e-serviceca\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.990001 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/74c8afa8-e378-494a-9c03-5810fc81744e-host\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:32 crc kubenswrapper[4917]: I0318 06:48:32.991658 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/74c8afa8-e378-494a-9c03-5810fc81744e-serviceca\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:33 crc kubenswrapper[4917]: W0318 06:48:33.005319 4917 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dd0f3cd_77e6_44b6_92e3_50740ab1fffa.slice/crio-87d91530680849fe3d74c26838d79a41ba50f8ae6cdcebce96233e3593a8a4ad WatchSource:0}: Error finding container 87d91530680849fe3d74c26838d79a41ba50f8ae6cdcebce96233e3593a8a4ad: Status 404 returned error can't find the container with id 87d91530680849fe3d74c26838d79a41ba50f8ae6cdcebce96233e3593a8a4ad Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.007976 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77kvz\" (UniqueName: \"kubernetes.io/projected/74c8afa8-e378-494a-9c03-5810fc81744e-kube-api-access-77kvz\") pod \"node-ca-zf77h\" (UID: \"74c8afa8-e378-494a-9c03-5810fc81744e\") " pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.021739 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz"] Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.022203 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.025445 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.025782 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.048076 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-ww4d6"] Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.048577 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:33 crc kubenswrapper[4917]: E0318 06:48:33.048668 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ww4d6" podUID="63affb66-9eb5-40ac-9b60-6ff9af511233" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.052863 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.052895 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.052906 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.052924 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.052940 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.090417 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr6zs\" (UniqueName: \"kubernetes.io/projected/61d0a5bc-ec95-459a-a5c0-b0fa766093de-kube-api-access-wr6zs\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.090471 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.090508 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61d0a5bc-ec95-459a-a5c0-b0fa766093de-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.090730 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xnvx\" (UniqueName: \"kubernetes.io/projected/63affb66-9eb5-40ac-9b60-6ff9af511233-kube-api-access-8xnvx\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.090806 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/61d0a5bc-ec95-459a-a5c0-b0fa766093de-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.090911 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61d0a5bc-ec95-459a-a5c0-b0fa766093de-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.155707 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.155737 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.155747 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.155764 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.155792 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.191702 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr6zs\" (UniqueName: \"kubernetes.io/projected/61d0a5bc-ec95-459a-a5c0-b0fa766093de-kube-api-access-wr6zs\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.192060 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.192164 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61d0a5bc-ec95-459a-a5c0-b0fa766093de-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.192414 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61d0a5bc-ec95-459a-a5c0-b0fa766093de-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.196860 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xnvx\" (UniqueName: 
\"kubernetes.io/projected/63affb66-9eb5-40ac-9b60-6ff9af511233-kube-api-access-8xnvx\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.196926 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61d0a5bc-ec95-459a-a5c0-b0fa766093de-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.195617 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/61d0a5bc-ec95-459a-a5c0-b0fa766093de-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.196806 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/61d0a5bc-ec95-459a-a5c0-b0fa766093de-env-overrides\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: E0318 06:48:33.192275 4917 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 06:48:33 crc kubenswrapper[4917]: E0318 06:48:33.197066 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs podName:63affb66-9eb5-40ac-9b60-6ff9af511233 nodeName:}" failed. 
No retries permitted until 2026-03-18 06:48:33.697048351 +0000 UTC m=+98.638203075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs") pod "network-metrics-daemon-ww4d6" (UID: "63affb66-9eb5-40ac-9b60-6ff9af511233") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.193847 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zf77h" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.198232 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/61d0a5bc-ec95-459a-a5c0-b0fa766093de-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.208070 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr6zs\" (UniqueName: \"kubernetes.io/projected/61d0a5bc-ec95-459a-a5c0-b0fa766093de-kube-api-access-wr6zs\") pod \"ovnkube-control-plane-749d76644c-p4hmz\" (UID: \"61d0a5bc-ec95-459a-a5c0-b0fa766093de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.224314 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xnvx\" (UniqueName: \"kubernetes.io/projected/63affb66-9eb5-40ac-9b60-6ff9af511233-kube-api-access-8xnvx\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.233508 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pp4xr" 
event={"ID":"4aaf9352-3715-40a0-876a-09f4f27a41c2","Type":"ContainerStarted","Data":"15cf267d6e201b9eed42ce3305396a9e1c1634cdcab53dc137bfd3a40c8a776b"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.233570 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pp4xr" event={"ID":"4aaf9352-3715-40a0-876a-09f4f27a41c2","Type":"ContainerStarted","Data":"cf3a9033bbc87741e15b790d1a15ed37a76ac1f9cafe1814aa8ce5feabad4a3a"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.238219 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd" exitCode=0 Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.238308 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.238359 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"87d91530680849fe3d74c26838d79a41ba50f8ae6cdcebce96233e3593a8a4ad"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.239925 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"f7dccc642dba26d51505fb3c99e7edc3056913afe7dc7e5a4b6671e3bc30ccc1"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.239947 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"8bc780e4080b196acc164a6127722e66896b9364ad9dbfff119e4b793a72e986"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.239955 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"207b2e581bb28f9be0c5b3c80a771c2d82d23b7723a7570638a7678298c53b5d"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.241604 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" event={"ID":"d57962b8-d12d-4a13-b930-ab79563daeb9","Type":"ContainerStarted","Data":"263d832e571713bbc031fe212b61964b1e11349baa2f1200c7b270d5eafc46a6"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.244103 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7wk2h" event={"ID":"a395f4a0-65e4-47aa-81cb-5de11c5c30f2","Type":"ContainerStarted","Data":"abbf4a39c3580514c25840cc108d51d824576cfd7ce26116d6af6a710c4444ab"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.244128 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7wk2h" event={"ID":"a395f4a0-65e4-47aa-81cb-5de11c5c30f2","Type":"ContainerStarted","Data":"f1d5b4d5f5aebd00e45e94faeb163b4be68786df11cfda398022e50068e3ba9e"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.252138 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pp4xr" podStartSLOduration=31.252113508 podStartE2EDuration="31.252113508s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:33.252075437 +0000 UTC m=+98.193230141" watchObservedRunningTime="2026-03-18 06:48:33.252113508 +0000 UTC m=+98.193268232" 
Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.258857 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.258893 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.258903 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.258919 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.258927 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.313664 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7wk2h" podStartSLOduration=31.31363786 podStartE2EDuration="31.31363786s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:33.312013214 +0000 UTC m=+98.253167968" watchObservedRunningTime="2026-03-18 06:48:33.31363786 +0000 UTC m=+98.254792614" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.356389 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.361488 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.361526 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.361537 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.361632 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.361646 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: W0318 06:48:33.371235 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61d0a5bc_ec95_459a_a5c0_b0fa766093de.slice/crio-475b0435c507ea47f851deac7fc4dcc9a312c5a7cf45ce2613d488b12a53ef08 WatchSource:0}: Error finding container 475b0435c507ea47f851deac7fc4dcc9a312c5a7cf45ce2613d488b12a53ef08: Status 404 returned error can't find the container with id 475b0435c507ea47f851deac7fc4dcc9a312c5a7cf45ce2613d488b12a53ef08 Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.464025 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.464068 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.464078 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.464104 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.464116 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.569002 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.569030 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.569037 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.569050 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.569058 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.671325 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.671362 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.671375 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.671394 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.671405 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.701208 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:33 crc kubenswrapper[4917]: E0318 06:48:33.701415 4917 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 06:48:33 crc kubenswrapper[4917]: E0318 06:48:33.701467 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs podName:63affb66-9eb5-40ac-9b60-6ff9af511233 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:34.70144988 +0000 UTC m=+99.642604604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs") pod "network-metrics-daemon-ww4d6" (UID: "63affb66-9eb5-40ac-9b60-6ff9af511233") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.771933 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:33 crc kubenswrapper[4917]: E0318 06:48:33.772083 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.772322 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.772346 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:33 crc kubenswrapper[4917]: E0318 06:48:33.772576 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 06:48:33 crc kubenswrapper[4917]: E0318 06:48:33.772464 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.774097 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.774132 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.774143 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.774157 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.774169 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.876652 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.876837 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.876914 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.876996 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.877104 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.985051 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.985081 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.985089 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.985104 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:33 crc kubenswrapper[4917]: I0318 06:48:33.985115 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:33Z","lastTransitionTime":"2026-03-18T06:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.087897 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.088270 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.088286 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.088321 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.088339 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:34Z","lastTransitionTime":"2026-03-18T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.191844 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.191916 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.191941 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.191975 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.192000 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:34Z","lastTransitionTime":"2026-03-18T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.249396 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" event={"ID":"61d0a5bc-ec95-459a-a5c0-b0fa766093de","Type":"ContainerStarted","Data":"6edc95d5ed91c9d2b6bc53ce1514f8b313067e1a592cc27a1e0a774e6d610766"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.249841 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" event={"ID":"61d0a5bc-ec95-459a-a5c0-b0fa766093de","Type":"ContainerStarted","Data":"35ae2999512ffd25d526f274f9646e7fae8bcd5295733b838c106c7f28ba2416"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.250000 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" event={"ID":"61d0a5bc-ec95-459a-a5c0-b0fa766093de","Type":"ContainerStarted","Data":"475b0435c507ea47f851deac7fc4dcc9a312c5a7cf45ce2613d488b12a53ef08"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.260689 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.260749 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.260770 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" 
event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.260788 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.260806 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.260824 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.263745 4917 generic.go:334] "Generic (PLEG): container finished" podID="d57962b8-d12d-4a13-b930-ab79563daeb9" containerID="f9d9c9ab051e9186677dc2d80b83730110d7a9493ee34988b37acd7010ee3043" exitCode=0 Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.263833 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" event={"ID":"d57962b8-d12d-4a13-b930-ab79563daeb9","Type":"ContainerDied","Data":"f9d9c9ab051e9186677dc2d80b83730110d7a9493ee34988b37acd7010ee3043"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.267276 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podStartSLOduration=32.267262299 podStartE2EDuration="32.267262299s" podCreationTimestamp="2026-03-18 
06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:33.332123779 +0000 UTC m=+98.273278493" watchObservedRunningTime="2026-03-18 06:48:34.267262299 +0000 UTC m=+99.208417043" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.268152 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zf77h" event={"ID":"74c8afa8-e378-494a-9c03-5810fc81744e","Type":"ContainerStarted","Data":"5f901fa272a02353304a361ff62d6ed07008ce802dcab804cd1b035bec361c31"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.268204 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zf77h" event={"ID":"74c8afa8-e378-494a-9c03-5810fc81744e","Type":"ContainerStarted","Data":"fba1109a00c8358b348afe75a70b6012c1a76740734e8bfb1a7aa83fe559e5c4"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.295433 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.295793 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.295883 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.295964 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.296042 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:34Z","lastTransitionTime":"2026-03-18T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.299655 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-p4hmz" podStartSLOduration=31.299632571 podStartE2EDuration="31.299632571s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:34.267238028 +0000 UTC m=+99.208392822" watchObservedRunningTime="2026-03-18 06:48:34.299632571 +0000 UTC m=+99.240787315" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.318778 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zf77h" podStartSLOduration=32.318748944 podStartE2EDuration="32.318748944s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:34.318416187 +0000 UTC m=+99.259570971" watchObservedRunningTime="2026-03-18 06:48:34.318748944 +0000 UTC m=+99.259903698" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.398407 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.398461 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.398477 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.398500 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 
06:48:34.398515 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:34Z","lastTransitionTime":"2026-03-18T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.435148 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.435199 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.435216 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.435242 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.435260 4917 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-18T06:48:34Z","lastTransitionTime":"2026-03-18T06:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.491013 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"]
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.491451 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.493846 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.494278 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.494278 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.494369 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.510658 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/be684e38-23d2-47e0-9904-e1beaca97722-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.510769 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be684e38-23d2-47e0-9904-e1beaca97722-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.510823 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be684e38-23d2-47e0-9904-e1beaca97722-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.510979 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be684e38-23d2-47e0-9904-e1beaca97722-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.511046 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/be684e38-23d2-47e0-9904-e1beaca97722-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.612377 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/be684e38-23d2-47e0-9904-e1beaca97722-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.612534 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/be684e38-23d2-47e0-9904-e1beaca97722-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.612550 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be684e38-23d2-47e0-9904-e1beaca97722-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.612661 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be684e38-23d2-47e0-9904-e1beaca97722-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.612773 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be684e38-23d2-47e0-9904-e1beaca97722-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.612824 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/be684e38-23d2-47e0-9904-e1beaca97722-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.612901 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/be684e38-23d2-47e0-9904-e1beaca97722-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.613399 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/be684e38-23d2-47e0-9904-e1beaca97722-service-ca\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.620800 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be684e38-23d2-47e0-9904-e1beaca97722-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.634207 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/be684e38-23d2-47e0-9904-e1beaca97722-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-zkkw9\" (UID: \"be684e38-23d2-47e0-9904-e1beaca97722\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.656501 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9"
Mar 18 06:48:34 crc kubenswrapper[4917]: W0318 06:48:34.673731 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe684e38_23d2_47e0_9904_e1beaca97722.slice/crio-b9739f1d1867a13ede505bdfe510dad7fdcba294da449671f5d7ebe6927760c0 WatchSource:0}: Error finding container b9739f1d1867a13ede505bdfe510dad7fdcba294da449671f5d7ebe6927760c0: Status 404 returned error can't find the container with id b9739f1d1867a13ede505bdfe510dad7fdcba294da449671f5d7ebe6927760c0
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.713448 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6"
Mar 18 06:48:34 crc kubenswrapper[4917]: E0318 06:48:34.713763 4917 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 06:48:34 crc kubenswrapper[4917]: E0318 06:48:34.713829 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs podName:63affb66-9eb5-40ac-9b60-6ff9af511233 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:36.713815108 +0000 UTC m=+101.654969822 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs") pod "network-metrics-daemon-ww4d6" (UID: "63affb66-9eb5-40ac-9b60-6ff9af511233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.760666 4917 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.773084 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ww4d6"
Mar 18 06:48:34 crc kubenswrapper[4917]: E0318 06:48:34.773248 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ww4d6" podUID="63affb66-9eb5-40ac-9b60-6ff9af511233"
Mar 18 06:48:34 crc kubenswrapper[4917]: I0318 06:48:34.776907 4917 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 18 06:48:35 crc kubenswrapper[4917]: I0318 06:48:35.272875 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9" event={"ID":"be684e38-23d2-47e0-9904-e1beaca97722","Type":"ContainerStarted","Data":"424b4efc2a580a3ecfa99d4583ca1b92efa3f478f89250b6920c9663c56cec56"}
Mar 18 06:48:35 crc kubenswrapper[4917]: I0318 06:48:35.273458 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9" event={"ID":"be684e38-23d2-47e0-9904-e1beaca97722","Type":"ContainerStarted","Data":"b9739f1d1867a13ede505bdfe510dad7fdcba294da449671f5d7ebe6927760c0"}
Mar 18 06:48:35 crc kubenswrapper[4917]: I0318 06:48:35.275927 4917 generic.go:334] "Generic (PLEG): container finished" podID="d57962b8-d12d-4a13-b930-ab79563daeb9" containerID="2726a20a5b53e8435559d8bd20cd05e04d963808e99947ee124fab4620b88d28" exitCode=0
Mar 18 06:48:35 crc kubenswrapper[4917]: I0318 06:48:35.276056 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" event={"ID":"d57962b8-d12d-4a13-b930-ab79563daeb9","Type":"ContainerDied","Data":"2726a20a5b53e8435559d8bd20cd05e04d963808e99947ee124fab4620b88d28"}
Mar 18 06:48:35 crc kubenswrapper[4917]: I0318 06:48:35.304468 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-zkkw9" podStartSLOduration=33.304438399 podStartE2EDuration="33.304438399s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:35.302455124 +0000 UTC m=+100.243609918" watchObservedRunningTime="2026-03-18 06:48:35.304438399 +0000 UTC m=+100.245593143"
Mar 18 06:48:35 crc kubenswrapper[4917]: I0318 06:48:35.772404 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 06:48:35 crc kubenswrapper[4917]: I0318 06:48:35.772439 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 06:48:35 crc kubenswrapper[4917]: E0318 06:48:35.773786 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 06:48:35 crc kubenswrapper[4917]: I0318 06:48:35.774001 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 06:48:35 crc kubenswrapper[4917]: E0318 06:48:35.774106 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 06:48:35 crc kubenswrapper[4917]: E0318 06:48:35.774295 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 06:48:36 crc kubenswrapper[4917]: I0318 06:48:36.280989 4917 generic.go:334] "Generic (PLEG): container finished" podID="d57962b8-d12d-4a13-b930-ab79563daeb9" containerID="095945239d4b63af9a44d7e6a949b5b4b954a70e36061906558051b4c9ae7194" exitCode=0
Mar 18 06:48:36 crc kubenswrapper[4917]: I0318 06:48:36.281070 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" event={"ID":"d57962b8-d12d-4a13-b930-ab79563daeb9","Type":"ContainerDied","Data":"095945239d4b63af9a44d7e6a949b5b4b954a70e36061906558051b4c9ae7194"}
Mar 18 06:48:36 crc kubenswrapper[4917]: I0318 06:48:36.294639 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"}
Mar 18 06:48:36 crc kubenswrapper[4917]: I0318 06:48:36.753031 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6"
Mar 18 06:48:36 crc kubenswrapper[4917]: E0318 06:48:36.753208 4917 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 06:48:36 crc kubenswrapper[4917]: E0318 06:48:36.753269 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs podName:63affb66-9eb5-40ac-9b60-6ff9af511233 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:40.753250448 +0000 UTC m=+105.694405162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs") pod "network-metrics-daemon-ww4d6" (UID: "63affb66-9eb5-40ac-9b60-6ff9af511233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 06:48:36 crc kubenswrapper[4917]: I0318 06:48:36.772017 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ww4d6"
Mar 18 06:48:36 crc kubenswrapper[4917]: E0318 06:48:36.772164 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ww4d6" podUID="63affb66-9eb5-40ac-9b60-6ff9af511233"
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.300977 4917 generic.go:334] "Generic (PLEG): container finished" podID="d57962b8-d12d-4a13-b930-ab79563daeb9" containerID="d60538e22131ea324ef9b95dfc939ff0288a412e681dd352aa4e982e4df31f5a" exitCode=0
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.301027 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" event={"ID":"d57962b8-d12d-4a13-b930-ab79563daeb9","Type":"ContainerDied","Data":"d60538e22131ea324ef9b95dfc939ff0288a412e681dd352aa4e982e4df31f5a"}
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.459668 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.459855 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.459819362 +0000 UTC m=+118.400974106 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.460334 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.460426 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.460546 4917 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.460633 4917 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.460682 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.460657092 +0000 UTC m=+118.401811846 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.460724 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.460699073 +0000 UTC m=+118.401853867 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.561847 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.561908 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.562073 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.562090 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.562103 4917 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.562146 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.56213172 +0000 UTC m=+118.503286444 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.562458 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.562478 4917 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.562488 4917 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.562517 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.562507658 +0000 UTC m=+118.503662382 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.772240 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.772399 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.772405 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 06:48:37 crc kubenswrapper[4917]: I0318 06:48:37.772455 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.772552 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 06:48:37 crc kubenswrapper[4917]: E0318 06:48:37.772751 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 06:48:38 crc kubenswrapper[4917]: I0318 06:48:38.307296 4917 generic.go:334] "Generic (PLEG): container finished" podID="d57962b8-d12d-4a13-b930-ab79563daeb9" containerID="2dc5a97b01369ff37faed0dd950482b7b5ea907eb945ce378bc1f9a1c6a605dc" exitCode=0
Mar 18 06:48:38 crc kubenswrapper[4917]: I0318 06:48:38.307352 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" event={"ID":"d57962b8-d12d-4a13-b930-ab79563daeb9","Type":"ContainerDied","Data":"2dc5a97b01369ff37faed0dd950482b7b5ea907eb945ce378bc1f9a1c6a605dc"}
Mar 18 06:48:38 crc kubenswrapper[4917]: I0318 06:48:38.771719 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ww4d6"
Mar 18 06:48:38 crc kubenswrapper[4917]: E0318 06:48:38.772254 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ww4d6" podUID="63affb66-9eb5-40ac-9b60-6ff9af511233"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.313863 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerStarted","Data":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"}
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.314618 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.314650 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.314659 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.316697 4917 generic.go:334] "Generic (PLEG): container finished" podID="d57962b8-d12d-4a13-b930-ab79563daeb9" containerID="4831f968e569ee3d58a7cee779b6e33f7f9526b396425f04d3fcc290135302e3" exitCode=0
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.316734 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" event={"ID":"d57962b8-d12d-4a13-b930-ab79563daeb9","Type":"ContainerDied","Data":"4831f968e569ee3d58a7cee779b6e33f7f9526b396425f04d3fcc290135302e3"}
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.347353 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.347694 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.381488 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podStartSLOduration=36.381469076 podStartE2EDuration="36.381469076s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:39.356749306 +0000 UTC m=+104.297904020" watchObservedRunningTime="2026-03-18 06:48:39.381469076 +0000 UTC m=+104.322623790"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.773986 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 06:48:39 crc kubenswrapper[4917]: E0318 06:48:39.774456 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.774081 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 06:48:39 crc kubenswrapper[4917]: I0318 06:48:39.776004 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 06:48:39 crc kubenswrapper[4917]: E0318 06:48:39.775937 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 06:48:39 crc kubenswrapper[4917]: E0318 06:48:39.776241 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 06:48:40 crc kubenswrapper[4917]: I0318 06:48:40.322873 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" event={"ID":"d57962b8-d12d-4a13-b930-ab79563daeb9","Type":"ContainerStarted","Data":"d85fc34b29b7b8b4dfd3d5783b2c322451c7e9fa54865e4588f446a6d94e1eb9"}
Mar 18 06:48:40 crc kubenswrapper[4917]: I0318 06:48:40.353742 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g5sf8" podStartSLOduration=38.35349438 podStartE2EDuration="38.35349438s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:40.352650722 +0000 UTC m=+105.293805456" watchObservedRunningTime="2026-03-18 06:48:40.35349438 +0000 UTC m=+105.294649094"
Mar 18 06:48:40 crc kubenswrapper[4917]: I0318 06:48:40.730500 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ww4d6"]
Mar 18 06:48:40 crc kubenswrapper[4917]: I0318 06:48:40.730747 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ww4d6"
Mar 18 06:48:40 crc kubenswrapper[4917]: E0318 06:48:40.730948 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ww4d6" podUID="63affb66-9eb5-40ac-9b60-6ff9af511233"
Mar 18 06:48:40 crc kubenswrapper[4917]: I0318 06:48:40.798412 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6"
Mar 18 06:48:40 crc kubenswrapper[4917]: E0318 06:48:40.798811 4917 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 06:48:40 crc kubenswrapper[4917]: E0318 06:48:40.798903 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs podName:63affb66-9eb5-40ac-9b60-6ff9af511233 nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.798874073 +0000 UTC m=+113.740028827 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs") pod "network-metrics-daemon-ww4d6" (UID: "63affb66-9eb5-40ac-9b60-6ff9af511233") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 18 06:48:41 crc kubenswrapper[4917]: I0318 06:48:41.772922 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 06:48:41 crc kubenswrapper[4917]: I0318 06:48:41.772983 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 18 06:48:41 crc kubenswrapper[4917]: I0318 06:48:41.772931 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 18 06:48:41 crc kubenswrapper[4917]: E0318 06:48:41.773111 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 18 06:48:41 crc kubenswrapper[4917]: E0318 06:48:41.773214 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 18 06:48:41 crc kubenswrapper[4917]: E0318 06:48:41.773315 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 18 06:48:42 crc kubenswrapper[4917]: I0318 06:48:42.771571 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:42 crc kubenswrapper[4917]: E0318 06:48:42.771755 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ww4d6" podUID="63affb66-9eb5-40ac-9b60-6ff9af511233" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.387754 4917 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.387908 4917 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.436143 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lgc4l"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.436597 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.436971 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.437349 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.437855 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.438956 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.444621 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cm4kv"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.445164 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.445470 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.445566 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.460054 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.460090 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.460871 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.461408 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.461811 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.462359 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.462576 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.462834 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.462866 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.463469 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.464306 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h5mvm"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.464348 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.464864 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.467175 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.468426 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.468788 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.468914 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.469876 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.470180 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.470349 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l"] Mar 18 
06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.470968 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.471891 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.472158 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.472380 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.472534 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.472644 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.472843 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.472955 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zk4nl"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.473021 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.473186 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.473358 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.506702 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zk4nl" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.508351 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-sgj9g"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.508927 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.508945 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.509148 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.509911 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.509950 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.510104 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.510672 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.510949 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.511037 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.511774 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.511986 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539034 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539072 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4fb3fe10-f693-4e74-b72a-f2b10dac3580-node-pullsecrets\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539094 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/65334d30-4340-47c4-b333-1d41d0c2869b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sbj9l\" (UID: \"65334d30-4340-47c4-b333-1d41d0c2869b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539111 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: 
\"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539125 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-encryption-config\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539150 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8th7\" (UniqueName: \"kubernetes.io/projected/cb1793e3-4625-4f46-8be2-d75629f46371-kube-api-access-n8th7\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539164 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47j5n\" (UniqueName: \"kubernetes.io/projected/4f086875-421c-4218-91f3-1349dcc3247d-kube-api-access-47j5n\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539178 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwmh6\" (UniqueName: \"kubernetes.io/projected/65334d30-4340-47c4-b333-1d41d0c2869b-kube-api-access-hwmh6\") pod \"cluster-samples-operator-665b6dd947-sbj9l\" (UID: \"65334d30-4340-47c4-b333-1d41d0c2869b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539192 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-446w2\" (UniqueName: \"kubernetes.io/projected/f4c8c2e4-554f-423e-81a2-cd9f63eb7250-kube-api-access-446w2\") pod \"downloads-7954f5f757-zk4nl\" (UID: \"f4c8c2e4-554f-423e-81a2-cd9f63eb7250\") " pod="openshift-console/downloads-7954f5f757-zk4nl" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539207 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4fb3fe10-f693-4e74-b72a-f2b10dac3580-audit-dir\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539222 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-audit-policies\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539236 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb577\" (UniqueName: \"kubernetes.io/projected/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-kube-api-access-mb577\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539251 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8txzs\" (UniqueName: \"kubernetes.io/projected/76c2512e-6f87-4731-a42f-eddd2188ff59-kube-api-access-8txzs\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539268 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539282 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539297 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-images\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539314 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539332 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/4f086875-421c-4218-91f3-1349dcc3247d-audit-dir\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539346 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-client-ca\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539360 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c2512e-6f87-4731-a42f-eddd2188ff59-serving-cert\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539379 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtqqr\" (UniqueName: \"kubernetes.io/projected/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-kube-api-access-wtqqr\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539394 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-image-import-ca\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " 
pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539409 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkgch\" (UniqueName: \"kubernetes.io/projected/4fb3fe10-f693-4e74-b72a-f2b10dac3580-kube-api-access-qkgch\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539441 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb1793e3-4625-4f46-8be2-d75629f46371-serving-cert\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539457 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-etcd-serving-ca\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539470 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-etcd-client\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539486 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-config\") pod 
\"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539502 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-audit\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539516 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-config\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539537 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-serving-cert\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539551 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-etcd-client\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539567 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-service-ca-bundle\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539602 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-config\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539625 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-config\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539642 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539663 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24ggp\" (UniqueName: \"kubernetes.io/projected/230964d0-ab50-4d70-8379-3250d17aae00-kube-api-access-24ggp\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539681 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-serving-cert\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539708 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539726 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/230964d0-ab50-4d70-8379-3250d17aae00-serving-cert\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.539957 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.540008 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-encryption-config\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.540041 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-client-ca\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.540068 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-config\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.540215 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nbchd"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.540559 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.540614 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.540949 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nkz5k"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.541258 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.541569 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.541625 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.541722 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.541770 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.541837 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.541994 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.542067 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.542276 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.542138 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.542846 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.542872 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.542977 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.543411 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hfll9"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.544096 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.544269 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9xdsf"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.544692 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.545136 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6x6mh"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.545602 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6x6mh"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.546826 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.547359 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.549711 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.550129 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.550310 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-66xqj"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.550902 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-66xqj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.551263 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.551850 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.551896 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.552212 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.553118 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.560939 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.561788 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.561883 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.562093 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.563767 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.564557 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.564751 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.565104 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.565308 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.571293 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.571496 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.571774 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.571953 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.572096 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.572422 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.572630 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.572827 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.573749 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.574270 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.579053 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.579233 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.579439 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.579886 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.581694 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.585839 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.586179 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.586188 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.588894 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.591268 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.591993 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.592166 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.592600 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.592720 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.593100 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.593493 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594128 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594242 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594349 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594373 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594460 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594561 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594682 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594706 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.594778 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.607888 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.608257 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.608459 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.608481 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.608659 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.615299 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.615821 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.616411 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.616769 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.616975 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.617235 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.617578 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.617916 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.618577 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.618933 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.619223 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.619603 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.619993 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.620124 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.642240 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.643895 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2"]
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.643988 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.644450 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.644842 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645559 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtqqr\" (UniqueName: \"kubernetes.io/projected/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-kube-api-access-wtqqr\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645604 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-image-import-ca\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645627 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkgch\" (UniqueName: \"kubernetes.io/projected/4fb3fe10-f693-4e74-b72a-f2b10dac3580-kube-api-access-qkgch\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645655 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb1793e3-4625-4f46-8be2-d75629f46371-serving-cert\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645673 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-etcd-client\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645689 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-etcd-serving-ca\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645708 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-config\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645755 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-oauth-config\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645776 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-console-config\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645796 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqc4x\" (UniqueName: \"kubernetes.io/projected/42365ba9-ba85-47ed-a93c-543cd2ce8d30-kube-api-access-zqc4x\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645818 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-config\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645841 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-audit\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645871 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scqk7\" (UniqueName: \"kubernetes.io/projected/7c21e973-7d87-496c-81ba-1425ba599774-kube-api-access-scqk7\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.645913 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-serving-cert\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646216 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-service-ca\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646244 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-etcd-client\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646265 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-service-ca-bundle\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646288 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-proxy-tls\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646310 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25gpj\" (UniqueName: \"kubernetes.io/projected/b868698f-400d-43b7-ba5a-190668ef2e9e-kube-api-access-25gpj\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646362 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42365ba9-ba85-47ed-a93c-543cd2ce8d30-trusted-ca\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646388 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-config\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646409 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-serving-cert\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646439 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b868698f-400d-43b7-ba5a-190668ef2e9e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646465 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42365ba9-ba85-47ed-a93c-543cd2ce8d30-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646489 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-config\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646510 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646530 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24ggp\" (UniqueName: \"kubernetes.io/projected/230964d0-ab50-4d70-8379-3250d17aae00-kube-api-access-24ggp\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646550 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b868698f-400d-43b7-ba5a-190668ef2e9e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646571 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b868698f-400d-43b7-ba5a-190668ef2e9e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646604 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-client\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646684 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-serving-cert\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646707 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646726 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-service-ca\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646746 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-images\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646789 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/230964d0-ab50-4d70-8379-3250d17aae00-serving-cert\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646818 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2"
Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646848 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName:
\"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-encryption-config\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646868 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-config\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646884 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-client-ca\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.646975 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.647004 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4fb3fe10-f693-4e74-b72a-f2b10dac3580-node-pullsecrets\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.647036 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/65334d30-4340-47c4-b333-1d41d0c2869b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sbj9l\" (UID: \"65334d30-4340-47c4-b333-1d41d0c2869b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.647054 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.674321 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.676425 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.676533 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.677471 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.677894 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.678725 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.678855 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-image-import-ca\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679041 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-config\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679138 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679403 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679630 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679630 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-encryption-config\") pod \"apiserver-76f77b778f-cm4kv\" (UID: 
\"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679683 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-trusted-ca-bundle\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679724 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8th7\" (UniqueName: \"kubernetes.io/projected/cb1793e3-4625-4f46-8be2-d75629f46371-kube-api-access-n8th7\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679740 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47j5n\" (UniqueName: \"kubernetes.io/projected/4f086875-421c-4218-91f3-1349dcc3247d-kube-api-access-47j5n\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679756 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.680228 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.680770 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.681138 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.681552 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-client-ca\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.681622 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.679758 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt9v9\" (UniqueName: \"kubernetes.io/projected/726838f6-bf51-40d1-83ac-c55ef671bf16-kube-api-access-qt9v9\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.682100 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwmh6\" (UniqueName: \"kubernetes.io/projected/65334d30-4340-47c4-b333-1d41d0c2869b-kube-api-access-hwmh6\") pod \"cluster-samples-operator-665b6dd947-sbj9l\" (UID: \"65334d30-4340-47c4-b333-1d41d0c2869b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.682127 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-446w2\" (UniqueName: \"kubernetes.io/projected/f4c8c2e4-554f-423e-81a2-cd9f63eb7250-kube-api-access-446w2\") pod \"downloads-7954f5f757-zk4nl\" (UID: \"f4c8c2e4-554f-423e-81a2-cd9f63eb7250\") " pod="openshift-console/downloads-7954f5f757-zk4nl" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.682241 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4fb3fe10-f693-4e74-b72a-f2b10dac3580-audit-dir\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.682289 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-audit-policies\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.682310 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb577\" (UniqueName: \"kubernetes.io/projected/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-kube-api-access-mb577\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.682342 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-oauth-serving-cert\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.682361 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8txzs\" (UniqueName: \"kubernetes.io/projected/76c2512e-6f87-4731-a42f-eddd2188ff59-kube-api-access-8txzs\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.682376 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/726838f6-bf51-40d1-83ac-c55ef671bf16-serving-cert\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 
06:48:43.682992 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb1793e3-4625-4f46-8be2-d75629f46371-serving-cert\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.683166 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-etcd-serving-ca\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.683829 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-config\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.684807 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4fb3fe10-f693-4e74-b72a-f2b10dac3580-audit-dir\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.685388 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-audit-policies\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.685496 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-service-ca-bundle\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.685514 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.685541 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8v8rg"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.685944 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k4f7n"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.685970 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-config\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.686290 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/230964d0-ab50-4d70-8379-3250d17aae00-serving-cert\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.686393 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-serving-cert\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.686400 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.686441 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42365ba9-ba85-47ed-a93c-543cd2ce8d30-metrics-tls\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.686481 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-images\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.686508 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 
06:48:43.686532 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.686594 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-config\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.686619 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s77lw\" (UniqueName: \"kubernetes.io/projected/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-kube-api-access-s77lw\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.687598 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.688095 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-images\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.689029 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/230964d0-ab50-4d70-8379-3250d17aae00-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.689291 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-config\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.689363 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f086875-421c-4218-91f3-1349dcc3247d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.689957 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-serving-cert\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.690482 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4fb3fe10-f693-4e74-b72a-f2b10dac3580-audit\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " 
pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.690606 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4fb3fe10-f693-4e74-b72a-f2b10dac3580-node-pullsecrets\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.691518 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.692361 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lgc4l"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.692442 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.692799 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.694060 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-encryption-config\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.695645 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-ca\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.695708 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f086875-421c-4218-91f3-1349dcc3247d-audit-dir\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.695760 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.695799 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4f086875-421c-4218-91f3-1349dcc3247d-audit-dir\") pod 
\"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.695842 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-client-ca\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.695886 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c2512e-6f87-4731-a42f-eddd2188ff59-serving-cert\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.696389 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4fb3fe10-f693-4e74-b72a-f2b10dac3580-etcd-client\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.696706 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-client-ca\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.697116 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-etcd-client\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.697857 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.698779 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-config\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.706812 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4f086875-421c-4218-91f3-1349dcc3247d-encryption-config\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.707511 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c2512e-6f87-4731-a42f-eddd2188ff59-serving-cert\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.707856 4917 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.708123 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.708460 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.712076 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5dkf8"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.712752 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5dkf8" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.712758 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/65334d30-4340-47c4-b333-1d41d0c2869b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-sbj9l\" (UID: \"65334d30-4340-47c4-b333-1d41d0c2869b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.715748 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.715944 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.716504 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.716864 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.717519 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.717665 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.720668 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.721543 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.722128 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sgj9g"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.724163 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.726381 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hfll9"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.727996 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zk4nl"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.729111 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.730064 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-h5mvm"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.736954 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.737530 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.737789 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-plhlj"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.738373 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.738545 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.738680 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.739099 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.739421 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cm4kv"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.740330 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gcd99"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.741047 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.741666 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.742200 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.742354 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.743790 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nbchd"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.744843 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.746010 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.746652 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.747516 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.748342 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.750138 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nkz5k"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.750897 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.751853 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.752690 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.753511 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.754517 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-874w9"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.755455 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d6cp4"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.755972 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.756316 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.756393 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.757352 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8v8rg"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.758535 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.758597 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.759567 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k4f7n"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.760479 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.761404 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.762324 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.763293 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-58897d9998-6x6mh"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.764113 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.765036 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9xdsf"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.765908 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.766782 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-874w9"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.767590 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6cp4"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.768406 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.769260 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5dkf8"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.770229 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gcd99"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.771309 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qtcrk"] Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.771835 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.771880 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.771916 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.772043 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.783536 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797084 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-proxy-tls\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797113 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25gpj\" (UniqueName: \"kubernetes.io/projected/b868698f-400d-43b7-ba5a-190668ef2e9e-kube-api-access-25gpj\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797130 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/42365ba9-ba85-47ed-a93c-543cd2ce8d30-trusted-ca\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797151 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-serving-cert\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797168 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b868698f-400d-43b7-ba5a-190668ef2e9e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797183 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42365ba9-ba85-47ed-a93c-543cd2ce8d30-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797206 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b868698f-400d-43b7-ba5a-190668ef2e9e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 
06:48:43.797226 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b868698f-400d-43b7-ba5a-190668ef2e9e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797244 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-client\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797269 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-service-ca\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797284 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-images\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797307 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-trusted-ca-bundle\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 
crc kubenswrapper[4917]: I0318 06:48:43.797335 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt9v9\" (UniqueName: \"kubernetes.io/projected/726838f6-bf51-40d1-83ac-c55ef671bf16-kube-api-access-qt9v9\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797373 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-oauth-serving-cert\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797394 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/726838f6-bf51-40d1-83ac-c55ef671bf16-serving-cert\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797409 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42365ba9-ba85-47ed-a93c-543cd2ce8d30-metrics-tls\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797429 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-config\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 
06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797445 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s77lw\" (UniqueName: \"kubernetes.io/projected/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-kube-api-access-s77lw\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797460 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-ca\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797478 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797524 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqc4x\" (UniqueName: \"kubernetes.io/projected/42365ba9-ba85-47ed-a93c-543cd2ce8d30-kube-api-access-zqc4x\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797540 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-oauth-config\") pod 
\"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797554 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-console-config\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797571 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scqk7\" (UniqueName: \"kubernetes.io/projected/7c21e973-7d87-496c-81ba-1425ba599774-kube-api-access-scqk7\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.797636 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-service-ca\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.798541 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-images\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.798692 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b868698f-400d-43b7-ba5a-190668ef2e9e-trusted-ca\") pod 
\"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.798720 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-trusted-ca-bundle\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.799232 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.799423 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.799874 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-serving-cert\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.799889 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-service-ca\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc 
kubenswrapper[4917]: I0318 06:48:43.800383 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-ca\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.800668 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-console-config\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.800938 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-oauth-serving-cert\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.801059 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-config\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.801186 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-service-ca\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.803085 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/726838f6-bf51-40d1-83ac-c55ef671bf16-etcd-client\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.803129 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/b868698f-400d-43b7-ba5a-190668ef2e9e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.803645 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-oauth-config\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.805120 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/726838f6-bf51-40d1-83ac-c55ef671bf16-serving-cert\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.818716 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.841005 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.852605 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-proxy-tls\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.859344 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.880517 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.899547 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.920437 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.941049 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.961249 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.980309 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 06:48:43 crc kubenswrapper[4917]: I0318 06:48:43.999421 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.019159 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 06:48:44 crc 
kubenswrapper[4917]: I0318 06:48:44.038931 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.059141 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.078712 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.099547 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.120636 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.133856 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/42365ba9-ba85-47ed-a93c-543cd2ce8d30-metrics-tls\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.150192 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.160181 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.160304 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/42365ba9-ba85-47ed-a93c-543cd2ce8d30-trusted-ca\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.180029 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.200294 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.221110 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.240520 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.260294 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.280647 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.300065 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.319729 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.339825 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.360518 4917 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.381021 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.399767 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.420132 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.439278 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.460231 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.480736 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.500269 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.560561 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.580344 4917 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.627263 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkgch\" (UniqueName: \"kubernetes.io/projected/4fb3fe10-f693-4e74-b72a-f2b10dac3580-kube-api-access-qkgch\") pod \"apiserver-76f77b778f-cm4kv\" (UID: \"4fb3fe10-f693-4e74-b72a-f2b10dac3580\") " pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.639211 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.641238 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtqqr\" (UniqueName: \"kubernetes.io/projected/1839ae9a-5ae3-48fb-912b-93a4242a3ffa-kube-api-access-wtqqr\") pod \"openshift-apiserver-operator-796bbdcf4f-mfbc2\" (UID: \"1839ae9a-5ae3-48fb-912b-93a4242a3ffa\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.661153 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.680311 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.699316 4917 request.go:700] Waited for 1.018699971s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/serviceaccounts/openshift-controller-manager-sa/token Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.708202 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.726563 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8th7\" (UniqueName: \"kubernetes.io/projected/cb1793e3-4625-4f46-8be2-d75629f46371-kube-api-access-n8th7\") pod \"controller-manager-879f6c89f-lgc4l\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.739571 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47j5n\" (UniqueName: \"kubernetes.io/projected/4f086875-421c-4218-91f3-1349dcc3247d-kube-api-access-47j5n\") pod \"apiserver-7bbb656c7d-fj4nz\" (UID: \"4f086875-421c-4218-91f3-1349dcc3247d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.739712 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.761018 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.771732 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.771912 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.779974 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.799844 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.821080 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.841183 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.894732 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwmh6\" (UniqueName: \"kubernetes.io/projected/65334d30-4340-47c4-b333-1d41d0c2869b-kube-api-access-hwmh6\") pod \"cluster-samples-operator-665b6dd947-sbj9l\" (UID: \"65334d30-4340-47c4-b333-1d41d0c2869b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.907519 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-446w2\" (UniqueName: \"kubernetes.io/projected/f4c8c2e4-554f-423e-81a2-cd9f63eb7250-kube-api-access-446w2\") pod \"downloads-7954f5f757-zk4nl\" (UID: \"f4c8c2e4-554f-423e-81a2-cd9f63eb7250\") " pod="openshift-console/downloads-7954f5f757-zk4nl" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.926765 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24ggp\" (UniqueName: \"kubernetes.io/projected/230964d0-ab50-4d70-8379-3250d17aae00-kube-api-access-24ggp\") pod 
\"authentication-operator-69f744f599-mgnbj\" (UID: \"230964d0-ab50-4d70-8379-3250d17aae00\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.939205 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb577\" (UniqueName: \"kubernetes.io/projected/9d08595e-c5f0-49cd-bc6c-5e248bcc76e7-kube-api-access-mb577\") pod \"machine-api-operator-5694c8668f-h5mvm\" (UID: \"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.952833 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8txzs\" (UniqueName: \"kubernetes.io/projected/76c2512e-6f87-4731-a42f-eddd2188ff59-kube-api-access-8txzs\") pod \"route-controller-manager-6576b87f9c-kpdbw\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.956574 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.961751 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.969916 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:44 crc kubenswrapper[4917]: I0318 06:48:44.979972 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.001345 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.019832 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.032305 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2"] Mar 18 06:48:45 crc kubenswrapper[4917]: W0318 06:48:45.033236 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1839ae9a_5ae3_48fb_912b_93a4242a3ffa.slice/crio-945592b3aa515b3a4827f839ff2131221f6ca8882ca36a6d3275e5956673d138 WatchSource:0}: Error finding container 945592b3aa515b3a4827f839ff2131221f6ca8882ca36a6d3275e5956673d138: Status 404 returned error can't find the container with id 945592b3aa515b3a4827f839ff2131221f6ca8882ca36a6d3275e5956673d138 Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.041895 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.045027 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cm4kv"] Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.051966 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" Mar 18 06:48:45 crc kubenswrapper[4917]: W0318 06:48:45.059730 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fb3fe10_f693_4e74_b72a_f2b10dac3580.slice/crio-d7b963c60a4f33b0d64fbeeaf222f4414e2d345fa247a20690fdf1dc00de596d WatchSource:0}: Error finding container d7b963c60a4f33b0d64fbeeaf222f4414e2d345fa247a20690fdf1dc00de596d: Status 404 returned error can't find the container with id d7b963c60a4f33b0d64fbeeaf222f4414e2d345fa247a20690fdf1dc00de596d Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.060311 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.064771 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.079762 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.090243 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.100879 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.101907 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.113819 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zk4nl" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.119831 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.140949 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.160141 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.181624 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.201714 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.203930 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz"] Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.221019 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.244027 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.249570 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lgc4l"] Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.260862 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 06:48:45 crc 
kubenswrapper[4917]: I0318 06:48:45.279674 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 18 06:48:45 crc kubenswrapper[4917]: W0318 06:48:45.291324 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb1793e3_4625_4f46_8be2_d75629f46371.slice/crio-e30bd341967e23184c74d64f8e7d959637d8e67976efa1af028acc0c2102a7cf WatchSource:0}: Error finding container e30bd341967e23184c74d64f8e7d959637d8e67976efa1af028acc0c2102a7cf: Status 404 returned error can't find the container with id e30bd341967e23184c74d64f8e7d959637d8e67976efa1af028acc0c2102a7cf Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.299229 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.321254 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.340229 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.348960 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" event={"ID":"4f086875-421c-4218-91f3-1349dcc3247d","Type":"ContainerStarted","Data":"3793a0a2c5e8e74fe5f4d61ecd38ccf997fea5aabc9acdc5fb9642e84338eb4f"} Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.353282 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" event={"ID":"1839ae9a-5ae3-48fb-912b-93a4242a3ffa","Type":"ContainerStarted","Data":"9b1d85f80ce193d1ce89e3969763158d472865e57455a2f921ca82b3d65e3c17"} Mar 18 06:48:45 crc 
kubenswrapper[4917]: I0318 06:48:45.353328 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" event={"ID":"1839ae9a-5ae3-48fb-912b-93a4242a3ffa","Type":"ContainerStarted","Data":"945592b3aa515b3a4827f839ff2131221f6ca8882ca36a6d3275e5956673d138"} Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.354358 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" event={"ID":"4fb3fe10-f693-4e74-b72a-f2b10dac3580","Type":"ContainerStarted","Data":"d7b963c60a4f33b0d64fbeeaf222f4414e2d345fa247a20690fdf1dc00de596d"} Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.355291 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" event={"ID":"cb1793e3-4625-4f46-8be2-d75629f46371","Type":"ContainerStarted","Data":"e30bd341967e23184c74d64f8e7d959637d8e67976efa1af028acc0c2102a7cf"} Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.359377 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.369507 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zk4nl"] Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.380315 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.399766 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.419946 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.434171 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-api/machine-api-operator-5694c8668f-h5mvm"] Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.439978 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 06:48:45 crc kubenswrapper[4917]: W0318 06:48:45.449654 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d08595e_c5f0_49cd_bc6c_5e248bcc76e7.slice/crio-b02f79872b18f3aaff1e5ace61e4eb206dc77ccb186c36ea9effecca862ba472 WatchSource:0}: Error finding container b02f79872b18f3aaff1e5ace61e4eb206dc77ccb186c36ea9effecca862ba472: Status 404 returned error can't find the container with id b02f79872b18f3aaff1e5ace61e4eb206dc77ccb186c36ea9effecca862ba472 Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.459361 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.480070 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.501021 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.520060 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.524357 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw"] Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.530840 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l"] Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.540047 4917 
reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.559302 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.568473 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mgnbj"] Mar 18 06:48:45 crc kubenswrapper[4917]: W0318 06:48:45.573432 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76c2512e_6f87_4731_a42f_eddd2188ff59.slice/crio-9657c34a37b65a52efa092305c647a36c3913ec928114b0f6bbfbb0427dcb40c WatchSource:0}: Error finding container 9657c34a37b65a52efa092305c647a36c3913ec928114b0f6bbfbb0427dcb40c: Status 404 returned error can't find the container with id 9657c34a37b65a52efa092305c647a36c3913ec928114b0f6bbfbb0427dcb40c Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.580680 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.600766 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.620552 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.640648 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.680216 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 
06:48:45.681004 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.700098 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.718628 4917 request.go:700] Waited for 1.946167958s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.721100 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.740440 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.760294 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.778029 4917 scope.go:117] "RemoveContainer" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.780784 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.823501 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25gpj\" (UniqueName: \"kubernetes.io/projected/b868698f-400d-43b7-ba5a-190668ef2e9e-kube-api-access-25gpj\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.837926 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/42365ba9-ba85-47ed-a93c-543cd2ce8d30-bound-sa-token\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.855033 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt9v9\" (UniqueName: \"kubernetes.io/projected/726838f6-bf51-40d1-83ac-c55ef671bf16-kube-api-access-qt9v9\") pod \"etcd-operator-b45778765-nbchd\" (UID: \"726838f6-bf51-40d1-83ac-c55ef671bf16\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.874158 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b868698f-400d-43b7-ba5a-190668ef2e9e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-t6src\" (UID: \"b868698f-400d-43b7-ba5a-190668ef2e9e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.896838 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s77lw\" (UniqueName: \"kubernetes.io/projected/d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd-kube-api-access-s77lw\") pod \"machine-config-operator-74547568cd-pws8t\" (UID: \"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.914450 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scqk7\" (UniqueName: 
\"kubernetes.io/projected/7c21e973-7d87-496c-81ba-1425ba599774-kube-api-access-scqk7\") pod \"console-f9d7485db-sgj9g\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.934173 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqc4x\" (UniqueName: \"kubernetes.io/projected/42365ba9-ba85-47ed-a93c-543cd2ce8d30-kube-api-access-zqc4x\") pod \"ingress-operator-5b745b69d9-pkg87\" (UID: \"42365ba9-ba85-47ed-a93c-543cd2ce8d30\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:45 crc kubenswrapper[4917]: I0318 06:48:45.979769 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.001649 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.025387 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.025627 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.025660 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7bf25c7-03d6-49a6-a981-8517bcadee69-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: \"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.025678 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf-metrics-tls\") pod \"dns-operator-744455d44c-nkz5k\" (UID: \"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.025745 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.025795 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:46.525777891 +0000 UTC m=+111.466932605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026566 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s9jw\" (UniqueName: \"kubernetes.io/projected/c08b892f-c635-444b-a090-fe8fe6aed47b-kube-api-access-4s9jw\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026618 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsp54\" (UniqueName: \"kubernetes.io/projected/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-kube-api-access-vsp54\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026658 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026686 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-default-certificate\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026708 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026743 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026809 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7bf25c7-03d6-49a6-a981-8517bcadee69-config\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: \"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026835 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-trusted-ca\") pod 
\"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026860 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-trusted-ca\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026888 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfbnx\" (UniqueName: \"kubernetes.io/projected/7f1745b0-b256-4dac-a5b4-a61d28697707-kube-api-access-sfbnx\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026921 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.026966 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6940780c-6aab-4109-bb07-78b1ec265159-machine-approver-tls\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028361 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6940780c-6aab-4109-bb07-78b1ec265159-config\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028388 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028442 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9512feb-84a2-46b7-8df1-a672b069d7bc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028473 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028497 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028523 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbkpf\" (UniqueName: \"kubernetes.io/projected/44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf-kube-api-access-qbkpf\") pod \"dns-operator-744455d44c-nkz5k\" (UID: \"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028549 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028617 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-metrics-certs\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028646 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f1745b0-b256-4dac-a5b4-a61d28697707-service-ca-bundle\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " 
pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028678 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-dir\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028746 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-certificates\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028915 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd52w\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-kube-api-access-fd52w\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028947 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.028972 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-stats-auth\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029000 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2xj\" (UniqueName: \"kubernetes.io/projected/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-kube-api-access-fp2xj\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029048 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c08b892f-c635-444b-a090-fe8fe6aed47b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029087 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6940780c-6aab-4109-bb07-78b1ec265159-auth-proxy-config\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029115 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gtlf\" (UniqueName: \"kubernetes.io/projected/6940780c-6aab-4109-bb07-78b1ec265159-kube-api-access-8gtlf\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") 
" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029140 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029196 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c08b892f-c635-444b-a090-fe8fe6aed47b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029214 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029230 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9512feb-84a2-46b7-8df1-a672b069d7bc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029247 
4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029273 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029290 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fmfk\" (UniqueName: \"kubernetes.io/projected/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-kube-api-access-4fmfk\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029388 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029425 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-config\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029452 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-policies\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029475 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-tls\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029499 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029554 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9xdsf\" 
(UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029653 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029696 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029719 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-serving-cert\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029740 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-bound-sa-token\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029791 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfzw4\" (UniqueName: \"kubernetes.io/projected/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-kube-api-access-cfzw4\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029819 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-config\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.029845 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7bf25c7-03d6-49a6-a981-8517bcadee69-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: \"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.045042 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.118722 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.132593 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.132920 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:46.632891606 +0000 UTC m=+111.574046320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133184 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133199 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c08b892f-c635-444b-a090-fe8fe6aed47b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133229 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6940780c-6aab-4109-bb07-78b1ec265159-auth-proxy-config\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133257 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6381840d-a689-48da-803a-06ed230e7a62-apiservice-cert\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133289 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92ks\" (UniqueName: \"kubernetes.io/projected/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-kube-api-access-q92ks\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133306 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-csi-data-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133365 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-signing-key\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133401 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44xlt\" (UniqueName: \"kubernetes.io/projected/5beb9e93-3da2-4bc9-b40a-24406435d739-kube-api-access-44xlt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrjf8\" (UID: \"5beb9e93-3da2-4bc9-b40a-24406435d739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133425 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g8td\" (UniqueName: \"kubernetes.io/projected/491de775-4251-427e-9065-9560931583b0-kube-api-access-9g8td\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: \"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133457 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gtlf\" (UniqueName: \"kubernetes.io/projected/6940780c-6aab-4109-bb07-78b1ec265159-kube-api-access-8gtlf\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133477 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: \"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133494 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-mountpoint-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133522 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnb59\" (UniqueName: \"kubernetes.io/projected/049d29dc-f129-448c-a11e-9ffbe7e44334-kube-api-access-gnb59\") pod \"ingress-canary-5dkf8\" (UID: \"049d29dc-f129-448c-a11e-9ffbe7e44334\") " pod="openshift-ingress-canary/ingress-canary-5dkf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133541 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/491de775-4251-427e-9065-9560931583b0-serving-cert\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: \"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133560 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133594 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c08b892f-c635-444b-a090-fe8fe6aed47b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133613 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d145faa-cd72-4149-b5fa-cd47f39e1c27-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133632 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhpkk\" (UniqueName: \"kubernetes.io/projected/1d145faa-cd72-4149-b5fa-cd47f39e1c27-kube-api-access-fhpkk\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133664 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9512feb-84a2-46b7-8df1-a672b069d7bc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133686 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133710 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133730 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-ready\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133759 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133782 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133813 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7196170-031f-4cbe-ac17-50996ecc6fe6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7rqbb\" (UID: \"d7196170-031f-4cbe-ac17-50996ecc6fe6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133852 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fmfk\" (UniqueName: \"kubernetes.io/projected/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-kube-api-access-4fmfk\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133916 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133940 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133958 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-policies\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.133984 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-config\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134019 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-tls\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134042 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/48203287-e84d-4f62-88dc-7d204cd4f257-certs\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134068 4917 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134093 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134118 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/48203287-e84d-4f62-88dc-7d204cd4f257-node-bootstrap-token\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134142 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6vzf\" (UniqueName: \"kubernetes.io/projected/48203287-e84d-4f62-88dc-7d204cd4f257-kube-api-access-t6vzf\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134177 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzx8j\" (UniqueName: \"kubernetes.io/projected/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-kube-api-access-xzx8j\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: 
\"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134211 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134241 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e1a866-4233-4bbf-a1ba-209ffd3a9980-metrics-tls\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134286 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134305 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-serving-cert\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134324 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134343 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-bound-sa-token\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134363 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5beb9e93-3da2-4bc9-b40a-24406435d739-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrjf8\" (UID: \"5beb9e93-3da2-4bc9-b40a-24406435d739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134388 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfzw4\" (UniqueName: \"kubernetes.io/projected/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-kube-api-access-cfzw4\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134426 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7bf25c7-03d6-49a6-a981-8517bcadee69-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: 
\"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134446 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-config\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134468 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39975bd4-f79a-464b-a4cc-65220c2ee731-secret-volume\") pod \"collect-profiles-29563605-jgd8c\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134485 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-socket-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134526 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134545 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134567 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfdwm\" (UniqueName: \"kubernetes.io/projected/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-kube-api-access-bfdwm\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134611 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7bf25c7-03d6-49a6-a981-8517bcadee69-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: \"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134629 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6381840d-a689-48da-803a-06ed230e7a62-tmpfs\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134648 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf-metrics-tls\") pod \"dns-operator-744455d44c-nkz5k\" (UID: \"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134669 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134696 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b8pf\" (UniqueName: \"kubernetes.io/projected/708177a0-d2d0-4e6c-9f75-3149faf99718-kube-api-access-7b8pf\") pod \"migrator-59844c95c7-jchjc\" (UID: \"708177a0-d2d0-4e6c-9f75-3149faf99718\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134743 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s9jw\" (UniqueName: \"kubernetes.io/projected/c08b892f-c635-444b-a090-fe8fe6aed47b-kube-api-access-4s9jw\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134768 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134788 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vsp54\" (UniqueName: \"kubernetes.io/projected/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-kube-api-access-vsp54\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134809 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4cb\" (UniqueName: \"kubernetes.io/projected/74e1a866-4233-4bbf-a1ba-209ffd3a9980-kube-api-access-dk4cb\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134827 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134857 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-default-certificate\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134873 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134886 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c08b892f-c635-444b-a090-fe8fe6aed47b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134891 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134957 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzbjl\" (UniqueName: \"kubernetes.io/projected/6381840d-a689-48da-803a-06ed230e7a62-kube-api-access-mzbjl\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.134988 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-signing-cabundle\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135015 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mwgrb\" (UniqueName: \"kubernetes.io/projected/07e47035-8f83-4334-b82d-c3fa26dfe8f9-kube-api-access-mwgrb\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135037 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d145faa-cd72-4149-b5fa-cd47f39e1c27-srv-cert\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135071 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-trusted-ca\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135089 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/07e47035-8f83-4334-b82d-c3fa26dfe8f9-srv-cert\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135109 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-registration-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 
06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135125 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-plugins-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135146 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7bf25c7-03d6-49a6-a981-8517bcadee69-config\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: \"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135168 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-trusted-ca\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135187 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-proxy-tls\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: \"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135207 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/049d29dc-f129-448c-a11e-9ffbe7e44334-cert\") pod \"ingress-canary-5dkf8\" (UID: \"049d29dc-f129-448c-a11e-9ffbe7e44334\") " 
pod="openshift-ingress-canary/ingress-canary-5dkf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135229 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfbnx\" (UniqueName: \"kubernetes.io/projected/7f1745b0-b256-4dac-a5b4-a61d28697707-kube-api-access-sfbnx\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135280 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135305 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6381840d-a689-48da-803a-06ed230e7a62-webhook-cert\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135329 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/309bbd8c-0c3c-45bf-be12-3fb27218938c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k4f7n\" (UID: \"309bbd8c-0c3c-45bf-be12-3fb27218938c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135353 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/6940780c-6aab-4109-bb07-78b1ec265159-machine-approver-tls\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135371 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6940780c-6aab-4109-bb07-78b1ec265159-config\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135390 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135424 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9512feb-84a2-46b7-8df1-a672b069d7bc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135442 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc 
kubenswrapper[4917]: I0318 06:48:46.135460 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135476 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwsr7\" (UniqueName: \"kubernetes.io/projected/309bbd8c-0c3c-45bf-be12-3fb27218938c-kube-api-access-fwsr7\") pod \"multus-admission-controller-857f4d67dd-k4f7n\" (UID: \"309bbd8c-0c3c-45bf-be12-3fb27218938c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135498 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39975bd4-f79a-464b-a4cc-65220c2ee731-config-volume\") pod \"collect-profiles-29563605-jgd8c\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135513 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjlr\" (UniqueName: \"kubernetes.io/projected/39975bd4-f79a-464b-a4cc-65220c2ee731-kube-api-access-mdjlr\") pod \"collect-profiles-29563605-jgd8c\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135533 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbkpf\" (UniqueName: 
\"kubernetes.io/projected/44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf-kube-api-access-qbkpf\") pod \"dns-operator-744455d44c-nkz5k\" (UID: \"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135550 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135570 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-metrics-certs\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135619 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/07e47035-8f83-4334-b82d-c3fa26dfe8f9-profile-collector-cert\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135636 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq72q\" (UniqueName: \"kubernetes.io/projected/e10ffc7c-e1b0-4b26-a30d-687ef976191b-kube-api-access-rq72q\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 
crc kubenswrapper[4917]: I0318 06:48:46.135652 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcs8m\" (UniqueName: \"kubernetes.io/projected/d7196170-031f-4cbe-ac17-50996ecc6fe6-kube-api-access-xcs8m\") pod \"package-server-manager-789f6589d5-7rqbb\" (UID: \"d7196170-031f-4cbe-ac17-50996ecc6fe6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135671 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f1745b0-b256-4dac-a5b4-a61d28697707-service-ca-bundle\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135690 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/491de775-4251-427e-9065-9560931583b0-config\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: \"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135708 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-dir\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135739 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-certificates\") pod 
\"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135759 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd52w\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-kube-api-access-fd52w\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135776 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74e1a866-4233-4bbf-a1ba-209ffd3a9980-config-volume\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135809 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135845 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-stats-auth\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135862 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p56wz\" (UniqueName: 
\"kubernetes.io/projected/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-kube-api-access-p56wz\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.135915 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2xj\" (UniqueName: \"kubernetes.io/projected/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-kube-api-access-fp2xj\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.136798 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.138056 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6940780c-6aab-4109-bb07-78b1ec265159-config\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.139770 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-trusted-ca\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.140270 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7bf25c7-03d6-49a6-a981-8517bcadee69-config\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: \"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.141755 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.142174 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9512feb-84a2-46b7-8df1-a672b069d7bc-ca-trust-extracted\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.142571 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-trusted-ca\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.143372 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.143708 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.144167 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f1745b0-b256-4dac-a5b4-a61d28697707-service-ca-bundle\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.144280 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-dir\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.145555 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.148856 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-certificates\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.150827 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf-metrics-tls\") pod \"dns-operator-744455d44c-nkz5k\" (UID: \"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.151106 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7bf25c7-03d6-49a6-a981-8517bcadee69-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: \"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.152139 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-config\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.152478 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.152538 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6940780c-6aab-4109-bb07-78b1ec265159-machine-approver-tls\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.153536 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:46.653515883 +0000 UTC m=+111.594670607 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.153644 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.153948 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.154546 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.154752 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-metrics-certs\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.155470 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.155748 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.155788 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.156806 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.157067 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.157302 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-policies\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.158823 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6940780c-6aab-4109-bb07-78b1ec265159-auth-proxy-config\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.159465 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-stats-auth\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.160060 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.160743 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9512feb-84a2-46b7-8df1-a672b069d7bc-installation-pull-secrets\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.160754 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-config\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.161521 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c08b892f-c635-444b-a090-fe8fe6aed47b-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.162726 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-tls\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.163568 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.164438 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.165060 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 
06:48:46.167032 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7f1745b0-b256-4dac-a5b4-a61d28697707-default-certificate\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.172090 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-serving-cert\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.173875 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.177190 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2xj\" (UniqueName: \"kubernetes.io/projected/c4745aae-34c6-474b-8d7b-c7e7ef9e43cb-kube-api-access-fp2xj\") pod \"openshift-config-operator-7777fb866f-h4w6d\" (UID: \"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.183198 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.190091 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gtlf\" (UniqueName: \"kubernetes.io/projected/6940780c-6aab-4109-bb07-78b1ec265159-kube-api-access-8gtlf\") pod \"machine-approver-56656f9798-bvz7t\" (UID: \"6940780c-6aab-4109-bb07-78b1ec265159\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.197671 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.220629 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbkpf\" (UniqueName: \"kubernetes.io/projected/44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf-kube-api-access-qbkpf\") pod \"dns-operator-744455d44c-nkz5k\" (UID: \"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf\") " pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.236830 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.237062 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:46.737033523 +0000 UTC m=+111.678188237 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237290 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d145faa-cd72-4149-b5fa-cd47f39e1c27-srv-cert\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237321 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwgrb\" (UniqueName: \"kubernetes.io/projected/07e47035-8f83-4334-b82d-c3fa26dfe8f9-kube-api-access-mwgrb\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237343 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/07e47035-8f83-4334-b82d-c3fa26dfe8f9-srv-cert\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237362 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-registration-dir\") pod \"csi-hostpathplugin-874w9\" 
(UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237379 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-plugins-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237399 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-proxy-tls\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: \"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237418 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/049d29dc-f129-448c-a11e-9ffbe7e44334-cert\") pod \"ingress-canary-5dkf8\" (UID: \"049d29dc-f129-448c-a11e-9ffbe7e44334\") " pod="openshift-ingress-canary/ingress-canary-5dkf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237443 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6381840d-a689-48da-803a-06ed230e7a62-webhook-cert\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237463 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/309bbd8c-0c3c-45bf-be12-3fb27218938c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k4f7n\" 
(UID: \"309bbd8c-0c3c-45bf-be12-3fb27218938c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237498 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwsr7\" (UniqueName: \"kubernetes.io/projected/309bbd8c-0c3c-45bf-be12-3fb27218938c-kube-api-access-fwsr7\") pod \"multus-admission-controller-857f4d67dd-k4f7n\" (UID: \"309bbd8c-0c3c-45bf-be12-3fb27218938c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237618 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjlr\" (UniqueName: \"kubernetes.io/projected/39975bd4-f79a-464b-a4cc-65220c2ee731-kube-api-access-mdjlr\") pod \"collect-profiles-29563605-jgd8c\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237646 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39975bd4-f79a-464b-a4cc-65220c2ee731-config-volume\") pod \"collect-profiles-29563605-jgd8c\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237666 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq72q\" (UniqueName: \"kubernetes.io/projected/e10ffc7c-e1b0-4b26-a30d-687ef976191b-kube-api-access-rq72q\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237683 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcs8m\" 
(UniqueName: \"kubernetes.io/projected/d7196170-031f-4cbe-ac17-50996ecc6fe6-kube-api-access-xcs8m\") pod \"package-server-manager-789f6589d5-7rqbb\" (UID: \"d7196170-031f-4cbe-ac17-50996ecc6fe6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237702 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/07e47035-8f83-4334-b82d-c3fa26dfe8f9-profile-collector-cert\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237720 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/491de775-4251-427e-9065-9560931583b0-config\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: \"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237756 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74e1a866-4233-4bbf-a1ba-209ffd3a9980-config-volume\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237776 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p56wz\" (UniqueName: \"kubernetes.io/projected/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-kube-api-access-p56wz\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237797 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6381840d-a689-48da-803a-06ed230e7a62-apiservice-cert\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237814 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-csi-data-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237833 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-signing-key\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237852 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q92ks\" (UniqueName: \"kubernetes.io/projected/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-kube-api-access-q92ks\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237871 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44xlt\" (UniqueName: \"kubernetes.io/projected/5beb9e93-3da2-4bc9-b40a-24406435d739-kube-api-access-44xlt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrjf8\" (UID: \"5beb9e93-3da2-4bc9-b40a-24406435d739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" 
Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237889 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g8td\" (UniqueName: \"kubernetes.io/projected/491de775-4251-427e-9065-9560931583b0-kube-api-access-9g8td\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: \"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237951 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: \"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237971 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-mountpoint-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.237992 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnb59\" (UniqueName: \"kubernetes.io/projected/049d29dc-f129-448c-a11e-9ffbe7e44334-kube-api-access-gnb59\") pod \"ingress-canary-5dkf8\" (UID: \"049d29dc-f129-448c-a11e-9ffbe7e44334\") " pod="openshift-ingress-canary/ingress-canary-5dkf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238010 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/491de775-4251-427e-9065-9560931583b0-serving-cert\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: 
\"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238032 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d145faa-cd72-4149-b5fa-cd47f39e1c27-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238051 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhpkk\" (UniqueName: \"kubernetes.io/projected/1d145faa-cd72-4149-b5fa-cd47f39e1c27-kube-api-access-fhpkk\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238211 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-ready\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238262 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238282 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d7196170-031f-4cbe-ac17-50996ecc6fe6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7rqbb\" (UID: \"d7196170-031f-4cbe-ac17-50996ecc6fe6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238310 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238333 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/48203287-e84d-4f62-88dc-7d204cd4f257-certs\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238350 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238366 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/48203287-e84d-4f62-88dc-7d204cd4f257-node-bootstrap-token\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238382 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6vzf\" (UniqueName: \"kubernetes.io/projected/48203287-e84d-4f62-88dc-7d204cd4f257-kube-api-access-t6vzf\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238401 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzx8j\" (UniqueName: \"kubernetes.io/projected/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-kube-api-access-xzx8j\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: \"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238417 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e1a866-4233-4bbf-a1ba-209ffd3a9980-metrics-tls\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238439 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5beb9e93-3da2-4bc9-b40a-24406435d739-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrjf8\" (UID: \"5beb9e93-3da2-4bc9-b40a-24406435d739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238481 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39975bd4-f79a-464b-a4cc-65220c2ee731-secret-volume\") pod \"collect-profiles-29563605-jgd8c\" (UID: 
\"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238499 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-socket-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238519 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238536 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfdwm\" (UniqueName: \"kubernetes.io/projected/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-kube-api-access-bfdwm\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238559 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6381840d-a689-48da-803a-06ed230e7a62-tmpfs\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238619 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b8pf\" (UniqueName: 
\"kubernetes.io/projected/708177a0-d2d0-4e6c-9f75-3149faf99718-kube-api-access-7b8pf\") pod \"migrator-59844c95c7-jchjc\" (UID: \"708177a0-d2d0-4e6c-9f75-3149faf99718\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238658 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4cb\" (UniqueName: \"kubernetes.io/projected/74e1a866-4233-4bbf-a1ba-209ffd3a9980-kube-api-access-dk4cb\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238673 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238698 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzbjl\" (UniqueName: \"kubernetes.io/projected/6381840d-a689-48da-803a-06ed230e7a62-kube-api-access-mzbjl\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.238715 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-signing-cabundle\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.241814 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-ready\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.242179 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/6381840d-a689-48da-803a-06ed230e7a62-tmpfs\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.242813 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.243689 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.244791 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8800cf27-b6c9-4859-ac70-ee3fc7b774fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pzpt5\" (UID: \"8800cf27-b6c9-4859-ac70-ee3fc7b774fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.244934 4917 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:46.744910442 +0000 UTC m=+111.686065156 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.245078 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-socket-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.246954 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-registration-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.249485 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39975bd4-f79a-464b-a4cc-65220c2ee731-secret-volume\") pod \"collect-profiles-29563605-jgd8c\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.250816 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6381840d-a689-48da-803a-06ed230e7a62-webhook-cert\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.251344 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/48203287-e84d-4f62-88dc-7d204cd4f257-node-bootstrap-token\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.251410 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-mountpoint-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.251419 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.252075 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d7196170-031f-4cbe-ac17-50996ecc6fe6-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7rqbb\" (UID: \"d7196170-031f-4cbe-ac17-50996ecc6fe6\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.252222 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: \"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.255524 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/491de775-4251-427e-9065-9560931583b0-serving-cert\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: \"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.255956 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39975bd4-f79a-464b-a4cc-65220c2ee731-config-volume\") pod \"collect-profiles-29563605-jgd8c\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.256077 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-csi-data-dir\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.256103 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e10ffc7c-e1b0-4b26-a30d-687ef976191b-plugins-dir\") pod 
\"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.256519 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/5beb9e93-3da2-4bc9-b40a-24406435d739-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrjf8\" (UID: \"5beb9e93-3da2-4bc9-b40a-24406435d739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.256633 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74e1a866-4233-4bbf-a1ba-209ffd3a9980-config-volume\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.257084 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.258199 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/491de775-4251-427e-9065-9560931583b0-config\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: \"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.258555 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-signing-cabundle\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.260774 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1d145faa-cd72-4149-b5fa-cd47f39e1c27-srv-cert\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.261838 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-proxy-tls\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: \"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.263387 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/07e47035-8f83-4334-b82d-c3fa26dfe8f9-profile-collector-cert\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.267911 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/07e47035-8f83-4334-b82d-c3fa26dfe8f9-srv-cert\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.268965 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/74e1a866-4233-4bbf-a1ba-209ffd3a9980-metrics-tls\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.269298 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6381840d-a689-48da-803a-06ed230e7a62-apiservice-cert\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.269337 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/049d29dc-f129-448c-a11e-9ffbe7e44334-cert\") pod \"ingress-canary-5dkf8\" (UID: \"049d29dc-f129-448c-a11e-9ffbe7e44334\") " pod="openshift-ingress-canary/ingress-canary-5dkf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.270498 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16f3668e-a1a2-4c57-b00a-f10a3bbe15a5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-8wrcr\" (UID: \"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.270640 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/48203287-e84d-4f62-88dc-7d204cd4f257-certs\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.271276 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-signing-key\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.277331 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/309bbd8c-0c3c-45bf-be12-3fb27218938c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-k4f7n\" (UID: \"309bbd8c-0c3c-45bf-be12-3fb27218938c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.277925 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1d145faa-cd72-4149-b5fa-cd47f39e1c27-profile-collector-cert\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.291681 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfzw4\" (UniqueName: \"kubernetes.io/projected/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-kube-api-access-cfzw4\") pod \"oauth-openshift-558db77b4-9xdsf\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.303910 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-bound-sa-token\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.322930 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7bf25c7-03d6-49a6-a981-8517bcadee69-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-p97tv\" (UID: \"a7bf25c7-03d6-49a6-a981-8517bcadee69\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.347051 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.347635 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:46.847617687 +0000 UTC m=+111.788772401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.350177 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fmfk\" (UniqueName: \"kubernetes.io/projected/a1a7d05d-7034-44fe-bfa6-ea31a61a9df5-kube-api-access-4fmfk\") pod \"kube-storage-version-migrator-operator-b67b599dd-lwhm6\" (UID: \"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.361201 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f086875-421c-4218-91f3-1349dcc3247d" containerID="380f989d35d6cb85eab252f7a87d5f46b6cca4b6a6ba0028e4f6a55b74fa15ef" exitCode=0 Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.361260 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" event={"ID":"4f086875-421c-4218-91f3-1349dcc3247d","Type":"ContainerDied","Data":"380f989d35d6cb85eab252f7a87d5f46b6cca4b6a6ba0028e4f6a55b74fa15ef"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.371165 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfbnx\" (UniqueName: \"kubernetes.io/projected/7f1745b0-b256-4dac-a5b4-a61d28697707-kube-api-access-sfbnx\") pod \"router-default-5444994796-66xqj\" (UID: \"7f1745b0-b256-4dac-a5b4-a61d28697707\") " pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.383466 4917 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.383509 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsp54\" (UniqueName: \"kubernetes.io/projected/2c47e8a8-91a9-47dc-b379-5dcbaae96a7b-kube-api-access-vsp54\") pod \"console-operator-58897d9998-6x6mh\" (UID: \"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b\") " pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.395816 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sgj9g"] Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.398076 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.400291 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.400906 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.406541 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s9jw\" (UniqueName: \"kubernetes.io/projected/c08b892f-c635-444b-a090-fe8fe6aed47b-kube-api-access-4s9jw\") pod \"openshift-controller-manager-operator-756b6f6bc6-zxffn\" (UID: \"c08b892f-c635-444b-a090-fe8fe6aed47b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: W0318 
06:48:46.421304 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c21e973_7d87_496c_81ba_1425ba599774.slice/crio-c34f8ae044f61a88bce171ce491824262324307db188e40ce46ad8a5ecb43ff8 WatchSource:0}: Error finding container c34f8ae044f61a88bce171ce491824262324307db188e40ce46ad8a5ecb43ff8: Status 404 returned error can't find the container with id c34f8ae044f61a88bce171ce491824262324307db188e40ce46ad8a5ecb43ff8 Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.422107 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd52w\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-kube-api-access-fd52w\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.422455 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" event={"ID":"230964d0-ab50-4d70-8379-3250d17aae00","Type":"ContainerStarted","Data":"bc53836b4447d76cb322a3620b529901be333371ddf5c697273cd301534d5fdb"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.422497 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" event={"ID":"230964d0-ab50-4d70-8379-3250d17aae00","Type":"ContainerStarted","Data":"fa0c1412310e362ac4cc9730e11aad5440eee14f991f9af1aa61f304da8c7bdf"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.432913 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" event={"ID":"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7","Type":"ContainerStarted","Data":"1c4821ad0e252d5cf1401921659afb6d53b27aa7c730e63efde6a9f410f1d99f"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 
06:48:46.432961 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" event={"ID":"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7","Type":"ContainerStarted","Data":"23eba14cd724d22307b1a9a8d38a3aef0f93da6f1e01ce0eac3ec50df87c3817"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.432971 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" event={"ID":"9d08595e-c5f0-49cd-bc6c-5e248bcc76e7","Type":"ContainerStarted","Data":"b02f79872b18f3aaff1e5ace61e4eb206dc77ccb186c36ea9effecca862ba472"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.441047 4917 generic.go:334] "Generic (PLEG): container finished" podID="4fb3fe10-f693-4e74-b72a-f2b10dac3580" containerID="bce648d67b336490e5612a226154890fb8c1c9cd06d92441eebed4dc8097fad2" exitCode=0 Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.441097 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" event={"ID":"4fb3fe10-f693-4e74-b72a-f2b10dac3580","Type":"ContainerDied","Data":"bce648d67b336490e5612a226154890fb8c1c9cd06d92441eebed4dc8097fad2"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.441145 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.443609 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" event={"ID":"76c2512e-6f87-4731-a42f-eddd2188ff59","Type":"ContainerStarted","Data":"db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.443631 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" event={"ID":"76c2512e-6f87-4731-a42f-eddd2188ff59","Type":"ContainerStarted","Data":"9657c34a37b65a52efa092305c647a36c3913ec928114b0f6bbfbb0427dcb40c"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.443990 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.447891 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.448346 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.450141 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" event={"ID":"65334d30-4340-47c4-b333-1d41d0c2869b","Type":"ContainerStarted","Data":"c7f6f94b841850aea0e8a041b9ff2f71075231fb07adb34c465e17be02f93fff"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.450171 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" event={"ID":"65334d30-4340-47c4-b333-1d41d0c2869b","Type":"ContainerStarted","Data":"2d577756f17929b178c8a0296549c49692d7f453d3578430e85738b93b5e2dbc"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.450181 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" event={"ID":"65334d30-4340-47c4-b333-1d41d0c2869b","Type":"ContainerStarted","Data":"19c86a6a262dc562a88c49b06f351e1f38f9d5b92d8774d6841d93390a5781d2"} Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.450793 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:46.950781033 +0000 UTC m=+111.891935747 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.460512 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zk4nl" event={"ID":"f4c8c2e4-554f-423e-81a2-cd9f63eb7250","Type":"ContainerStarted","Data":"1c075134343a97a2d6914bfdbf50d9e9bad0341f190623f83cddb5be97cbb549"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.460557 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nbchd"] Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.460573 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zk4nl" event={"ID":"f4c8c2e4-554f-423e-81a2-cd9f63eb7250","Type":"ContainerStarted","Data":"998b7e0aad5d2f2388ca68f683a8bf3d3fa46a56a4f95c5512d80cce3215f1ff"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.460659 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zk4nl" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.460696 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.462369 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhpkk\" (UniqueName: \"kubernetes.io/projected/1d145faa-cd72-4149-b5fa-cd47f39e1c27-kube-api-access-fhpkk\") pod \"olm-operator-6b444d44fb-zhd5k\" (UID: \"1d145faa-cd72-4149-b5fa-cd47f39e1c27\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.467965 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.473560 4917 patch_prober.go:28] interesting pod/downloads-7954f5f757-zk4nl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.476075 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zk4nl" podUID="f4c8c2e4-554f-423e-81a2-cd9f63eb7250" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.476885 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" event={"ID":"cb1793e3-4625-4f46-8be2-d75629f46371","Type":"ContainerStarted","Data":"f69a9f028e36ae4e0ea6c3aec9fba467c7cd6da72cf93e485a8a4d08f60f5f39"} Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.476936 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:46 crc kubenswrapper[4917]: 
I0318 06:48:46.478559 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src"] Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.483877 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfdwm\" (UniqueName: \"kubernetes.io/projected/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-kube-api-access-bfdwm\") pod \"marketplace-operator-79b997595-8v8rg\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.492162 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.495644 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.506198 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.506460 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92ks\" (UniqueName: \"kubernetes.io/projected/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-kube-api-access-q92ks\") pod \"cni-sysctl-allowlist-ds-plhlj\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") " pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.511881 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.517798 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.521033 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6vzf\" (UniqueName: \"kubernetes.io/projected/48203287-e84d-4f62-88dc-7d204cd4f257-kube-api-access-t6vzf\") pod \"machine-config-server-qtcrk\" (UID: \"48203287-e84d-4f62-88dc-7d204cd4f257\") " pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.525050 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" Mar 18 06:48:46 crc kubenswrapper[4917]: W0318 06:48:46.526057 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb868698f_400d_43b7_ba5a_190668ef2e9e.slice/crio-bbbae7b5d3c3f13f330f0200b305430fd146b174e712f8e0e91fcd00e3484098 WatchSource:0}: Error finding container bbbae7b5d3c3f13f330f0200b305430fd146b174e712f8e0e91fcd00e3484098: Status 404 returned error can't find the container with id bbbae7b5d3c3f13f330f0200b305430fd146b174e712f8e0e91fcd00e3484098 Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.546512 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwgrb\" (UniqueName: \"kubernetes.io/projected/07e47035-8f83-4334-b82d-c3fa26dfe8f9-kube-api-access-mwgrb\") pod \"catalog-operator-68c6474976-spgtc\" (UID: \"07e47035-8f83-4334-b82d-c3fa26dfe8f9\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.549441 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.549622 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.04960248 +0000 UTC m=+111.990757194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.549706 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.551875 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.051861181 +0000 UTC m=+111.993015895 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.563988 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.566073 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjlr\" (UniqueName: \"kubernetes.io/projected/39975bd4-f79a-464b-a4cc-65220c2ee731-kube-api-access-mdjlr\") pod \"collect-profiles-29563605-jgd8c\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.576736 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.587865 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.589110 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44xlt\" (UniqueName: \"kubernetes.io/projected/5beb9e93-3da2-4bc9-b40a-24406435d739-kube-api-access-44xlt\") pod \"control-plane-machine-set-operator-78cbb6b69f-jrjf8\" (UID: \"5beb9e93-3da2-4bc9-b40a-24406435d739\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.591647 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.599418 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.629523 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzx8j\" (UniqueName: \"kubernetes.io/projected/4a676eea-cd1e-4098-95ec-48cdeff1a5c9-kube-api-access-xzx8j\") pod \"machine-config-controller-84d6567774-n67xx\" (UID: \"4a676eea-cd1e-4098-95ec-48cdeff1a5c9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.636955 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g8td\" (UniqueName: \"kubernetes.io/projected/491de775-4251-427e-9065-9560931583b0-kube-api-access-9g8td\") pod \"service-ca-operator-777779d784-wgdbd\" (UID: \"491de775-4251-427e-9065-9560931583b0\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.637830 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d"] Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.652656 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.653992 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.153975362 +0000 UTC m=+112.095130076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.663965 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq72q\" (UniqueName: \"kubernetes.io/projected/e10ffc7c-e1b0-4b26-a30d-687ef976191b-kube-api-access-rq72q\") pod \"csi-hostpathplugin-874w9\" (UID: \"e10ffc7c-e1b0-4b26-a30d-687ef976191b\") " pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.679043 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-874w9" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.682331 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnb59\" (UniqueName: \"kubernetes.io/projected/049d29dc-f129-448c-a11e-9ffbe7e44334-kube-api-access-gnb59\") pod \"ingress-canary-5dkf8\" (UID: \"049d29dc-f129-448c-a11e-9ffbe7e44334\") " pod="openshift-ingress-canary/ingress-canary-5dkf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.682618 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzbjl\" (UniqueName: \"kubernetes.io/projected/6381840d-a689-48da-803a-06ed230e7a62-kube-api-access-mzbjl\") pod \"packageserver-d55dfcdfc-6s8jm\" (UID: \"6381840d-a689-48da-803a-06ed230e7a62\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.697020 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwsr7\" (UniqueName: \"kubernetes.io/projected/309bbd8c-0c3c-45bf-be12-3fb27218938c-kube-api-access-fwsr7\") pod \"multus-admission-controller-857f4d67dd-k4f7n\" (UID: \"309bbd8c-0c3c-45bf-be12-3fb27218938c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.702805 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qtcrk" Mar 18 06:48:46 crc kubenswrapper[4917]: W0318 06:48:46.729043 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f1745b0_b256_4dac_a5b4_a61d28697707.slice/crio-f9fef350d60b4ed84a4f684e432460990917c231cb3699edf31bdfc36caf3da5 WatchSource:0}: Error finding container f9fef350d60b4ed84a4f684e432460990917c231cb3699edf31bdfc36caf3da5: Status 404 returned error can't find the container with id f9fef350d60b4ed84a4f684e432460990917c231cb3699edf31bdfc36caf3da5 Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.741800 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p56wz\" (UniqueName: \"kubernetes.io/projected/0ebb2697-baf2-4d66-b4c0-6b1b8bef646f-kube-api-access-p56wz\") pod \"service-ca-9c57cc56f-gcd99\" (UID: \"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f\") " pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.750200 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b8pf\" (UniqueName: \"kubernetes.io/projected/708177a0-d2d0-4e6c-9f75-3149faf99718-kube-api-access-7b8pf\") pod \"migrator-59844c95c7-jchjc\" (UID: \"708177a0-d2d0-4e6c-9f75-3149faf99718\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.753689 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.754014 4917 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.254004696 +0000 UTC m=+112.195159410 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.778020 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t"] Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.788379 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4cb\" (UniqueName: \"kubernetes.io/projected/74e1a866-4233-4bbf-a1ba-209ffd3a9980-kube-api-access-dk4cb\") pod \"dns-default-d6cp4\" (UID: \"74e1a866-4233-4bbf-a1ba-209ffd3a9980\") " pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.789043 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcs8m\" (UniqueName: \"kubernetes.io/projected/d7196170-031f-4cbe-ac17-50996ecc6fe6-kube-api-access-xcs8m\") pod \"package-server-manager-789f6589d5-7rqbb\" (UID: \"d7196170-031f-4cbe-ac17-50996ecc6fe6\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.829471 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87"] Mar 18 06:48:46 crc kubenswrapper[4917]: 
I0318 06:48:46.837819 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.837843 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.838732 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.838723195 podStartE2EDuration="1.838723195s" podCreationTimestamp="2026-03-18 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:46.836534795 +0000 UTC m=+111.777689519" watchObservedRunningTime="2026-03-18 06:48:46.838723195 +0000 UTC m=+111.779877909" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.854866 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.856121 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.856982 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.857493 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.862562 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.362494133 +0000 UTC m=+112.303648847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.872049 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5dkf8" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.905036 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.912677 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.917217 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.958965 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:46 crc kubenswrapper[4917]: E0318 06:48:46.959483 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.459470948 +0000 UTC m=+112.400625662 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:46 crc kubenswrapper[4917]: I0318 06:48:46.986013 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.027667 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.060056 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.060502 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.560477335 +0000 UTC m=+112.501632049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.060709 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.061095 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.561086829 +0000 UTC m=+112.502241543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.164314 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.164496 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.664470629 +0000 UTC m=+112.605625343 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.164889 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.165183 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.665170375 +0000 UTC m=+112.606325089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.168218 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfbc2" podStartSLOduration=45.168197783 podStartE2EDuration="45.168197783s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:47.106477476 +0000 UTC m=+112.047632180" watchObservedRunningTime="2026-03-18 06:48:47.168197783 +0000 UTC m=+112.109352487" Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.265423 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.265550 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.765533787 +0000 UTC m=+112.706688501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.265682 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.265937 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.765929486 +0000 UTC m=+112.707084200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.366714 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.366838 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.8668242 +0000 UTC m=+112.807978914 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.366961 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.367204 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.867198248 +0000 UTC m=+112.808352962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.468198 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.468329 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.968314258 +0000 UTC m=+112.909468972 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.468398 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.468676 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:47.968667746 +0000 UTC m=+112.909822460 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.569262 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.569386 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.069375545 +0000 UTC m=+113.010530259 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.569419 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.569652 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.069645641 +0000 UTC m=+113.010800355 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.670406 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.670548 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.170529585 +0000 UTC m=+113.111684299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.670611 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.670913 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.170905073 +0000 UTC m=+113.112059787 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.773066 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.773706 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.273691211 +0000 UTC m=+113.214845925 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.848717 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" event={"ID":"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb","Type":"ContainerStarted","Data":"55bf7412003fed10b4f1042c5ba4d3bcbdc8c38f18c5ef85b1d8de1e14b3607c"} Mar 18 06:48:47 crc kubenswrapper[4917]: W0318 06:48:47.855713 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48203287_e84d_4f62_88dc_7d204cd4f257.slice/crio-72d05614fffe56afb031ec44d794f3f387901844feca46e5f7aba719ed2c55ff WatchSource:0}: Error finding container 72d05614fffe56afb031ec44d794f3f387901844feca46e5f7aba719ed2c55ff: Status 404 returned error can't find the container with id 72d05614fffe56afb031ec44d794f3f387901844feca46e5f7aba719ed2c55ff Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.885933 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.886344 4917 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.386328951 +0000 UTC m=+113.327483655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.914845 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" event={"ID":"b868698f-400d-43b7-ba5a-190668ef2e9e","Type":"ContainerStarted","Data":"bbbae7b5d3c3f13f330f0200b305430fd146b174e712f8e0e91fcd00e3484098"} Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.916332 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv"] Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.917990 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" event={"ID":"6940780c-6aab-4109-bb07-78b1ec265159","Type":"ContainerStarted","Data":"7ba21f30bbcda86e8e4cdc2a02e747040353c04f601334c7b291e64dee431f18"} Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.923514 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" event={"ID":"42365ba9-ba85-47ed-a93c-543cd2ce8d30","Type":"ContainerStarted","Data":"15d909a096c86ea0a5e8b76caa66279135102dc3e77065c0cd2226bee2e03dd9"} Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.925806 
4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" event={"ID":"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc","Type":"ContainerStarted","Data":"af255ea577a070a5f34174a38e4d0a05fd92230bc6d5ad5d94f8a5ea265d0690"} Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.926538 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-66xqj" event={"ID":"7f1745b0-b256-4dac-a5b4-a61d28697707","Type":"ContainerStarted","Data":"f9fef350d60b4ed84a4f684e432460990917c231cb3699edf31bdfc36caf3da5"} Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.927254 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" event={"ID":"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd","Type":"ContainerStarted","Data":"2e3a419e94f56fefc1dcc143796e8bb8fa6b1b6912ce9ec87eb430cba6b81322"} Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.928457 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" event={"ID":"726838f6-bf51-40d1-83ac-c55ef671bf16","Type":"ContainerStarted","Data":"63b291342cd42e66ade17f84563a1207474c4d787b13b68d1d8303390210ee97"} Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.982560 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6x6mh"] Mar 18 06:48:47 crc kubenswrapper[4917]: I0318 06:48:47.987243 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:47 crc kubenswrapper[4917]: E0318 06:48:47.987763 4917 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.487748747 +0000 UTC m=+113.428903461 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:47.999981 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sgj9g" event={"ID":"7c21e973-7d87-496c-81ba-1425ba599774","Type":"ContainerStarted","Data":"fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3"} Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.000233 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sgj9g" event={"ID":"7c21e973-7d87-496c-81ba-1425ba599774","Type":"ContainerStarted","Data":"c34f8ae044f61a88bce171ce491824262324307db188e40ce46ad8a5ecb43ff8"} Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.002516 4917 patch_prober.go:28] interesting pod/downloads-7954f5f757-zk4nl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.002562 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zk4nl" podUID="f4c8c2e4-554f-423e-81a2-cd9f63eb7250" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.090437 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.115346 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.615328105 +0000 UTC m=+113.556482819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.192011 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.192392 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.692376579 +0000 UTC m=+113.633531293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.206040 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9xdsf"] Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.235856 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5"] Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.294411 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.295089 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.795066244 +0000 UTC m=+113.736220958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.398014 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.398417 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:48.898403883 +0000 UTC m=+113.839558597 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.503797 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.504347 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.004335742 +0000 UTC m=+113.945490456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.534112 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nkz5k"] Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.560686 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6"] Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.605981 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.606848 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.106831252 +0000 UTC m=+114.047985966 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.623412 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-sbj9l" podStartSLOduration=46.623391607 podStartE2EDuration="46.623391607s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:48.590333958 +0000 UTC m=+113.531488692" watchObservedRunningTime="2026-03-18 06:48:48.623391607 +0000 UTC m=+113.564546321" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.659144 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mgnbj" podStartSLOduration=46.659117175 podStartE2EDuration="46.659117175s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:48.65799742 +0000 UTC m=+113.599152134" watchObservedRunningTime="2026-03-18 06:48:48.659117175 +0000 UTC m=+113.600271889" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.680818 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" podStartSLOduration=46.680799947 podStartE2EDuration="46.680799947s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:48.679265792 +0000 UTC m=+113.620420506" watchObservedRunningTime="2026-03-18 06:48:48.680799947 +0000 UTC m=+113.621954661" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.708487 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.708975 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.208954194 +0000 UTC m=+114.150108908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.743762 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" podStartSLOduration=45.742752839 podStartE2EDuration="45.742752839s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:48.741276125 +0000 UTC m=+113.682430839" watchObservedRunningTime="2026-03-18 06:48:48.742752839 +0000 UTC m=+113.683907553" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.780092 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-h5mvm" podStartSLOduration=45.780075534 podStartE2EDuration="45.780075534s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:48.777311141 +0000 UTC m=+113.718465855" watchObservedRunningTime="2026-03-18 06:48:48.780075534 +0000 UTC m=+113.721230248" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.816148 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.816295 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.316266473 +0000 UTC m=+114.257421187 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.816749 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.816779 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 06:48:48.817096 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.317085191 +0000 UTC m=+114.258239905 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.853899 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/63affb66-9eb5-40ac-9b60-6ff9af511233-metrics-certs\") pod \"network-metrics-daemon-ww4d6\" (UID: \"63affb66-9eb5-40ac-9b60-6ff9af511233\") " pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.882498 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zk4nl" podStartSLOduration=46.882478862 podStartE2EDuration="46.882478862s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:48.871887322 +0000 UTC m=+113.813042036" watchObservedRunningTime="2026-03-18 06:48:48.882478862 +0000 UTC m=+113.823633576" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.920996 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:48 crc kubenswrapper[4917]: E0318 
06:48:48.921360 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.421344331 +0000 UTC m=+114.362499045 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.962812 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-sgj9g" podStartSLOduration=46.96278975 podStartE2EDuration="46.96278975s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:48.956003546 +0000 UTC m=+113.897158260" watchObservedRunningTime="2026-03-18 06:48:48.96278975 +0000 UTC m=+113.903944464" Mar 18 06:48:48 crc kubenswrapper[4917]: I0318 06:48:48.968324 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=18.968302505 podStartE2EDuration="18.968302505s" podCreationTimestamp="2026-03-18 06:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:48.906982936 +0000 UTC m=+113.848137650" watchObservedRunningTime="2026-03-18 06:48:48.968302505 +0000 UTC m=+113.909457219" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 
06:48:49.009482 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ww4d6" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.029710 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.030088 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.530075553 +0000 UTC m=+114.471230267 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.045846 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" event={"ID":"4f086875-421c-4218-91f3-1349dcc3247d","Type":"ContainerStarted","Data":"dd0085ead73f00fb81dfc43667def3273785702a02fda29f1e69a9902957ac98"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.069226 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qtcrk" 
event={"ID":"48203287-e84d-4f62-88dc-7d204cd4f257","Type":"ContainerStarted","Data":"e723f05d4289c1bc6d2e8e8baba18cb537c8e551ae2142a01021bb2529bee35c"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.069266 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qtcrk" event={"ID":"48203287-e84d-4f62-88dc-7d204cd4f257","Type":"ContainerStarted","Data":"72d05614fffe56afb031ec44d794f3f387901844feca46e5f7aba719ed2c55ff"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.072628 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" event={"ID":"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9","Type":"ContainerStarted","Data":"898a49a448d76e155307493e73ccf10bd1f76e76157c07e02ba1c6c054073f97"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.131428 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.132435 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.63241551 +0000 UTC m=+114.573570224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.133859 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" event={"ID":"b868698f-400d-43b7-ba5a-190668ef2e9e","Type":"ContainerStarted","Data":"c05633babc4236f4a8f9ff1011d264af1b1b20f363b389b269556ac66f16f7d2"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.143960 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" podStartSLOduration=46.143945361 podStartE2EDuration="46.143945361s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:49.13329537 +0000 UTC m=+114.074450084" watchObservedRunningTime="2026-03-18 06:48:49.143945361 +0000 UTC m=+114.085100075" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.150007 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" event={"ID":"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd","Type":"ContainerStarted","Data":"4166f9635a6d9ac19271b91f6286050cde4bcc20bf8a32bcdad23cf69c952fdd"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.176403 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" 
event={"ID":"42365ba9-ba85-47ed-a93c-543cd2ce8d30","Type":"ContainerStarted","Data":"d86e9d30c91e92ff90f10158c8d7c4f33813b6e3bad1f61dd66f2584824c6ed6"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.185970 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" event={"ID":"8800cf27-b6c9-4859-ac70-ee3fc7b774fc","Type":"ContainerStarted","Data":"0c98466b1327daf6496b54a76e522aee43f43092d40d684834f5c36a3f2b85bc"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.206782 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" event={"ID":"a7bf25c7-03d6-49a6-a981-8517bcadee69","Type":"ContainerStarted","Data":"2961151f555dfa9c544f5d199ba752f5030d070dde2ff9c2f06ba9dd00673b79"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.225258 4917 generic.go:334] "Generic (PLEG): container finished" podID="c4745aae-34c6-474b-8d7b-c7e7ef9e43cb" containerID="4a1a9ec44f80db9d947aa058c2a0a39e5f7d712c79b2c0cd0280bd3ef9853d93" exitCode=0 Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.225317 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" event={"ID":"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb","Type":"ContainerDied","Data":"4a1a9ec44f80db9d947aa058c2a0a39e5f7d712c79b2c0cd0280bd3ef9853d93"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.241423 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.242089 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-t6src" podStartSLOduration=47.242074303 podStartE2EDuration="47.242074303s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:49.240799974 +0000 UTC m=+114.181954688" watchObservedRunningTime="2026-03-18 06:48:49.242074303 +0000 UTC m=+114.183229017" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.242310 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qtcrk" podStartSLOduration=6.242306848 podStartE2EDuration="6.242306848s" podCreationTimestamp="2026-03-18 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:49.157635531 +0000 UTC m=+114.098790245" watchObservedRunningTime="2026-03-18 06:48:49.242306848 +0000 UTC m=+114.183461562" Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.242861 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.74285038 +0000 UTC m=+114.684005084 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.267608 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-66xqj" event={"ID":"7f1745b0-b256-4dac-a5b4-a61d28697707","Type":"ContainerStarted","Data":"192b24497dc323a2122843b8e944f68619a4a8b2d921fa419da709d0fd1077d7"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.273014 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6x6mh" event={"ID":"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b","Type":"ContainerStarted","Data":"577ff93f1acc5dd11400626982599457d352edc61c4d5959e79fe34f8ebe8a9a"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.274455 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.285206 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.322496 4917 patch_prober.go:28] interesting pod/console-operator-58897d9998-6x6mh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/readyz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.323091 4917 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6x6mh" podUID="2c47e8a8-91a9-47dc-b379-5dcbaae96a7b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/readyz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.331189 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" event={"ID":"4fb3fe10-f693-4e74-b72a-f2b10dac3580","Type":"ContainerStarted","Data":"f0ef64df568eed3878d26b88f65da6a9fdb0ffc40e5eff6a17b35bb9450f8222"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.344183 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.344254 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.844240095 +0000 UTC m=+114.785394809 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.344935 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.346640 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.846632039 +0000 UTC m=+114.787786753 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.358069 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" event={"ID":"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf","Type":"ContainerStarted","Data":"381329de58c59029360ddf59f32e5225e47d9881b0925740f45c4bd409521c43"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.396670 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-66xqj" podStartSLOduration=46.396653052 podStartE2EDuration="46.396653052s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:49.368880383 +0000 UTC m=+114.310035117" watchObservedRunningTime="2026-03-18 06:48:49.396653052 +0000 UTC m=+114.337807766" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.396947 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6x6mh" podStartSLOduration=47.396943228 podStartE2EDuration="47.396943228s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:49.392659171 +0000 UTC m=+114.333813885" watchObservedRunningTime="2026-03-18 06:48:49.396943228 +0000 UTC m=+114.338097942" Mar 18 06:48:49 crc kubenswrapper[4917]: 
I0318 06:48:49.399356 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" event={"ID":"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5","Type":"ContainerStarted","Data":"947dbfdde6a456d4b2c22ea14b481860f747497d0135e37043218260d7f3f5c0"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.434287 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" podStartSLOduration=47.434269724 podStartE2EDuration="47.434269724s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:49.430736833 +0000 UTC m=+114.371891547" watchObservedRunningTime="2026-03-18 06:48:49.434269724 +0000 UTC m=+114.375424438" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.442798 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.449147 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.449477 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:49.949463048 +0000 UTC m=+114.890617762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: W0318 06:48:49.468154 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39975bd4_f79a_464b_a4cc_65220c2ee731.slice/crio-313ef719145b0f70f3d2f7bf8bb920de5358b8ed5ea8dda1d16174a3bfff1910 WatchSource:0}: Error finding container 313ef719145b0f70f3d2f7bf8bb920de5358b8ed5ea8dda1d16174a3bfff1910: Status 404 returned error can't find the container with id 313ef719145b0f70f3d2f7bf8bb920de5358b8ed5ea8dda1d16174a3bfff1910 Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.468545 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" event={"ID":"726838f6-bf51-40d1-83ac-c55ef671bf16","Type":"ContainerStarted","Data":"925d35165b43ae3656fd268f644345bf4983bb914212d92be209b2375fcb034d"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.477349 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" event={"ID":"6940780c-6aab-4109-bb07-78b1ec265159","Type":"ContainerStarted","Data":"9572636ca752dd7bc708225aad88f9ab585cdd6759e7d322f2fe7238dc589d1f"} Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.493188 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.501110 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:49 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:49 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:49 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.501154 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.522183 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nbchd" podStartSLOduration=47.522168374 podStartE2EDuration="47.522168374s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:49.520598257 +0000 UTC m=+114.461752961" watchObservedRunningTime="2026-03-18 06:48:49.522168374 +0000 UTC m=+114.463323078" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.552181 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.555857 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 06:48:50.055839496 +0000 UTC m=+114.996994210 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.560769 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" podStartSLOduration=47.560756087 podStartE2EDuration="47.560756087s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:49.559892578 +0000 UTC m=+114.501047292" watchObservedRunningTime="2026-03-18 06:48:49.560756087 +0000 UTC m=+114.501910791" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.583512 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.583593 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-k4f7n"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.626652 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.635496 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb"] Mar 18 06:48:49 crc 
kubenswrapper[4917]: I0318 06:48:49.641275 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.641315 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8v8rg"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.657555 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d6cp4"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.658194 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.658326 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.158308575 +0000 UTC m=+115.099463289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.658429 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.658681 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.158674693 +0000 UTC m=+115.099829407 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.687022 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.742556 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.761696 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.765976 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.265949242 +0000 UTC m=+115.207103956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.772903 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.783639 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.850658 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-gcd99"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.863725 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.864051 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.364036813 +0000 UTC m=+115.305191527 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.876162 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx"] Mar 18 06:48:49 crc kubenswrapper[4917]: W0318 06:48:49.899103 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a676eea_cd1e_4098_95ec_48cdeff1a5c9.slice/crio-75b2444ce5342bfc192dac1d925e46b5c91688dae2807803b257a6ea3e5fa986 WatchSource:0}: Error finding container 75b2444ce5342bfc192dac1d925e46b5c91688dae2807803b257a6ea3e5fa986: Status 404 returned error can't find the container with id 75b2444ce5342bfc192dac1d925e46b5c91688dae2807803b257a6ea3e5fa986 Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.905561 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.927538 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-874w9"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.948222 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.958111 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 
06:48:49.958524 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.963125 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5dkf8"] Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.965155 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:49 crc kubenswrapper[4917]: E0318 06:48:49.965746 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.465725384 +0000 UTC m=+115.406880098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:49 crc kubenswrapper[4917]: I0318 06:48:49.976138 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ww4d6"] Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.072416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.074636 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.574570389 +0000 UTC m=+115.515725103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.186549 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.186858 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.68684367 +0000 UTC m=+115.627998384 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.213796 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.291064 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.291329 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.791318916 +0000 UTC m=+115.732473630 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.393209 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.393513 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.893499229 +0000 UTC m=+115.834653943 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.494855 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.495505 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:50.995494208 +0000 UTC m=+115.936648922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.497102 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:50 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:50 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:50 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.497132 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.537013 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" event={"ID":"4a676eea-cd1e-4098-95ec-48cdeff1a5c9","Type":"ContainerStarted","Data":"75b2444ce5342bfc192dac1d925e46b5c91688dae2807803b257a6ea3e5fa986"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.599117 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.599210 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.099196806 +0000 UTC m=+116.040351520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.599388 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.599657 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.099650686 +0000 UTC m=+116.040805390 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.638983 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" event={"ID":"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf","Type":"ContainerStarted","Data":"659912ce2c9871e8ea370b036ead5d81a8045c0e10cab7f013ada1cece88925c"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.663694 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6cp4" event={"ID":"74e1a866-4233-4bbf-a1ba-209ffd3a9980","Type":"ContainerStarted","Data":"6e1fea6c3f3314a8113d9a91baa3a317abd938b0514dc7ee09cdde995662d504"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.667781 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" event={"ID":"c08b892f-c635-444b-a090-fe8fe6aed47b","Type":"ContainerStarted","Data":"9e2148411ea9f04d6eddc2e56729ec4c35d0b173f3dfb9009ad9610e6d39ab29"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.667822 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" event={"ID":"c08b892f-c635-444b-a090-fe8fe6aed47b","Type":"ContainerStarted","Data":"accde2a469a38b0e853a815a6296195995f2d4eb24fb0c0fb677dbf70f16c56c"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.670766 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" event={"ID":"39975bd4-f79a-464b-a4cc-65220c2ee731","Type":"ContainerStarted","Data":"ec7f4e8b612cb727451561a1ec45ab77dbfd14e43853778c9dbfd5f6e0f8e66c"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.670798 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" event={"ID":"39975bd4-f79a-464b-a4cc-65220c2ee731","Type":"ContainerStarted","Data":"313ef719145b0f70f3d2f7bf8bb920de5358b8ed5ea8dda1d16174a3bfff1910"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.675445 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" event={"ID":"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9","Type":"ContainerStarted","Data":"d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.676433 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.679443 4917 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-9xdsf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.679499 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" podUID="91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.687510 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" event={"ID":"c4745aae-34c6-474b-8d7b-c7e7ef9e43cb","Type":"ContainerStarted","Data":"d66057935ed0703514d6861f95f51d1c18402c8aa919b023358d3d64ed847389"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.687622 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.691526 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6x6mh" event={"ID":"2c47e8a8-91a9-47dc-b379-5dcbaae96a7b","Type":"ContainerStarted","Data":"7826a7ecc09db2401f07bf6503710097975590b079800b74b0c5b4a82c019ce0"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.692474 4917 patch_prober.go:28] interesting pod/console-operator-58897d9998-6x6mh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/readyz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.692502 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6x6mh" podUID="2c47e8a8-91a9-47dc-b379-5dcbaae96a7b" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/readyz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.702360 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.705049 4917 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.205031571 +0000 UTC m=+116.146186285 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.711413 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-zxffn" podStartSLOduration=48.711392125 podStartE2EDuration="48.711392125s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:50.685041159 +0000 UTC m=+115.626195883" watchObservedRunningTime="2026-03-18 06:48:50.711392125 +0000 UTC m=+115.652546839" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.711763 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" podStartSLOduration=48.711757763 podStartE2EDuration="48.711757763s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:50.710384572 +0000 UTC m=+115.651539306" watchObservedRunningTime="2026-03-18 06:48:50.711757763 +0000 UTC m=+115.652912487" Mar 18 
06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.734227 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" podStartSLOduration=48.734204201 podStartE2EDuration="48.734204201s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:50.731129332 +0000 UTC m=+115.672284046" watchObservedRunningTime="2026-03-18 06:48:50.734204201 +0000 UTC m=+115.675358915" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.761195 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" event={"ID":"4fb3fe10-f693-4e74-b72a-f2b10dac3580","Type":"ContainerStarted","Data":"14131f8d114a6571d653848e57ec575ed736c29e9530fb57225183c9c4b5a21d"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.778011 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" event={"ID":"708177a0-d2d0-4e6c-9f75-3149faf99718","Type":"ContainerStarted","Data":"4400809ae5d6eb2f8198b6224ba779aadb7be3ac663f2d541c4175a5f2eb78e5"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.812448 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.812758 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-18 06:48:51.312747529 +0000 UTC m=+116.253902243 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.845746 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" event={"ID":"d7196170-031f-4cbe-ac17-50996ecc6fe6","Type":"ContainerStarted","Data":"acc0d33b2aef08ff3775e6e9200f32fb0f07b8b07fcdfb31eb881ccde48dfa3b"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.845788 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" event={"ID":"d7196170-031f-4cbe-ac17-50996ecc6fe6","Type":"ContainerStarted","Data":"0b547c7b0af7c6c30f4eadda9e0e20c5e46c57414b6b69562bbadcdee56744dd"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.877753 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" event={"ID":"8800cf27-b6c9-4859-ac70-ee3fc7b774fc","Type":"ContainerStarted","Data":"70487f0ed22a99f5f3034b63455872f0a3662e345fd3c0d9dbbb293057855783"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.910687 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pzpt5" podStartSLOduration=47.910671296 podStartE2EDuration="47.910671296s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:50.90992577 +0000 UTC m=+115.851080484" watchObservedRunningTime="2026-03-18 06:48:50.910671296 +0000 UTC m=+115.851826010" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.911431 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" podStartSLOduration=48.911426844 podStartE2EDuration="48.911426844s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:50.827680918 +0000 UTC m=+115.768835642" watchObservedRunningTime="2026-03-18 06:48:50.911426844 +0000 UTC m=+115.852581558" Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.913694 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.914449 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-874w9" event={"ID":"e10ffc7c-e1b0-4b26-a30d-687ef976191b","Type":"ContainerStarted","Data":"f12b2819655ef149da1a5611b6afda38ffb2f5a9f74cf3d2c190064e1bb15548"} Mar 18 06:48:50 crc kubenswrapper[4917]: E0318 06:48:50.915944 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.415926896 +0000 UTC m=+116.357081670 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.928383 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" event={"ID":"309bbd8c-0c3c-45bf-be12-3fb27218938c","Type":"ContainerStarted","Data":"6fd91cc38f94e380e8e05e3ad53f5149f2ca304bc379eac53656aff317d0110e"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.974434 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" event={"ID":"07e47035-8f83-4334-b82d-c3fa26dfe8f9","Type":"ContainerStarted","Data":"443b63554af3f3f2396ba66410b2edce3834e3c34c9b916d824df309eba00fe1"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.998115 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" event={"ID":"5beb9e93-3da2-4bc9-b40a-24406435d739","Type":"ContainerStarted","Data":"0b90c2048d0011f924d4f8daccda6e6cd7e79abc067db510347e7150c55bec5e"} Mar 18 06:48:50 crc kubenswrapper[4917]: I0318 06:48:50.998404 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" event={"ID":"5beb9e93-3da2-4bc9-b40a-24406435d739","Type":"ContainerStarted","Data":"488b37d7da9d86735df3a563a5940282f898036f13a84a2b9426713ce2ef2b9b"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.011196 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" event={"ID":"1d145faa-cd72-4149-b5fa-cd47f39e1c27","Type":"ContainerStarted","Data":"052d0eee71f39e7b269db12533a8239f46c9799badd1d550b0750860ddc477be"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.011246 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" event={"ID":"1d145faa-cd72-4149-b5fa-cd47f39e1c27","Type":"ContainerStarted","Data":"a3948f9461511facec23413da04a322cf15e060e35790fae7eec898dc891a6e6"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.012219 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.017674 4917 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-zhd5k container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.017714 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" podUID="1d145faa-cd72-4149-b5fa-cd47f39e1c27" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.018533 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:51 crc 
kubenswrapper[4917]: E0318 06:48:51.019986 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.519969371 +0000 UTC m=+116.461124085 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.033130 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60774: no serving certificate available for the kubelet" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.033783 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ww4d6" event={"ID":"63affb66-9eb5-40ac-9b60-6ff9af511233","Type":"ContainerStarted","Data":"6c67c2c9a433bf496ddfae45862ab39d097ffcc344db33b3e3b447a9fcc78713"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.043471 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" event={"ID":"d1ce22c0-a5d2-493b-a3a1-d4d1cd2cf2dd","Type":"ContainerStarted","Data":"d7da22e40640e75156b90bf604b91b19898443b6847e4c20c27d13ad5c9c00ff"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.044664 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jrjf8" podStartSLOduration=48.04465268 podStartE2EDuration="48.04465268s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.0402269 +0000 UTC m=+115.981381624" watchObservedRunningTime="2026-03-18 06:48:51.04465268 +0000 UTC m=+115.985807394" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.067276 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" podStartSLOduration=48.067257901 podStartE2EDuration="48.067257901s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.066195837 +0000 UTC m=+116.007350551" watchObservedRunningTime="2026-03-18 06:48:51.067257901 +0000 UTC m=+116.008412625" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.076881 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" event={"ID":"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5","Type":"ContainerStarted","Data":"8a18b23b6c63de5741a82af5b3a30bea2165c7cb68e45c1130edebc538f232bf"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.114955 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pws8t" podStartSLOduration=48.114937461 podStartE2EDuration="48.114937461s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.105979108 +0000 UTC m=+116.047133822" watchObservedRunningTime="2026-03-18 06:48:51.114937461 +0000 UTC m=+116.056092175" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.119714 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.120162 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.620146659 +0000 UTC m=+116.561301373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.122187 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60788: no serving certificate available for the kubelet" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.125697 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" event={"ID":"a7bf25c7-03d6-49a6-a981-8517bcadee69","Type":"ContainerStarted","Data":"5fe9e9f3dd04dbf893b239222ed1fd05c7e3d379f3fba86e4f94a3078444bd03"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.137015 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" podStartSLOduration=48.13699779 podStartE2EDuration="48.13699779s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.136065659 +0000 UTC m=+116.077220373" watchObservedRunningTime="2026-03-18 06:48:51.13699779 +0000 UTC m=+116.078152504" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.154915 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" event={"ID":"a1a7d05d-7034-44fe-bfa6-ea31a61a9df5","Type":"ContainerStarted","Data":"6a74341ba6422aa3e73fa23f47547b74e773237e1739f32ebb4acd7ef64bcc51"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.200026 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" event={"ID":"6381840d-a689-48da-803a-06ed230e7a62","Type":"ContainerStarted","Data":"c3de9ebc2f60d977e719baf75bb9399f79aada9770affe715d6c5724113b9480"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.200917 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.212385 4917 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6s8jm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.212440 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" podUID="6381840d-a689-48da-803a-06ed230e7a62" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 
06:48:51.221156 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.223151 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.72314019 +0000 UTC m=+116.664294904 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.231911 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60790: no serving certificate available for the kubelet" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.237744 4917 patch_prober.go:28] interesting pod/apiserver-76f77b778f-cm4kv container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]log ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]etcd ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/generic-apiserver-start-informers ok Mar 18 
06:48:51 crc kubenswrapper[4917]: [+]poststarthook/max-in-flight-filter ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 18 06:48:51 crc kubenswrapper[4917]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 18 06:48:51 crc kubenswrapper[4917]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/project.openshift.io-projectcache ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/openshift.io-startinformers ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 18 06:48:51 crc kubenswrapper[4917]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 18 06:48:51 crc kubenswrapper[4917]: livez check failed Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.237796 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" podUID="4fb3fe10-f693-4e74-b72a-f2b10dac3580" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.239421 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-p97tv" podStartSLOduration=48.239405919 podStartE2EDuration="48.239405919s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.187004893 +0000 UTC m=+116.128159607" watchObservedRunningTime="2026-03-18 06:48:51.239405919 +0000 UTC 
m=+116.180560633" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.239789 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lwhm6" podStartSLOduration=48.239784277 podStartE2EDuration="48.239784277s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.23814478 +0000 UTC m=+116.179299494" watchObservedRunningTime="2026-03-18 06:48:51.239784277 +0000 UTC m=+116.180938991" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.249749 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bvz7t" event={"ID":"6940780c-6aab-4109-bb07-78b1ec265159","Type":"ContainerStarted","Data":"e2f45eee6beb0cd2419eeac3413f1d73b7963a7b35d9735f1e341f421b9c765c"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.268857 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" event={"ID":"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f","Type":"ContainerStarted","Data":"f9990c1921fba20fa0f10309d72cbd54d0bcbd2d99b59f7967338a8d40809493"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.283676 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" event={"ID":"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26","Type":"ContainerStarted","Data":"2cb5a86b18e339bb747f77f1ea22db0d407ca548dc3d90d43a6d4b27b8e36189"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.284695 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.297123 4917 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8v8rg 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.297170 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.300257 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" event={"ID":"491de775-4251-427e-9065-9560931583b0","Type":"ContainerStarted","Data":"8b506737bc27eec646157f4ecfb09a6f3e4b651c2e8e92acc3956983215a5b8e"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.300535 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" podStartSLOduration=48.300512242 podStartE2EDuration="48.300512242s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.284827616 +0000 UTC m=+116.225982330" watchObservedRunningTime="2026-03-18 06:48:51.300512242 +0000 UTC m=+116.241666956" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.321966 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 
06:48:51.322939 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.822918699 +0000 UTC m=+116.764073413 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.328264 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" event={"ID":"42365ba9-ba85-47ed-a93c-543cd2ce8d30","Type":"ContainerStarted","Data":"7532408d1c4c59728cb598ab828e362cef35ee6ce66333625cff44d446d86355"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.330755 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60806: no serving certificate available for the kubelet" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.352473 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" podStartSLOduration=48.352453158 podStartE2EDuration="48.352453158s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.319519462 +0000 UTC m=+116.260674176" watchObservedRunningTime="2026-03-18 06:48:51.352453158 +0000 UTC m=+116.293607882" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.377257 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5dkf8" event={"ID":"049d29dc-f129-448c-a11e-9ffbe7e44334","Type":"ContainerStarted","Data":"4f81998bc719b08b34fe043430a29297c95830f4911ec5d030aff845ff4b0efa"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.379854 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" event={"ID":"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc","Type":"ContainerStarted","Data":"dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0"} Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.379884 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.386527 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" podStartSLOduration=48.386508229 podStartE2EDuration="48.386508229s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.354418582 +0000 UTC m=+116.295573296" watchObservedRunningTime="2026-03-18 06:48:51.386508229 +0000 UTC m=+116.327662933" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.392270 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fj4nz" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.424473 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:51 crc 
kubenswrapper[4917]: E0318 06:48:51.426618 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:51.926605437 +0000 UTC m=+116.867760151 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.448657 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" podStartSLOduration=48.448637635 podStartE2EDuration="48.448637635s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.391932212 +0000 UTC m=+116.333086926" watchObservedRunningTime="2026-03-18 06:48:51.448637635 +0000 UTC m=+116.389792349" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.455781 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60814: no serving certificate available for the kubelet" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.470662 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.475334 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-pkg87" podStartSLOduration=49.47532255 
podStartE2EDuration="49.47532255s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.447863538 +0000 UTC m=+116.389018252" watchObservedRunningTime="2026-03-18 06:48:51.47532255 +0000 UTC m=+116.416477264" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.506198 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:51 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:51 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:51 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.506238 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.526255 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.527284 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.027270016 +0000 UTC m=+116.968424730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.527327 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.528082 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.028075263 +0000 UTC m=+116.969229967 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.535575 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60820: no serving certificate available for the kubelet" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.566729 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5dkf8" podStartSLOduration=8.566710638 podStartE2EDuration="8.566710638s" podCreationTimestamp="2026-03-18 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.515060089 +0000 UTC m=+116.456214803" watchObservedRunningTime="2026-03-18 06:48:51.566710638 +0000 UTC m=+116.507865352" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.578844 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" podStartSLOduration=8.578829842 podStartE2EDuration="8.578829842s" podCreationTimestamp="2026-03-18 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:51.576733875 +0000 UTC m=+116.517888579" watchObservedRunningTime="2026-03-18 06:48:51.578829842 +0000 UTC m=+116.519984556" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.633048 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.633166 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.133146582 +0000 UTC m=+117.074301296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.633794 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.634186 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.134178386 +0000 UTC m=+117.075333100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.639555 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60822: no serving certificate available for the kubelet" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.734769 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.735242 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.235224693 +0000 UTC m=+117.176379407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.752305 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60836: no serving certificate available for the kubelet" Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.833187 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-plhlj"] Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.836345 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.836678 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.33666282 +0000 UTC m=+117.277817534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.937456 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.937662 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.437642166 +0000 UTC m=+117.378796880 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:51 crc kubenswrapper[4917]: I0318 06:48:51.937938 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:51 crc kubenswrapper[4917]: E0318 06:48:51.938187 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.438174557 +0000 UTC m=+117.379329271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.038692 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.038785 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.538770575 +0000 UTC m=+117.479925289 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.039044 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.039377 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.539361928 +0000 UTC m=+117.480516642 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.140428 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.140650 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.640634851 +0000 UTC m=+117.581789565 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.140759 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.141035 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.64102923 +0000 UTC m=+117.582183944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.242385 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.242758 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.742734503 +0000 UTC m=+117.683889217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.343647 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.344274 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.84426357 +0000 UTC m=+117.785418284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.417937 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ww4d6" event={"ID":"63affb66-9eb5-40ac-9b60-6ff9af511233","Type":"ContainerStarted","Data":"63d171455cc69ea29506f9080bda769f750b63a39531584d26396253cd42f673"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.418002 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ww4d6" event={"ID":"63affb66-9eb5-40ac-9b60-6ff9af511233","Type":"ContainerStarted","Data":"ab3bf4c19d9711770a8396bc483364494f219adf5109b9dc7b0534cf8c23d99e"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.421365 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wgdbd" event={"ID":"491de775-4251-427e-9065-9560931583b0","Type":"ContainerStarted","Data":"d8201176c36d58238357d51e736434dd3112820e388035e67b1655e7ed390ebb"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.427133 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" event={"ID":"309bbd8c-0c3c-45bf-be12-3fb27218938c","Type":"ContainerStarted","Data":"fa5eaf9457e9dff99a1ba63cf95adb9017fac28525823c77063f95b3088b8bd8"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.427170 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" 
event={"ID":"309bbd8c-0c3c-45bf-be12-3fb27218938c","Type":"ContainerStarted","Data":"46f1034bce7c7674eecc0b4e5fccb542b5054572bc628c7eb43e6fa0664b82b9"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.435727 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" event={"ID":"07e47035-8f83-4334-b82d-c3fa26dfe8f9","Type":"ContainerStarted","Data":"b1c5eaa480d7a1df6cced21d6cf8f10dc0f5755012ae7cc80f6b0120a9f3b9dc"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.436394 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.443052 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ww4d6" podStartSLOduration=50.443041977 podStartE2EDuration="50.443041977s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:52.442125656 +0000 UTC m=+117.383280370" watchObservedRunningTime="2026-03-18 06:48:52.443041977 +0000 UTC m=+117.384196691" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.446824 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.446839 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" 
event={"ID":"44297a9f-ff7c-4daf-b6e1-d07eeac4aeaf","Type":"ContainerStarted","Data":"f553d396b80dfacd4d909f0f242e31b9bf71080773c5a5a57768f4249d4863a5"} Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.447177 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.94715847 +0000 UTC m=+117.888313184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.447402 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.448197 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:52.948177353 +0000 UTC m=+117.889332067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.453095 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" event={"ID":"d7196170-031f-4cbe-ac17-50996ecc6fe6","Type":"ContainerStarted","Data":"79fb1551d6c816066325306847a3956291d762c1a8930bd38a08649db970da18"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.454411 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.476046 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60838: no serving certificate available for the kubelet" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.478240 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-k4f7n" podStartSLOduration=49.478216843 podStartE2EDuration="49.478216843s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:52.476262208 +0000 UTC m=+117.417416922" watchObservedRunningTime="2026-03-18 06:48:52.478216843 +0000 UTC m=+117.419371557" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.490722 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.507716 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:52 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:52 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:52 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.507790 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.513724 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-spgtc" podStartSLOduration=49.513703456 podStartE2EDuration="49.513703456s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:52.510158437 +0000 UTC m=+117.451313151" watchObservedRunningTime="2026-03-18 06:48:52.513703456 +0000 UTC m=+117.454858170" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.516973 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5dkf8" event={"ID":"049d29dc-f129-448c-a11e-9ffbe7e44334","Type":"ContainerStarted","Data":"635a3149f6e152eafdf58a7e444a5b4e7730614eac7f0f0993bb64fdb8da4609"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.547064 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb" podStartSLOduration=49.547047001 podStartE2EDuration="49.547047001s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:52.546064919 +0000 UTC m=+117.487219643" watchObservedRunningTime="2026-03-18 06:48:52.547047001 +0000 UTC m=+117.488201705" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.549174 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.550079 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" event={"ID":"4a676eea-cd1e-4098-95ec-48cdeff1a5c9","Type":"ContainerStarted","Data":"d1951d9bd6917bd4dc076c12b01208b85ad1ef9cc8966b5b4532e4eeafb78193"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.550149 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" event={"ID":"4a676eea-cd1e-4098-95ec-48cdeff1a5c9","Type":"ContainerStarted","Data":"27b4aaaa5dead44bcec0d4ff0bc67e0f03d65565e3ab09ea441498153d02672c"} Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.552761 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.05272009 +0000 UTC m=+117.993874804 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.567397 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" event={"ID":"708177a0-d2d0-4e6c-9f75-3149faf99718","Type":"ContainerStarted","Data":"2a2f7a1bc706863876e9e8037a15dd3e544ef1aaec0f14811799625041125d46"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.567448 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" event={"ID":"708177a0-d2d0-4e6c-9f75-3149faf99718","Type":"ContainerStarted","Data":"4ee2d77eee9934d811f25035280fc555b51ed3bb49b24f19a0d9c165690a82a6"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.569034 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-gcd99" event={"ID":"0ebb2697-baf2-4d66-b4c0-6b1b8bef646f","Type":"ContainerStarted","Data":"99995d5b6021f9b9fa63f2ff8382f4b7c19206ef13db6a03774605eb8d2e00c6"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.576649 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-874w9" event={"ID":"e10ffc7c-e1b0-4b26-a30d-687ef976191b","Type":"ContainerStarted","Data":"b370a03b69c00aca62e1b31764797059e831f147b1cf1b9a1a058fe39d97f09c"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.585331 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" 
event={"ID":"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26","Type":"ContainerStarted","Data":"5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.590980 4917 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8v8rg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.591037 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.593042 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6cp4" event={"ID":"74e1a866-4233-4bbf-a1ba-209ffd3a9980","Type":"ContainerStarted","Data":"1d6a303dc5e39ca5ed435df5f7e34a2e181c9ed5f30528ef7709b182af633c94"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.593101 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d6cp4" event={"ID":"74e1a866-4233-4bbf-a1ba-209ffd3a9980","Type":"ContainerStarted","Data":"4a8b4d49f9b3bdd59ee2d6d743d0429259f1b671eb62314a1762d03ee1883022"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.594067 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-d6cp4" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.602174 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-8wrcr" 
event={"ID":"16f3668e-a1a2-4c57-b00a-f10a3bbe15a5","Type":"ContainerStarted","Data":"c883c4742c8b9b9c484824cdb2a2ad9ae319a788644621ed5ea6b7f54e08d3d1"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.623678 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" event={"ID":"6381840d-a689-48da-803a-06ed230e7a62","Type":"ContainerStarted","Data":"fa4684b69d2f6d2be5176a34fc14ff6b6113ab4b2d4e0f5fb60472a86b06a48f"} Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.631909 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nkz5k" podStartSLOduration=50.631886242 podStartE2EDuration="50.631886242s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:52.62474738 +0000 UTC m=+117.565902094" watchObservedRunningTime="2026-03-18 06:48:52.631886242 +0000 UTC m=+117.573040956" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.635355 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6x6mh" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.643491 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-zhd5k" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.653448 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 
06:48:52.662778 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.664344 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.164327617 +0000 UTC m=+118.105482541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.666907 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d6cp4" podStartSLOduration=9.666882015 podStartE2EDuration="9.666882015s" podCreationTimestamp="2026-03-18 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:52.663493297 +0000 UTC m=+117.604648021" watchObservedRunningTime="2026-03-18 06:48:52.666882015 +0000 UTC m=+117.608036739" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.754098 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.754893 4917 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.254876936 +0000 UTC m=+118.196031650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.780395 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n67xx" podStartSLOduration=49.780376524 podStartE2EDuration="49.780376524s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:52.727935446 +0000 UTC m=+117.669090160" watchObservedRunningTime="2026-03-18 06:48:52.780376524 +0000 UTC m=+117.721531228" Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.818440 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-jchjc" podStartSLOduration=49.818421495 podStartE2EDuration="49.818421495s" podCreationTimestamp="2026-03-18 06:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:52.781517029 +0000 UTC m=+117.722671733" watchObservedRunningTime="2026-03-18 06:48:52.818421495 +0000 UTC m=+117.759576209" Mar 18 06:48:52 
crc kubenswrapper[4917]: I0318 06:48:52.863738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.864061 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.364050017 +0000 UTC m=+118.305204731 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.964621 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.964834 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-18 06:48:53.464809228 +0000 UTC m=+118.405963942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:52 crc kubenswrapper[4917]: I0318 06:48:52.964946 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:52 crc kubenswrapper[4917]: E0318 06:48:52.965258 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.465247449 +0000 UTC m=+118.406402163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.042927 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-h4w6d" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.076209 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.076749 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.576734402 +0000 UTC m=+118.517889116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.178620 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.178964 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.678953376 +0000 UTC m=+118.620108080 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.280205 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.280396 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.780370493 +0000 UTC m=+118.721525207 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.280548 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.280833 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.780821963 +0000 UTC m=+118.721976677 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.381853 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.382100 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.882074134 +0000 UTC m=+118.823228848 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.382345 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.382626 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.882619487 +0000 UTC m=+118.823774201 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.390041 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6s8jm" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.472921 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lgc4l"] Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.473433 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" podUID="cb1793e3-4625-4f46-8be2-d75629f46371" containerName="controller-manager" containerID="cri-o://f69a9f028e36ae4e0ea6c3aec9fba467c7cd6da72cf93e485a8a4d08f60f5f39" gracePeriod=30 Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.483342 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.483453 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.483536 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.484177 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:53.984157686 +0000 UTC m=+118.925312400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.484528 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.493258 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.496536 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:53 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:53 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:53 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.496598 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.521754 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw"] Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.521964 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" podUID="76c2512e-6f87-4731-a42f-eddd2188ff59" containerName="route-controller-manager" containerID="cri-o://db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44" gracePeriod=30 Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.572782 4917 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 
06:48:53.584566 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.584643 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.584696 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.585478 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:54.08546607 +0000 UTC m=+119.026620784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.587955 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.590189 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.594344 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.606404 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.611384 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.666978 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-874w9" event={"ID":"e10ffc7c-e1b0-4b26-a30d-687ef976191b","Type":"ContainerStarted","Data":"edc3ac9c296fa5e44b29a2af315637dd74b7cce05beba924e68a30f815e92d97"} Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.667020 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-874w9" event={"ID":"e10ffc7c-e1b0-4b26-a30d-687ef976191b","Type":"ContainerStarted","Data":"de923cc93081196e7c0f031538702f50fbca9db944c5744fa2e318e086116b7e"} Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.682281 4917 generic.go:334] "Generic (PLEG): container finished" podID="cb1793e3-4625-4f46-8be2-d75629f46371" containerID="f69a9f028e36ae4e0ea6c3aec9fba467c7cd6da72cf93e485a8a4d08f60f5f39" exitCode=0 Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.683000 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" event={"ID":"cb1793e3-4625-4f46-8be2-d75629f46371","Type":"ContainerDied","Data":"f69a9f028e36ae4e0ea6c3aec9fba467c7cd6da72cf93e485a8a4d08f60f5f39"} Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.684496 4917 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-8v8rg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.684540 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.684560 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" gracePeriod=30 Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.687477 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.687762 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:54.187750565 +0000 UTC m=+119.128905279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.687898 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.688147 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:54.188140594 +0000 UTC m=+119.129295308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.790322 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.792034 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-18 06:48:54.292019805 +0000 UTC m=+119.233174519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.824033 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60852: no serving certificate available for the kubelet" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.877726 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.891536 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-client-ca\") pod \"cb1793e3-4625-4f46-8be2-d75629f46371\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.891577 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8th7\" (UniqueName: \"kubernetes.io/projected/cb1793e3-4625-4f46-8be2-d75629f46371-kube-api-access-n8th7\") pod \"cb1793e3-4625-4f46-8be2-d75629f46371\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.891696 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-proxy-ca-bundles\") pod \"cb1793e3-4625-4f46-8be2-d75629f46371\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.891742 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb1793e3-4625-4f46-8be2-d75629f46371-serving-cert\") pod \"cb1793e3-4625-4f46-8be2-d75629f46371\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.891767 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-config\") pod \"cb1793e3-4625-4f46-8be2-d75629f46371\" (UID: \"cb1793e3-4625-4f46-8be2-d75629f46371\") " Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.891918 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.892401 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-client-ca" (OuterVolumeSpecName: "client-ca") pod "cb1793e3-4625-4f46-8be2-d75629f46371" (UID: "cb1793e3-4625-4f46-8be2-d75629f46371"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.892888 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cb1793e3-4625-4f46-8be2-d75629f46371" (UID: "cb1793e3-4625-4f46-8be2-d75629f46371"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.894189 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-18 06:48:54.394170028 +0000 UTC m=+119.335324742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-hfll9" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.894316 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-config" (OuterVolumeSpecName: "config") pod "cb1793e3-4625-4f46-8be2-d75629f46371" (UID: "cb1793e3-4625-4f46-8be2-d75629f46371"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.906179 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb1793e3-4625-4f46-8be2-d75629f46371-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cb1793e3-4625-4f46-8be2-d75629f46371" (UID: "cb1793e3-4625-4f46-8be2-d75629f46371"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.914567 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb1793e3-4625-4f46-8be2-d75629f46371-kube-api-access-n8th7" (OuterVolumeSpecName: "kube-api-access-n8th7") pod "cb1793e3-4625-4f46-8be2-d75629f46371" (UID: "cb1793e3-4625-4f46-8be2-d75629f46371"). InnerVolumeSpecName "kube-api-access-n8th7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.932899 4917 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-18T06:48:53.572809403Z","Handler":null,"Name":""} Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.939576 4917 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.939621 4917 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.966073 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-z9rcc"] Mar 18 06:48:53 crc kubenswrapper[4917]: E0318 06:48:53.966248 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb1793e3-4625-4f46-8be2-d75629f46371" containerName="controller-manager" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.966261 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb1793e3-4625-4f46-8be2-d75629f46371" containerName="controller-manager" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.966367 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb1793e3-4625-4f46-8be2-d75629f46371" containerName="controller-manager" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.966998 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.969503 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.976111 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z9rcc"] Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.992772 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.993033 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-catalog-content\") pod \"certified-operators-z9rcc\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.993132 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcvls\" (UniqueName: \"kubernetes.io/projected/552a2644-267a-4c40-8ee5-f91bd933e7f2-kube-api-access-hcvls\") pod \"certified-operators-z9rcc\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.993202 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-utilities\") pod \"certified-operators-z9rcc\" (UID: 
\"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.993325 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.993392 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8th7\" (UniqueName: \"kubernetes.io/projected/cb1793e3-4625-4f46-8be2-d75629f46371-kube-api-access-n8th7\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.993453 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.993506 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb1793e3-4625-4f46-8be2-d75629f46371-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:53 crc kubenswrapper[4917]: I0318 06:48:53.993564 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb1793e3-4625-4f46-8be2-d75629f46371-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.003107 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.094451 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.094805 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-catalog-content\") pod \"certified-operators-z9rcc\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.094932 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcvls\" (UniqueName: \"kubernetes.io/projected/552a2644-267a-4c40-8ee5-f91bd933e7f2-kube-api-access-hcvls\") pod \"certified-operators-z9rcc\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.095686 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-utilities\") pod \"certified-operators-z9rcc\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.095277 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-catalog-content\") pod \"certified-operators-z9rcc\" (UID: 
\"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.096040 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-utilities\") pod \"certified-operators-z9rcc\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.098547 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.098667 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.120665 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcvls\" (UniqueName: \"kubernetes.io/projected/552a2644-267a-4c40-8ee5-f91bd933e7f2-kube-api-access-hcvls\") pod \"certified-operators-z9rcc\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.143712 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-hfll9\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.171573 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6mhrt"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.173465 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.175478 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.184019 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.189866 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mhrt"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.196671 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-config\") pod \"76c2512e-6f87-4731-a42f-eddd2188ff59\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.196707 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-client-ca\") pod \"76c2512e-6f87-4731-a42f-eddd2188ff59\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.196777 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/76c2512e-6f87-4731-a42f-eddd2188ff59-serving-cert\") pod \"76c2512e-6f87-4731-a42f-eddd2188ff59\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.196872 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8txzs\" (UniqueName: \"kubernetes.io/projected/76c2512e-6f87-4731-a42f-eddd2188ff59-kube-api-access-8txzs\") pod \"76c2512e-6f87-4731-a42f-eddd2188ff59\" (UID: \"76c2512e-6f87-4731-a42f-eddd2188ff59\") " Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.197137 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-utilities\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.197162 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x6m4\" (UniqueName: \"kubernetes.io/projected/52df4f75-850c-4266-b94d-909e90669389-kube-api-access-5x6m4\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.197187 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-catalog-content\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.198399 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-client-ca" 
(OuterVolumeSpecName: "client-ca") pod "76c2512e-6f87-4731-a42f-eddd2188ff59" (UID: "76c2512e-6f87-4731-a42f-eddd2188ff59"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.203456 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-config" (OuterVolumeSpecName: "config") pod "76c2512e-6f87-4731-a42f-eddd2188ff59" (UID: "76c2512e-6f87-4731-a42f-eddd2188ff59"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.204265 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c2512e-6f87-4731-a42f-eddd2188ff59-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76c2512e-6f87-4731-a42f-eddd2188ff59" (UID: "76c2512e-6f87-4731-a42f-eddd2188ff59"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.204875 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c2512e-6f87-4731-a42f-eddd2188ff59-kube-api-access-8txzs" (OuterVolumeSpecName: "kube-api-access-8txzs") pod "76c2512e-6f87-4731-a42f-eddd2188ff59" (UID: "76c2512e-6f87-4731-a42f-eddd2188ff59"). InnerVolumeSpecName "kube-api-access-8txzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.255474 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.299289 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-utilities\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.299850 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x6m4\" (UniqueName: \"kubernetes.io/projected/52df4f75-850c-4266-b94d-909e90669389-kube-api-access-5x6m4\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.299882 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-catalog-content\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.299944 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76c2512e-6f87-4731-a42f-eddd2188ff59-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.299962 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8txzs\" (UniqueName: \"kubernetes.io/projected/76c2512e-6f87-4731-a42f-eddd2188ff59-kube-api-access-8txzs\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.299975 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.299984 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76c2512e-6f87-4731-a42f-eddd2188ff59-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.300199 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-utilities\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.300469 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-catalog-content\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.311807 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c77467474-sm4rm"] Mar 18 06:48:54 crc kubenswrapper[4917]: E0318 06:48:54.312037 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c2512e-6f87-4731-a42f-eddd2188ff59" containerName="route-controller-manager" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.312049 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c2512e-6f87-4731-a42f-eddd2188ff59" containerName="route-controller-manager" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.312174 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c2512e-6f87-4731-a42f-eddd2188ff59" containerName="route-controller-manager" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 
06:48:54.312559 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.327353 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c77467474-sm4rm"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.344258 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x6m4\" (UniqueName: \"kubernetes.io/projected/52df4f75-850c-4266-b94d-909e90669389-kube-api-access-5x6m4\") pod \"community-operators-6mhrt\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.353507 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.354235 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.362702 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.379070 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5svt4"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.380662 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.386673 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.388111 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5svt4"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402356 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-utilities\") pod \"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402393 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr82h\" (UniqueName: \"kubernetes.io/projected/4405ec92-0cef-47b9-a1bd-07ceda6138df-kube-api-access-sr82h\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402420 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqv5k\" (UniqueName: \"kubernetes.io/projected/11ad5fdc-a94e-46bb-b6d8-703862926e33-kube-api-access-wqv5k\") pod \"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402480 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlklc\" (UniqueName: 
\"kubernetes.io/projected/c89ac573-aadc-4bc4-8b50-4b06787e12e8-kube-api-access-nlklc\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402523 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-proxy-ca-bundles\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402540 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-client-ca\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402603 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c89ac573-aadc-4bc4-8b50-4b06787e12e8-serving-cert\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402621 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4405ec92-0cef-47b9-a1bd-07ceda6138df-serving-cert\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " 
pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.402640 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-config\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.408253 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-config\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.408307 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-client-ca\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.408346 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-catalog-content\") pod \"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: W0318 06:48:54.409227 4917 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-50807d6275377f79480e1a4f341dd9510b38ad98d6db680f8903b3c0fd30464e WatchSource:0}: Error finding container 50807d6275377f79480e1a4f341dd9510b38ad98d6db680f8903b3c0fd30464e: Status 404 returned error can't find the container with id 50807d6275377f79480e1a4f341dd9510b38ad98d6db680f8903b3c0fd30464e Mar 18 06:48:54 crc kubenswrapper[4917]: W0318 06:48:54.452876 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-9ff2b11126c2a701d9a4bc1f553ffe20ae9e8cd6b5a6f140818c1d3a847eca99 WatchSource:0}: Error finding container 9ff2b11126c2a701d9a4bc1f553ffe20ae9e8cd6b5a6f140818c1d3a847eca99: Status 404 returned error can't find the container with id 9ff2b11126c2a701d9a4bc1f553ffe20ae9e8cd6b5a6f140818c1d3a847eca99 Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.498185 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.499101 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:54 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:54 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:54 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.499155 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512772 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-catalog-content\") pod \"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512823 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-utilities\") pod \"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512844 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr82h\" (UniqueName: \"kubernetes.io/projected/4405ec92-0cef-47b9-a1bd-07ceda6138df-kube-api-access-sr82h\") pod 
\"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512869 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqv5k\" (UniqueName: \"kubernetes.io/projected/11ad5fdc-a94e-46bb-b6d8-703862926e33-kube-api-access-wqv5k\") pod \"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512891 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlklc\" (UniqueName: \"kubernetes.io/projected/c89ac573-aadc-4bc4-8b50-4b06787e12e8-kube-api-access-nlklc\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512930 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-proxy-ca-bundles\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512948 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-client-ca\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512966 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c89ac573-aadc-4bc4-8b50-4b06787e12e8-serving-cert\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.512980 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4405ec92-0cef-47b9-a1bd-07ceda6138df-serving-cert\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.513001 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-config\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.513018 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-config\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.513043 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-client-ca\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 
18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.517204 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-proxy-ca-bundles\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.517296 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-client-ca\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.517475 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-catalog-content\") pod \"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.518725 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-utilities\") pod \"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.518927 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-client-ca\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " 
pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.519721 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-config\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.526207 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c89ac573-aadc-4bc4-8b50-4b06787e12e8-serving-cert\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.526354 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4405ec92-0cef-47b9-a1bd-07ceda6138df-serving-cert\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.526546 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-config\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.538246 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqv5k\" (UniqueName: \"kubernetes.io/projected/11ad5fdc-a94e-46bb-b6d8-703862926e33-kube-api-access-wqv5k\") pod 
\"certified-operators-5svt4\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.538967 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlklc\" (UniqueName: \"kubernetes.io/projected/c89ac573-aadc-4bc4-8b50-4b06787e12e8-kube-api-access-nlklc\") pod \"controller-manager-7c77467474-sm4rm\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.539121 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr82h\" (UniqueName: \"kubernetes.io/projected/4405ec92-0cef-47b9-a1bd-07ceda6138df-kube-api-access-sr82h\") pod \"route-controller-manager-6dbdf68785-z68w8\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.568707 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m8w6h"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.571487 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.581183 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8w6h"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.602031 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hfll9"] Mar 18 06:48:54 crc kubenswrapper[4917]: W0318 06:48:54.618594 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9512feb_84a2_46b7_8df1_a672b069d7bc.slice/crio-58f464e54bad617c117f04ada1bc7b03514007489e20247928c1a7d1acb09cb9 WatchSource:0}: Error finding container 58f464e54bad617c117f04ada1bc7b03514007489e20247928c1a7d1acb09cb9: Status 404 returned error can't find the container with id 58f464e54bad617c117f04ada1bc7b03514007489e20247928c1a7d1acb09cb9 Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.646214 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.664461 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z9rcc"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.680227 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.708731 4917 generic.go:334] "Generic (PLEG): container finished" podID="76c2512e-6f87-4731-a42f-eddd2188ff59" containerID="db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44" exitCode=0 Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.708787 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" event={"ID":"76c2512e-6f87-4731-a42f-eddd2188ff59","Type":"ContainerDied","Data":"db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.708812 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" event={"ID":"76c2512e-6f87-4731-a42f-eddd2188ff59","Type":"ContainerDied","Data":"9657c34a37b65a52efa092305c647a36c3913ec928114b0f6bbfbb0427dcb40c"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.708831 4917 scope.go:117] "RemoveContainer" containerID="db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.708937 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.711148 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" event={"ID":"cb1793e3-4625-4f46-8be2-d75629f46371","Type":"ContainerDied","Data":"e30bd341967e23184c74d64f8e7d959637d8e67976efa1af028acc0c2102a7cf"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.711182 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lgc4l" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.712461 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" event={"ID":"d9512feb-84a2-46b7-8df1-a672b069d7bc","Type":"ContainerStarted","Data":"58f464e54bad617c117f04ada1bc7b03514007489e20247928c1a7d1acb09cb9"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.714248 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-utilities\") pod \"community-operators-m8w6h\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.714291 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6jsk\" (UniqueName: \"kubernetes.io/projected/7a48f389-623d-42a9-ad95-0e07c27b2eed-kube-api-access-v6jsk\") pod \"community-operators-m8w6h\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.714354 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-catalog-content\") pod \"community-operators-m8w6h\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.718554 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e637fbbf245ef50bb0d2c9153ab88a6a4077471e69cf2747bf17c4b68905213b"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.718609 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"50807d6275377f79480e1a4f341dd9510b38ad98d6db680f8903b3c0fd30464e"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.720277 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ee125a5bcd08616f70cb8746a09a468efdc6f18e1b11638afcc6901fd92aa2c1"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.720310 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9ff2b11126c2a701d9a4bc1f553ffe20ae9e8cd6b5a6f140818c1d3a847eca99"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.720467 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.722936 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-874w9" event={"ID":"e10ffc7c-e1b0-4b26-a30d-687ef976191b","Type":"ContainerStarted","Data":"c56406aba67acd90ecc25ceb17ec5ba340d88085e2c556cff05826e8049ef2f9"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.724654 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.734860 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"67dcf2e2aad82f72c285c467525ae1d7a1ec217013b7a30077eb3f770fa7ba62"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.734893 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"aa649609edb444188a626ad91a6793c0879623a7be56cb192554365f347e4c6a"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.757134 4917 generic.go:334] "Generic (PLEG): container finished" podID="39975bd4-f79a-464b-a4cc-65220c2ee731" containerID="ec7f4e8b612cb727451561a1ec45ab77dbfd14e43853778c9dbfd5f6e0f8e66c" exitCode=0 Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.759484 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" event={"ID":"39975bd4-f79a-464b-a4cc-65220c2ee731","Type":"ContainerDied","Data":"ec7f4e8b612cb727451561a1ec45ab77dbfd14e43853778c9dbfd5f6e0f8e66c"} Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.780984 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6mhrt"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.781371 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-874w9" podStartSLOduration=11.781356012 podStartE2EDuration="11.781356012s" podCreationTimestamp="2026-03-18 06:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
06:48:54.777411263 +0000 UTC m=+119.718565987" watchObservedRunningTime="2026-03-18 06:48:54.781356012 +0000 UTC m=+119.722510716" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.783251 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.788485 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cm4kv" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.806661 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lgc4l"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.811684 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lgc4l"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.815386 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-catalog-content\") pod \"community-operators-m8w6h\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.815476 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-utilities\") pod \"community-operators-m8w6h\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.815525 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6jsk\" (UniqueName: \"kubernetes.io/projected/7a48f389-623d-42a9-ad95-0e07c27b2eed-kube-api-access-v6jsk\") pod \"community-operators-m8w6h\" (UID: 
\"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.816331 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-catalog-content\") pod \"community-operators-m8w6h\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.816671 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-utilities\") pod \"community-operators-m8w6h\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.816909 4917 scope.go:117] "RemoveContainer" containerID="db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44" Mar 18 06:48:54 crc kubenswrapper[4917]: E0318 06:48:54.823022 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44\": container with ID starting with db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44 not found: ID does not exist" containerID="db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.823069 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44"} err="failed to get container status \"db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44\": rpc error: code = NotFound desc = could not find container \"db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44\": container with ID 
starting with db9932db9c5cfbe5ba2f26ade8052b4d35271e2bc00719699d4b01e4ed3b2c44 not found: ID does not exist" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.823098 4917 scope.go:117] "RemoveContainer" containerID="f69a9f028e36ae4e0ea6c3aec9fba467c7cd6da72cf93e485a8a4d08f60f5f39" Mar 18 06:48:54 crc kubenswrapper[4917]: W0318 06:48:54.830868 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52df4f75_850c_4266_b94d_909e90669389.slice/crio-d6e3a08a85af864915b0040a1d02ef50f44f2e15cf79d08a19597f3736af6186 WatchSource:0}: Error finding container d6e3a08a85af864915b0040a1d02ef50f44f2e15cf79d08a19597f3736af6186: Status 404 returned error can't find the container with id d6e3a08a85af864915b0040a1d02ef50f44f2e15cf79d08a19597f3736af6186 Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.848007 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6jsk\" (UniqueName: \"kubernetes.io/projected/7a48f389-623d-42a9-ad95-0e07c27b2eed-kube-api-access-v6jsk\") pod \"community-operators-m8w6h\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.865523 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.871462 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kpdbw"] Mar 18 06:48:54 crc kubenswrapper[4917]: I0318 06:48:54.912401 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.120096 4917 patch_prober.go:28] interesting pod/downloads-7954f5f757-zk4nl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.120411 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zk4nl" podUID="f4c8c2e4-554f-423e-81a2-cd9f63eb7250" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.120194 4917 patch_prober.go:28] interesting pod/downloads-7954f5f757-zk4nl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.120762 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zk4nl" podUID="f4c8c2e4-554f-423e-81a2-cd9f63eb7250" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.201794 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c77467474-sm4rm"] Mar 18 06:48:55 crc kubenswrapper[4917]: W0318 06:48:55.224684 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89ac573_aadc_4bc4_8b50_4b06787e12e8.slice/crio-f81b7782d85da6f15b648d3792c61e390558fbf89e1b55186707fc87412329e7 WatchSource:0}: Error 
finding container f81b7782d85da6f15b648d3792c61e390558fbf89e1b55186707fc87412329e7: Status 404 returned error can't find the container with id f81b7782d85da6f15b648d3792c61e390558fbf89e1b55186707fc87412329e7 Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.376724 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5svt4"] Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.439623 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m8w6h"] Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.502220 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:55 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:55 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:55 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.502328 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.651439 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8"] Mar 18 06:48:55 crc kubenswrapper[4917]: W0318 06:48:55.668838 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4405ec92_0cef_47b9_a1bd_07ceda6138df.slice/crio-943b6d2b26e2ef49cf2c9efb99bdb3c6aed8a82cf7e56cd08cb4b5783817e46a WatchSource:0}: Error finding container 
943b6d2b26e2ef49cf2c9efb99bdb3c6aed8a82cf7e56cd08cb4b5783817e46a: Status 404 returned error can't find the container with id 943b6d2b26e2ef49cf2c9efb99bdb3c6aed8a82cf7e56cd08cb4b5783817e46a Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.785304 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c2512e-6f87-4731-a42f-eddd2188ff59" path="/var/lib/kubelet/pods/76c2512e-6f87-4731-a42f-eddd2188ff59/volumes" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.786797 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.787485 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb1793e3-4625-4f46-8be2-d75629f46371" path="/var/lib/kubelet/pods/cb1793e3-4625-4f46-8be2-d75629f46371/volumes" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.789511 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" event={"ID":"c89ac573-aadc-4bc4-8b50-4b06787e12e8","Type":"ContainerStarted","Data":"3a231f0c8be252db9a5a038863f52b5deaebe79fdcf5a821cbbb2f3c7c144035"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.789553 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" event={"ID":"c89ac573-aadc-4bc4-8b50-4b06787e12e8","Type":"ContainerStarted","Data":"f81b7782d85da6f15b648d3792c61e390558fbf89e1b55186707fc87412329e7"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.789599 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.789617 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.789629 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" event={"ID":"d9512feb-84a2-46b7-8df1-a672b069d7bc","Type":"ContainerStarted","Data":"a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.789640 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" event={"ID":"4405ec92-0cef-47b9-a1bd-07ceda6138df","Type":"ContainerStarted","Data":"943b6d2b26e2ef49cf2c9efb99bdb3c6aed8a82cf7e56cd08cb4b5783817e46a"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.792234 4917 generic.go:334] "Generic (PLEG): container finished" podID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerID="187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a" exitCode=0 Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.792315 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9rcc" event={"ID":"552a2644-267a-4c40-8ee5-f91bd933e7f2","Type":"ContainerDied","Data":"187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.792351 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9rcc" event={"ID":"552a2644-267a-4c40-8ee5-f91bd933e7f2","Type":"ContainerStarted","Data":"a28a0d25ad783182c4169cf0bb0a9935b4f68c637bf5bcb794238b3f93320e1d"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.793814 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.795204 4917 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.795993 4917 generic.go:334] "Generic (PLEG): container finished" podID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerID="2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0" exitCode=0 Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.796060 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5svt4" event={"ID":"11ad5fdc-a94e-46bb-b6d8-703862926e33","Type":"ContainerDied","Data":"2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.796084 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5svt4" event={"ID":"11ad5fdc-a94e-46bb-b6d8-703862926e33","Type":"ContainerStarted","Data":"a2fbfeef147774529ccee50e65bb3cee5c22bbb40f64bc273503835d265b2af6"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.802221 4917 generic.go:334] "Generic (PLEG): container finished" podID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerID="59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434" exitCode=0 Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.802845 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8w6h" event={"ID":"7a48f389-623d-42a9-ad95-0e07c27b2eed","Type":"ContainerDied","Data":"59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.802887 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8w6h" event={"ID":"7a48f389-623d-42a9-ad95-0e07c27b2eed","Type":"ContainerStarted","Data":"27440f3578e16fac87b0153c3fe9596d9f90ab7a6bb2f6a272d0942a3c7b0b92"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.808507 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="52df4f75-850c-4266-b94d-909e90669389" containerID="06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354" exitCode=0 Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.810686 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mhrt" event={"ID":"52df4f75-850c-4266-b94d-909e90669389","Type":"ContainerDied","Data":"06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.810752 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mhrt" event={"ID":"52df4f75-850c-4266-b94d-909e90669389","Type":"ContainerStarted","Data":"d6e3a08a85af864915b0040a1d02ef50f44f2e15cf79d08a19597f3736af6186"} Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.974127 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mmg2f"] Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.976002 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:55 crc kubenswrapper[4917]: I0318 06:48:55.986087 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.006682 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmg2f"] Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.046790 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-utilities\") pod \"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.046939 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xvhm\" (UniqueName: \"kubernetes.io/projected/528debb6-ed0b-4099-a21d-81d1da5ba9f6-kube-api-access-2xvhm\") pod \"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.047035 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-catalog-content\") pod \"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.120560 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.120621 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.129301 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" podStartSLOduration=54.129264666 podStartE2EDuration="54.129264666s" podCreationTimestamp="2026-03-18 06:48:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:56.127111088 +0000 UTC m=+121.068265802" watchObservedRunningTime="2026-03-18 06:48:56.129264666 +0000 UTC m=+121.070419380" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.131452 4917 patch_prober.go:28] interesting pod/console-f9d7485db-sgj9g container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.131523 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sgj9g" podUID="7c21e973-7d87-496c-81ba-1425ba599774" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.147945 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-catalog-content\") pod \"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.148019 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-utilities\") pod 
\"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.148086 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xvhm\" (UniqueName: \"kubernetes.io/projected/528debb6-ed0b-4099-a21d-81d1da5ba9f6-kube-api-access-2xvhm\") pod \"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.151725 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-catalog-content\") pod \"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.151827 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-utilities\") pod \"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.171944 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xvhm\" (UniqueName: \"kubernetes.io/projected/528debb6-ed0b-4099-a21d-81d1da5ba9f6-kube-api-access-2xvhm\") pod \"redhat-marketplace-mmg2f\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.182200 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" podStartSLOduration=2.182176774 podStartE2EDuration="2.182176774s" 
podCreationTimestamp="2026-03-18 06:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:56.173799734 +0000 UTC m=+121.114954448" watchObservedRunningTime="2026-03-18 06:48:56.182176774 +0000 UTC m=+121.123331488" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.194476 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.250321 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdjlr\" (UniqueName: \"kubernetes.io/projected/39975bd4-f79a-464b-a4cc-65220c2ee731-kube-api-access-mdjlr\") pod \"39975bd4-f79a-464b-a4cc-65220c2ee731\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.250463 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39975bd4-f79a-464b-a4cc-65220c2ee731-secret-volume\") pod \"39975bd4-f79a-464b-a4cc-65220c2ee731\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.250502 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39975bd4-f79a-464b-a4cc-65220c2ee731-config-volume\") pod \"39975bd4-f79a-464b-a4cc-65220c2ee731\" (UID: \"39975bd4-f79a-464b-a4cc-65220c2ee731\") " Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.253844 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39975bd4-f79a-464b-a4cc-65220c2ee731-config-volume" (OuterVolumeSpecName: "config-volume") pod "39975bd4-f79a-464b-a4cc-65220c2ee731" (UID: "39975bd4-f79a-464b-a4cc-65220c2ee731"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.255208 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39975bd4-f79a-464b-a4cc-65220c2ee731-kube-api-access-mdjlr" (OuterVolumeSpecName: "kube-api-access-mdjlr") pod "39975bd4-f79a-464b-a4cc-65220c2ee731" (UID: "39975bd4-f79a-464b-a4cc-65220c2ee731"). InnerVolumeSpecName "kube-api-access-mdjlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.255701 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39975bd4-f79a-464b-a4cc-65220c2ee731-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "39975bd4-f79a-464b-a4cc-65220c2ee731" (UID: "39975bd4-f79a-464b-a4cc-65220c2ee731"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.311233 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.351878 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdjlr\" (UniqueName: \"kubernetes.io/projected/39975bd4-f79a-464b-a4cc-65220c2ee731-kube-api-access-mdjlr\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.351919 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/39975bd4-f79a-464b-a4cc-65220c2ee731-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.351935 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/39975bd4-f79a-464b-a4cc-65220c2ee731-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.368262 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mzdbq"] Mar 18 06:48:56 crc kubenswrapper[4917]: E0318 06:48:56.368494 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39975bd4-f79a-464b-a4cc-65220c2ee731" containerName="collect-profiles" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.368513 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="39975bd4-f79a-464b-a4cc-65220c2ee731" containerName="collect-profiles" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.368647 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="39975bd4-f79a-464b-a4cc-65220c2ee731" containerName="collect-profiles" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.369504 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.385205 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzdbq"] Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.416092 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60862: no serving certificate available for the kubelet" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.453140 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-utilities\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.453617 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-catalog-content\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.453672 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gts6j\" (UniqueName: \"kubernetes.io/projected/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-kube-api-access-gts6j\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.494513 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.506813 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:56 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:56 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:56 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.506869 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.554459 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-catalog-content\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.554569 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gts6j\" (UniqueName: \"kubernetes.io/projected/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-kube-api-access-gts6j\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.554628 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-utilities\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.556359 4917 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-utilities\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.556603 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-catalog-content\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.570813 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.594335 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gts6j\" (UniqueName: \"kubernetes.io/projected/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-kube-api-access-gts6j\") pod \"redhat-marketplace-mzdbq\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: E0318 06:48:56.610865 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.613840 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmg2f"] Mar 18 06:48:56 crc kubenswrapper[4917]: E0318 06:48:56.615695 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 06:48:56 crc kubenswrapper[4917]: E0318 06:48:56.629769 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 06:48:56 crc kubenswrapper[4917]: E0318 06:48:56.629822 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerName="kube-multus-additional-cni-plugins" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.640824 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.641540 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.643667 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.643904 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.655093 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.704033 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.756436 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.756521 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.844678 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmg2f" event={"ID":"528debb6-ed0b-4099-a21d-81d1da5ba9f6","Type":"ContainerStarted","Data":"c7903086137eafa2ddd8a8596a701bcf8cc924389310159ccdfdfd564d580227"} Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 
06:48:56.847310 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" event={"ID":"39975bd4-f79a-464b-a4cc-65220c2ee731","Type":"ContainerDied","Data":"313ef719145b0f70f3d2f7bf8bb920de5358b8ed5ea8dda1d16174a3bfff1910"} Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.847353 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.847367 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313ef719145b0f70f3d2f7bf8bb920de5358b8ed5ea8dda1d16174a3bfff1910" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.850672 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" event={"ID":"4405ec92-0cef-47b9-a1bd-07ceda6138df","Type":"ContainerStarted","Data":"695cc7feaedaea53ff8acc60e61f565975943c57150d4ab27dcfb1e9e450f5e1"} Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.851065 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.857473 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.857549 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.857643 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.866710 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.871799 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.872528 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.875258 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" podStartSLOduration=2.875226513 podStartE2EDuration="2.875226513s" podCreationTimestamp="2026-03-18 06:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:56.87328115 +0000 UTC m=+121.814435864" watchObservedRunningTime="2026-03-18 06:48:56.875226513 +0000 UTC m=+121.816381227" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.876874 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.877046 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.897232 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.899469 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.959167 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.959238 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:48:56 crc kubenswrapper[4917]: I0318 06:48:56.964764 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzdbq"] Mar 18 06:48:56 crc kubenswrapper[4917]: W0318 06:48:56.976870 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b2760aa_6e2d_48ef_8ed6_a0225e59df24.slice/crio-c38875a1a5a0df38cdad652ae5317afb6e50c2a952b8201c7c9778bb97e65295 WatchSource:0}: Error finding container 
c38875a1a5a0df38cdad652ae5317afb6e50c2a952b8201c7c9778bb97e65295: Status 404 returned error can't find the container with id c38875a1a5a0df38cdad652ae5317afb6e50c2a952b8201c7c9778bb97e65295 Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.023387 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.061275 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.061313 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.061383 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.090139 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:48:57 crc kubenswrapper[4917]: 
I0318 06:48:57.178927 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jlftl"] Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.194720 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.195413 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlftl"] Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.198711 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.200233 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.264399 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-utilities\") pod \"redhat-operators-jlftl\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.264463 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjkr5\" (UniqueName: \"kubernetes.io/projected/edae9f71-365b-48dd-91fa-4ad4d56dcc62-kube-api-access-fjkr5\") pod \"redhat-operators-jlftl\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.264506 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-catalog-content\") pod 
\"redhat-operators-jlftl\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.365426 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjkr5\" (UniqueName: \"kubernetes.io/projected/edae9f71-365b-48dd-91fa-4ad4d56dcc62-kube-api-access-fjkr5\") pod \"redhat-operators-jlftl\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.365975 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-catalog-content\") pod \"redhat-operators-jlftl\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.366044 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-utilities\") pod \"redhat-operators-jlftl\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.366630 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-utilities\") pod \"redhat-operators-jlftl\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.366809 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-catalog-content\") pod \"redhat-operators-jlftl\" (UID: 
\"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.391303 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjkr5\" (UniqueName: \"kubernetes.io/projected/edae9f71-365b-48dd-91fa-4ad4d56dcc62-kube-api-access-fjkr5\") pod \"redhat-operators-jlftl\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.496006 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:57 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:57 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:57 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.496132 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.514466 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.552927 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.575478 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ql22d"] Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.578028 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.581171 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ql22d"] Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.675379 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8r2l\" (UniqueName: \"kubernetes.io/projected/9cfbe58b-49c6-429c-a32f-f92719d730b8-kube-api-access-p8r2l\") pod \"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.675463 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-utilities\") pod \"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.675509 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-catalog-content\") pod \"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.682428 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 18 06:48:57 crc kubenswrapper[4917]: W0318 06:48:57.692174 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod08bb733a_1708_4ec6_8ea7_a53dbaa28b54.slice/crio-ac43293529b2903781a3d3c8ae8047cd6b656ca41f590eee6a64aa270c8ee1a2 WatchSource:0}: Error finding container 
ac43293529b2903781a3d3c8ae8047cd6b656ca41f590eee6a64aa270c8ee1a2: Status 404 returned error can't find the container with id ac43293529b2903781a3d3c8ae8047cd6b656ca41f590eee6a64aa270c8ee1a2 Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.779922 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-utilities\") pod \"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.777934 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-utilities\") pod \"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.780667 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-catalog-content\") pod \"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.780709 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8r2l\" (UniqueName: \"kubernetes.io/projected/9cfbe58b-49c6-429c-a32f-f92719d730b8-kube-api-access-p8r2l\") pod \"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.781322 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-catalog-content\") pod 
\"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.803096 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8r2l\" (UniqueName: \"kubernetes.io/projected/9cfbe58b-49c6-429c-a32f-f92719d730b8-kube-api-access-p8r2l\") pod \"redhat-operators-ql22d\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.880017 4917 generic.go:334] "Generic (PLEG): container finished" podID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerID="4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c" exitCode=0 Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.880197 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzdbq" event={"ID":"0b2760aa-6e2d-48ef-8ed6-a0225e59df24","Type":"ContainerDied","Data":"4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c"} Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.880226 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzdbq" event={"ID":"0b2760aa-6e2d-48ef-8ed6-a0225e59df24","Type":"ContainerStarted","Data":"c38875a1a5a0df38cdad652ae5317afb6e50c2a952b8201c7c9778bb97e65295"} Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.881690 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53","Type":"ContainerStarted","Data":"9e204ec135389c495d18c4a929a9cb0d9856404a9b1078375aca259ebd3421f7"} Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.883348 4917 generic.go:334] "Generic (PLEG): container finished" podID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerID="06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0" 
exitCode=0 Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.883379 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmg2f" event={"ID":"528debb6-ed0b-4099-a21d-81d1da5ba9f6","Type":"ContainerDied","Data":"06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0"} Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.885526 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"08bb733a-1708-4ec6-8ea7-a53dbaa28b54","Type":"ContainerStarted","Data":"ac43293529b2903781a3d3c8ae8047cd6b656ca41f590eee6a64aa270c8ee1a2"} Mar 18 06:48:57 crc kubenswrapper[4917]: I0318 06:48:57.917266 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.055453 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jlftl"] Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.231797 4917 ???:1] "http: TLS handshake error from 192.168.126.11:60874: no serving certificate available for the kubelet" Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.496480 4917 patch_prober.go:28] interesting pod/router-default-5444994796-66xqj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 18 06:48:58 crc kubenswrapper[4917]: [-]has-synced failed: reason withheld Mar 18 06:48:58 crc kubenswrapper[4917]: [+]process-running ok Mar 18 06:48:58 crc kubenswrapper[4917]: healthz check failed Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.496815 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-66xqj" podUID="7f1745b0-b256-4dac-a5b4-a61d28697707" containerName="router" probeResult="failure" output="HTTP 
probe failed with statuscode: 500" Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.716991 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ql22d"] Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.901199 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql22d" event={"ID":"9cfbe58b-49c6-429c-a32f-f92719d730b8","Type":"ContainerStarted","Data":"38a5c2b22340ffd9f2fd365d586b385ab7480c108ed7b91d755a1e95f7ddbc87"} Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.912341 4917 generic.go:334] "Generic (PLEG): container finished" podID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerID="4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5" exitCode=0 Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.912454 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlftl" event={"ID":"edae9f71-365b-48dd-91fa-4ad4d56dcc62","Type":"ContainerDied","Data":"4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5"} Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.912490 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlftl" event={"ID":"edae9f71-365b-48dd-91fa-4ad4d56dcc62","Type":"ContainerStarted","Data":"7f2a27a1e9dd39ace6ba7ef26f0f7a1025c1aa8a66318999e07e4791e6b86d7b"} Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.924618 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53","Type":"ContainerStarted","Data":"e03329debc22d595ea1856ba7ea1c22c4d8937c3071fee9e98f2d81fffdf5e50"} Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.934970 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"08bb733a-1708-4ec6-8ea7-a53dbaa28b54","Type":"ContainerStarted","Data":"86e97a78d59f0be95f4cfba3577ab2dfb0281a61b3a2d05dddb30c50075b8ff7"} Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.964513 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.964495101 podStartE2EDuration="2.964495101s" podCreationTimestamp="2026-03-18 06:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:58.945689465 +0000 UTC m=+123.886844179" watchObservedRunningTime="2026-03-18 06:48:58.964495101 +0000 UTC m=+123.905649815" Mar 18 06:48:58 crc kubenswrapper[4917]: I0318 06:48:58.965546 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.965523764 podStartE2EDuration="2.965523764s" podCreationTimestamp="2026-03-18 06:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:48:58.962604768 +0000 UTC m=+123.903759482" watchObservedRunningTime="2026-03-18 06:48:58.965523764 +0000 UTC m=+123.906678478" Mar 18 06:48:59 crc kubenswrapper[4917]: I0318 06:48:59.496474 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:59 crc kubenswrapper[4917]: I0318 06:48:59.502973 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-66xqj" Mar 18 06:48:59 crc kubenswrapper[4917]: I0318 06:48:59.957622 4917 generic.go:334] "Generic (PLEG): container finished" podID="08bb733a-1708-4ec6-8ea7-a53dbaa28b54" containerID="86e97a78d59f0be95f4cfba3577ab2dfb0281a61b3a2d05dddb30c50075b8ff7" exitCode=0 Mar 18 06:48:59 crc kubenswrapper[4917]: 
I0318 06:48:59.957885 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"08bb733a-1708-4ec6-8ea7-a53dbaa28b54","Type":"ContainerDied","Data":"86e97a78d59f0be95f4cfba3577ab2dfb0281a61b3a2d05dddb30c50075b8ff7"} Mar 18 06:48:59 crc kubenswrapper[4917]: I0318 06:48:59.969018 4917 generic.go:334] "Generic (PLEG): container finished" podID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerID="9735fb4ad291fe6ea14c9acc3392fd84e37a7d3fa85356ea99e4b102776705f8" exitCode=0 Mar 18 06:48:59 crc kubenswrapper[4917]: I0318 06:48:59.969099 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql22d" event={"ID":"9cfbe58b-49c6-429c-a32f-f92719d730b8","Type":"ContainerDied","Data":"9735fb4ad291fe6ea14c9acc3392fd84e37a7d3fa85356ea99e4b102776705f8"} Mar 18 06:48:59 crc kubenswrapper[4917]: I0318 06:48:59.988723 4917 generic.go:334] "Generic (PLEG): container finished" podID="21f2ac9e-cf0c-4b04-94fc-3280c1adcd53" containerID="e03329debc22d595ea1856ba7ea1c22c4d8937c3071fee9e98f2d81fffdf5e50" exitCode=0 Mar 18 06:48:59 crc kubenswrapper[4917]: I0318 06:48:59.988853 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53","Type":"ContainerDied","Data":"e03329debc22d595ea1856ba7ea1c22c4d8937c3071fee9e98f2d81fffdf5e50"} Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.389149 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.390161 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.446125 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kube-api-access\") pod \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\" (UID: \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\") " Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.446235 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kubelet-dir\") pod \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\" (UID: \"08bb733a-1708-4ec6-8ea7-a53dbaa28b54\") " Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.446278 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kube-api-access\") pod \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\" (UID: \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\") " Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.446307 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kubelet-dir\") pod \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\" (UID: \"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53\") " Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.446636 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "21f2ac9e-cf0c-4b04-94fc-3280c1adcd53" (UID: "21f2ac9e-cf0c-4b04-94fc-3280c1adcd53"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.446688 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "08bb733a-1708-4ec6-8ea7-a53dbaa28b54" (UID: "08bb733a-1708-4ec6-8ea7-a53dbaa28b54"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.469896 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "08bb733a-1708-4ec6-8ea7-a53dbaa28b54" (UID: "08bb733a-1708-4ec6-8ea7-a53dbaa28b54"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.471980 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "21f2ac9e-cf0c-4b04-94fc-3280c1adcd53" (UID: "21f2ac9e-cf0c-4b04-94fc-3280c1adcd53"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.547944 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.547986 4917 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/08bb733a-1708-4ec6-8ea7-a53dbaa28b54-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.547995 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.548003 4917 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21f2ac9e-cf0c-4b04-94fc-3280c1adcd53-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.563472 4917 ???:1] "http: TLS handshake error from 192.168.126.11:41634: no serving certificate available for the kubelet" Mar 18 06:49:01 crc kubenswrapper[4917]: I0318 06:49:01.989067 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-d6cp4" Mar 18 06:49:02 crc kubenswrapper[4917]: I0318 06:49:02.025951 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 18 06:49:02 crc kubenswrapper[4917]: I0318 06:49:02.025976 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"21f2ac9e-cf0c-4b04-94fc-3280c1adcd53","Type":"ContainerDied","Data":"9e204ec135389c495d18c4a929a9cb0d9856404a9b1078375aca259ebd3421f7"} Mar 18 06:49:02 crc kubenswrapper[4917]: I0318 06:49:02.026041 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e204ec135389c495d18c4a929a9cb0d9856404a9b1078375aca259ebd3421f7" Mar 18 06:49:02 crc kubenswrapper[4917]: I0318 06:49:02.047257 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"08bb733a-1708-4ec6-8ea7-a53dbaa28b54","Type":"ContainerDied","Data":"ac43293529b2903781a3d3c8ae8047cd6b656ca41f590eee6a64aa270c8ee1a2"} Mar 18 06:49:02 crc kubenswrapper[4917]: I0318 06:49:02.047307 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac43293529b2903781a3d3c8ae8047cd6b656ca41f590eee6a64aa270c8ee1a2" Mar 18 06:49:02 crc kubenswrapper[4917]: I0318 06:49:02.047365 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 18 06:49:05 crc kubenswrapper[4917]: I0318 06:49:05.561849 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zk4nl" Mar 18 06:49:05 crc kubenswrapper[4917]: I0318 06:49:05.561914 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:49:05 crc kubenswrapper[4917]: I0318 06:49:05.682958 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:49:06 crc kubenswrapper[4917]: I0318 06:49:06.128352 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:49:06 crc kubenswrapper[4917]: I0318 06:49:06.135463 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 06:49:06 crc kubenswrapper[4917]: E0318 06:49:06.608232 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 06:49:06 crc kubenswrapper[4917]: E0318 06:49:06.619057 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 06:49:06 crc kubenswrapper[4917]: E0318 06:49:06.621913 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: 
container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 06:49:06 crc kubenswrapper[4917]: E0318 06:49:06.622030 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerName="kube-multus-additional-cni-plugins" Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.050136 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c77467474-sm4rm"] Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.050424 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8"] Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.050563 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" podUID="4405ec92-0cef-47b9-a1bd-07ceda6138df" containerName="route-controller-manager" containerID="cri-o://695cc7feaedaea53ff8acc60e61f565975943c57150d4ab27dcfb1e9e450f5e1" gracePeriod=30 Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.050794 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" podUID="c89ac573-aadc-4bc4-8b50-4b06787e12e8" containerName="controller-manager" containerID="cri-o://3a231f0c8be252db9a5a038863f52b5deaebe79fdcf5a821cbbb2f3c7c144035" gracePeriod=30 Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.324044 4917 generic.go:334] "Generic (PLEG): container finished" podID="c89ac573-aadc-4bc4-8b50-4b06787e12e8" 
containerID="3a231f0c8be252db9a5a038863f52b5deaebe79fdcf5a821cbbb2f3c7c144035" exitCode=0 Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.324083 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" event={"ID":"c89ac573-aadc-4bc4-8b50-4b06787e12e8","Type":"ContainerDied","Data":"3a231f0c8be252db9a5a038863f52b5deaebe79fdcf5a821cbbb2f3c7c144035"} Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.326997 4917 generic.go:334] "Generic (PLEG): container finished" podID="4405ec92-0cef-47b9-a1bd-07ceda6138df" containerID="695cc7feaedaea53ff8acc60e61f565975943c57150d4ab27dcfb1e9e450f5e1" exitCode=0 Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.327032 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" event={"ID":"4405ec92-0cef-47b9-a1bd-07ceda6138df","Type":"ContainerDied","Data":"695cc7feaedaea53ff8acc60e61f565975943c57150d4ab27dcfb1e9e450f5e1"} Mar 18 06:49:13 crc kubenswrapper[4917]: I0318 06:49:13.785306 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 18 06:49:14 crc kubenswrapper[4917]: I0318 06:49:14.263738 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:49:14 crc kubenswrapper[4917]: I0318 06:49:14.328705 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=1.328675258 podStartE2EDuration="1.328675258s" podCreationTimestamp="2026-03-18 06:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:14.322345324 +0000 UTC m=+139.263500068" watchObservedRunningTime="2026-03-18 06:49:14.328675258 +0000 UTC m=+139.269830032" Mar 18 
06:49:14 crc kubenswrapper[4917]: I0318 06:49:14.647497 4917 patch_prober.go:28] interesting pod/controller-manager-7c77467474-sm4rm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body= Mar 18 06:49:14 crc kubenswrapper[4917]: I0318 06:49:14.647560 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" podUID="c89ac573-aadc-4bc4-8b50-4b06787e12e8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" Mar 18 06:49:14 crc kubenswrapper[4917]: I0318 06:49:14.680890 4917 patch_prober.go:28] interesting pod/route-controller-manager-6dbdf68785-z68w8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body= Mar 18 06:49:14 crc kubenswrapper[4917]: I0318 06:49:14.680956 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" podUID="4405ec92-0cef-47b9-a1bd-07ceda6138df" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.818051 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.827935 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.835998 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-client-ca\") pod \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.836038 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-proxy-ca-bundles\") pod \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.836117 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlklc\" (UniqueName: \"kubernetes.io/projected/c89ac573-aadc-4bc4-8b50-4b06787e12e8-kube-api-access-nlklc\") pod \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.836144 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-config\") pod \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.836169 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c89ac573-aadc-4bc4-8b50-4b06787e12e8-serving-cert\") pod \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\" (UID: \"c89ac573-aadc-4bc4-8b50-4b06787e12e8\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.836877 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-client-ca" (OuterVolumeSpecName: "client-ca") pod "c89ac573-aadc-4bc4-8b50-4b06787e12e8" (UID: "c89ac573-aadc-4bc4-8b50-4b06787e12e8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.837184 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c89ac573-aadc-4bc4-8b50-4b06787e12e8" (UID: "c89ac573-aadc-4bc4-8b50-4b06787e12e8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.837524 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-config" (OuterVolumeSpecName: "config") pod "c89ac573-aadc-4bc4-8b50-4b06787e12e8" (UID: "c89ac573-aadc-4bc4-8b50-4b06787e12e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.842512 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c89ac573-aadc-4bc4-8b50-4b06787e12e8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c89ac573-aadc-4bc4-8b50-4b06787e12e8" (UID: "c89ac573-aadc-4bc4-8b50-4b06787e12e8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.842825 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c89ac573-aadc-4bc4-8b50-4b06787e12e8-kube-api-access-nlklc" (OuterVolumeSpecName: "kube-api-access-nlklc") pod "c89ac573-aadc-4bc4-8b50-4b06787e12e8" (UID: "c89ac573-aadc-4bc4-8b50-4b06787e12e8"). InnerVolumeSpecName "kube-api-access-nlklc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.861314 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"] Mar 18 06:49:15 crc kubenswrapper[4917]: E0318 06:49:15.861658 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4405ec92-0cef-47b9-a1bd-07ceda6138df" containerName="route-controller-manager" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.861680 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4405ec92-0cef-47b9-a1bd-07ceda6138df" containerName="route-controller-manager" Mar 18 06:49:15 crc kubenswrapper[4917]: E0318 06:49:15.861699 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f2ac9e-cf0c-4b04-94fc-3280c1adcd53" containerName="pruner" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.861710 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f2ac9e-cf0c-4b04-94fc-3280c1adcd53" containerName="pruner" Mar 18 06:49:15 crc kubenswrapper[4917]: E0318 06:49:15.861726 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08bb733a-1708-4ec6-8ea7-a53dbaa28b54" containerName="pruner" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.861737 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="08bb733a-1708-4ec6-8ea7-a53dbaa28b54" containerName="pruner" Mar 18 06:49:15 crc kubenswrapper[4917]: E0318 06:49:15.861756 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c89ac573-aadc-4bc4-8b50-4b06787e12e8" containerName="controller-manager" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.861768 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c89ac573-aadc-4bc4-8b50-4b06787e12e8" containerName="controller-manager" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.862032 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f2ac9e-cf0c-4b04-94fc-3280c1adcd53" 
containerName="pruner" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.862055 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c89ac573-aadc-4bc4-8b50-4b06787e12e8" containerName="controller-manager" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.862071 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="08bb733a-1708-4ec6-8ea7-a53dbaa28b54" containerName="pruner" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.862085 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4405ec92-0cef-47b9-a1bd-07ceda6138df" containerName="route-controller-manager" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.862544 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.880334 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"] Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.936741 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-client-ca\") pod \"4405ec92-0cef-47b9-a1bd-07ceda6138df\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.936782 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-config\") pod \"4405ec92-0cef-47b9-a1bd-07ceda6138df\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.936819 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr82h\" (UniqueName: \"kubernetes.io/projected/4405ec92-0cef-47b9-a1bd-07ceda6138df-kube-api-access-sr82h\") pod 
\"4405ec92-0cef-47b9-a1bd-07ceda6138df\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.936864 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4405ec92-0cef-47b9-a1bd-07ceda6138df-serving-cert\") pod \"4405ec92-0cef-47b9-a1bd-07ceda6138df\" (UID: \"4405ec92-0cef-47b9-a1bd-07ceda6138df\") " Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.936983 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-serving-cert\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937018 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-client-ca\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937043 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-config\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937061 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-proxy-ca-bundles\") pod 
\"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937105 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c6r6\" (UniqueName: \"kubernetes.io/projected/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-kube-api-access-7c6r6\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937136 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c89ac573-aadc-4bc4-8b50-4b06787e12e8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937146 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937154 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937163 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlklc\" (UniqueName: \"kubernetes.io/projected/c89ac573-aadc-4bc4-8b50-4b06787e12e8-kube-api-access-nlklc\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.937171 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c89ac573-aadc-4bc4-8b50-4b06787e12e8-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.938089 
4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-config" (OuterVolumeSpecName: "config") pod "4405ec92-0cef-47b9-a1bd-07ceda6138df" (UID: "4405ec92-0cef-47b9-a1bd-07ceda6138df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.938226 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-client-ca" (OuterVolumeSpecName: "client-ca") pod "4405ec92-0cef-47b9-a1bd-07ceda6138df" (UID: "4405ec92-0cef-47b9-a1bd-07ceda6138df"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.941503 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4405ec92-0cef-47b9-a1bd-07ceda6138df-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4405ec92-0cef-47b9-a1bd-07ceda6138df" (UID: "4405ec92-0cef-47b9-a1bd-07ceda6138df"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:49:15 crc kubenswrapper[4917]: I0318 06:49:15.945772 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4405ec92-0cef-47b9-a1bd-07ceda6138df-kube-api-access-sr82h" (OuterVolumeSpecName: "kube-api-access-sr82h") pod "4405ec92-0cef-47b9-a1bd-07ceda6138df" (UID: "4405ec92-0cef-47b9-a1bd-07ceda6138df"). InnerVolumeSpecName "kube-api-access-sr82h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.037867 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c6r6\" (UniqueName: \"kubernetes.io/projected/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-kube-api-access-7c6r6\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.037960 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-serving-cert\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.038043 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-client-ca\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.038124 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-config\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.038182 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-proxy-ca-bundles\") pod 
\"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.038278 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4405ec92-0cef-47b9-a1bd-07ceda6138df-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.038310 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.038335 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4405ec92-0cef-47b9-a1bd-07ceda6138df-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.038403 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr82h\" (UniqueName: \"kubernetes.io/projected/4405ec92-0cef-47b9-a1bd-07ceda6138df-kube-api-access-sr82h\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.040955 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-proxy-ca-bundles\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.041318 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-client-ca\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " 
pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.043554 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-config\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.045796 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-serving-cert\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.054710 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c6r6\" (UniqueName: \"kubernetes.io/projected/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-kube-api-access-7c6r6\") pod \"controller-manager-b77d7bd8-vnwhd\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.196393 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.348715 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" event={"ID":"4405ec92-0cef-47b9-a1bd-07ceda6138df","Type":"ContainerDied","Data":"943b6d2b26e2ef49cf2c9efb99bdb3c6aed8a82cf7e56cd08cb4b5783817e46a"} Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.349102 4917 scope.go:117] "RemoveContainer" containerID="695cc7feaedaea53ff8acc60e61f565975943c57150d4ab27dcfb1e9e450f5e1" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.348731 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.353939 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" event={"ID":"c89ac573-aadc-4bc4-8b50-4b06787e12e8","Type":"ContainerDied","Data":"f81b7782d85da6f15b648d3792c61e390558fbf89e1b55186707fc87412329e7"} Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.354007 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c77467474-sm4rm" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.376033 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8"] Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.378812 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbdf68785-z68w8"] Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.379079 4917 scope.go:117] "RemoveContainer" containerID="3a231f0c8be252db9a5a038863f52b5deaebe79fdcf5a821cbbb2f3c7c144035" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.389038 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c77467474-sm4rm"] Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.391847 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c77467474-sm4rm"] Mar 18 06:49:16 crc kubenswrapper[4917]: E0318 06:49:16.483299 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89ac573_aadc_4bc4_8b50_4b06787e12e8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc89ac573_aadc_4bc4_8b50_4b06787e12e8.slice/crio-f81b7782d85da6f15b648d3792c61e390558fbf89e1b55186707fc87412329e7\": RecentStats: unable to find data in memory cache]" Mar 18 06:49:16 crc kubenswrapper[4917]: E0318 06:49:16.604833 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f 
/ready/ready"] Mar 18 06:49:16 crc kubenswrapper[4917]: E0318 06:49:16.607076 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 06:49:16 crc kubenswrapper[4917]: E0318 06:49:16.608938 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 18 06:49:16 crc kubenswrapper[4917]: E0318 06:49:16.608973 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerName="kube-multus-additional-cni-plugins" Mar 18 06:49:16 crc kubenswrapper[4917]: I0318 06:49:16.640441 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"] Mar 18 06:49:17 crc kubenswrapper[4917]: I0318 06:49:17.784977 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4405ec92-0cef-47b9-a1bd-07ceda6138df" path="/var/lib/kubelet/pods/4405ec92-0cef-47b9-a1bd-07ceda6138df/volumes" Mar 18 06:49:17 crc kubenswrapper[4917]: I0318 06:49:17.787061 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c89ac573-aadc-4bc4-8b50-4b06787e12e8" path="/var/lib/kubelet/pods/c89ac573-aadc-4bc4-8b50-4b06787e12e8/volumes" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.421375 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"] Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.422759 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.424732 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.424833 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.424939 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.424884 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.425105 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.429913 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.433128 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"] Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.468108 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxdt7\" (UniqueName: \"kubernetes.io/projected/efcd507f-70af-446b-93c3-8ac9ff8f5123-kube-api-access-gxdt7\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: 
\"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.468147 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-client-ca\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.468382 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-config\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.468423 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efcd507f-70af-446b-93c3-8ac9ff8f5123-serving-cert\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.569561 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxdt7\" (UniqueName: \"kubernetes.io/projected/efcd507f-70af-446b-93c3-8ac9ff8f5123-kube-api-access-gxdt7\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.569656 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-client-ca\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.569750 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-config\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.569796 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efcd507f-70af-446b-93c3-8ac9ff8f5123-serving-cert\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.570542 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-client-ca\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.571102 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-config\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " 
pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.579797 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efcd507f-70af-446b-93c3-8ac9ff8f5123-serving-cert\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.589434 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxdt7\" (UniqueName: \"kubernetes.io/projected/efcd507f-70af-446b-93c3-8ac9ff8f5123-kube-api-access-gxdt7\") pod \"route-controller-manager-566bd6cb4c-pw8rf\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:18 crc kubenswrapper[4917]: I0318 06:49:18.754817 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"
Mar 18 06:49:22 crc kubenswrapper[4917]: I0318 06:49:22.077047 4917 ???:1] "http: TLS handshake error from 192.168.126.11:51674: no serving certificate available for the kubelet"
Mar 18 06:49:24 crc kubenswrapper[4917]: I0318 06:49:24.423342 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-plhlj_6dfc23fb-d9d6-432b-b3dd-7334451b2cfc/kube-multus-additional-cni-plugins/0.log"
Mar 18 06:49:24 crc kubenswrapper[4917]: I0318 06:49:24.423776 4917 generic.go:334] "Generic (PLEG): container finished" podID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" exitCode=137
Mar 18 06:49:24 crc kubenswrapper[4917]: I0318 06:49:24.423805 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" event={"ID":"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc","Type":"ContainerDied","Data":"dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0"}
Mar 18 06:49:24 crc kubenswrapper[4917]: W0318 06:49:24.464431 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd530b22f_8e6c_4d2f_88a6_f1ec66bf0043.slice/crio-bb0e8fb4bb5aa7f5ca979050cec73a7b704b82e8dc1eae359335aecc2cb21c40 WatchSource:0}: Error finding container bb0e8fb4bb5aa7f5ca979050cec73a7b704b82e8dc1eae359335aecc2cb21c40: Status 404 returned error can't find the container with id bb0e8fb4bb5aa7f5ca979050cec73a7b704b82e8dc1eae359335aecc2cb21c40
Mar 18 06:49:25 crc kubenswrapper[4917]: I0318 06:49:25.429700 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" event={"ID":"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043","Type":"ContainerStarted","Data":"bb0e8fb4bb5aa7f5ca979050cec73a7b704b82e8dc1eae359335aecc2cb21c40"}
Mar 18 06:49:26 crc kubenswrapper[4917]: E0318 06:49:26.603292 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0 is running failed: container process not found" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 06:49:26 crc kubenswrapper[4917]: E0318 06:49:26.604330 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0 is running failed: container process not found" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 06:49:26 crc kubenswrapper[4917]: E0318 06:49:26.604695 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0 is running failed: container process not found" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 18 06:49:26 crc kubenswrapper[4917]: E0318 06:49:26.604732 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0 is running failed: container process not found" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerName="kube-multus-additional-cni-plugins"
Mar 18 06:49:26 crc kubenswrapper[4917]: I0318 06:49:26.859863 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7rqbb"
Mar 18 06:49:27 crc kubenswrapper[4917]: I0318 06:49:27.791156 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"]
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.447506 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.448933 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.450562 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.451363 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.451364 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.480166 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=1.480111192 podStartE2EDuration="1.480111192s" podCreationTimestamp="2026-03-18 06:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:28.479517567 +0000 UTC m=+153.420672281" watchObservedRunningTime="2026-03-18 06:49:28.480111192 +0000 UTC m=+153.421265916"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.521337 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.521642 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.623668 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.623787 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.623912 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.644431 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:28 crc kubenswrapper[4917]: I0318 06:49:28.793160 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.628563 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-plhlj_6dfc23fb-d9d6-432b-b3dd-7334451b2cfc/kube-multus-additional-cni-plugins/0.log"
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.629172 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj"
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.636722 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-ready\") pod \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") "
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.636757 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q92ks\" (UniqueName: \"kubernetes.io/projected/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-kube-api-access-q92ks\") pod \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") "
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.636869 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-tuning-conf-dir\") pod \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") "
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.636929 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-cni-sysctl-allowlist\") pod \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\" (UID: \"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc\") "
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.637670 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" (UID: "6dfc23fb-d9d6-432b-b3dd-7334451b2cfc"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.637967 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-ready" (OuterVolumeSpecName: "ready") pod "6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" (UID: "6dfc23fb-d9d6-432b-b3dd-7334451b2cfc"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.638027 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" (UID: "6dfc23fb-d9d6-432b-b3dd-7334451b2cfc"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.663902 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-kube-api-access-q92ks" (OuterVolumeSpecName: "kube-api-access-q92ks") pod "6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" (UID: "6dfc23fb-d9d6-432b-b3dd-7334451b2cfc"). InnerVolumeSpecName "kube-api-access-q92ks". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.738164 4917 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-ready\") on node \"crc\" DevicePath \"\""
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.738322 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q92ks\" (UniqueName: \"kubernetes.io/projected/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-kube-api-access-q92ks\") on node \"crc\" DevicePath \"\""
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.738337 4917 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-tuning-conf-dir\") on node \"crc\" DevicePath \"\""
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.738346 4917 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 18 06:49:29 crc kubenswrapper[4917]: I0318 06:49:29.903794 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 18 06:49:29 crc kubenswrapper[4917]: W0318 06:49:29.923980 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod32c6ee95_6cdf_467a_b482_9d2e99db55dd.slice/crio-a45d82f106c8612f5c31279258f0bea76d9dbd8c39152a5be8574a952556278d WatchSource:0}: Error finding container a45d82f106c8612f5c31279258f0bea76d9dbd8c39152a5be8574a952556278d: Status 404 returned error can't find the container with id a45d82f106c8612f5c31279258f0bea76d9dbd8c39152a5be8574a952556278d
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.087178 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"]
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.479809 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mhrt" event={"ID":"52df4f75-850c-4266-b94d-909e90669389","Type":"ContainerStarted","Data":"da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.483427 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-plhlj_6dfc23fb-d9d6-432b-b3dd-7334451b2cfc/kube-multus-additional-cni-plugins/0.log"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.483508 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj" event={"ID":"6dfc23fb-d9d6-432b-b3dd-7334451b2cfc","Type":"ContainerDied","Data":"af255ea577a070a5f34174a38e4d0a05fd92230bc6d5ad5d94f8a5ea265d0690"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.483539 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-plhlj"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.483559 4917 scope.go:117] "RemoveContainer" containerID="dea304b0550b8f447c705f95100a148ecd3541690715159654f8739020f92ea0"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.487479 4917 generic.go:334] "Generic (PLEG): container finished" podID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerID="903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4" exitCode=0
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.487558 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9rcc" event={"ID":"552a2644-267a-4c40-8ee5-f91bd933e7f2","Type":"ContainerDied","Data":"903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.491632 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql22d" event={"ID":"9cfbe58b-49c6-429c-a32f-f92719d730b8","Type":"ContainerStarted","Data":"cc627489a692761da22691894bcc4af8346534e7a274fc61badeda2089b93f5a"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.493129 4917 generic.go:334] "Generic (PLEG): container finished" podID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerID="0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca" exitCode=0
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.493168 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzdbq" event={"ID":"0b2760aa-6e2d-48ef-8ed6-a0225e59df24","Type":"ContainerDied","Data":"0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.497902 4917 generic.go:334] "Generic (PLEG): container finished" podID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerID="b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b" exitCode=0
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.497974 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmg2f" event={"ID":"528debb6-ed0b-4099-a21d-81d1da5ba9f6","Type":"ContainerDied","Data":"b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.499210 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" event={"ID":"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043","Type":"ContainerStarted","Data":"faa57d5677186a489f28d1ab5ab8f6d69f663ca3c16c332d4b7f9699544b5b44"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.500016 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.509007 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" event={"ID":"efcd507f-70af-446b-93c3-8ac9ff8f5123","Type":"ContainerStarted","Data":"bdca802062b0e6fe2b3528ef1f67dacfe8aa6ca54a337b95103a5048cd683687"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.509046 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" event={"ID":"efcd507f-70af-446b-93c3-8ac9ff8f5123","Type":"ContainerStarted","Data":"28a7324e9e2999fadd1c3e0544922b7905a36f1b42e1717d48d6a70983d34899"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.509383 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.511845 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.512261 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8w6h" event={"ID":"7a48f389-623d-42a9-ad95-0e07c27b2eed","Type":"ContainerStarted","Data":"9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.514335 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlftl" event={"ID":"edae9f71-365b-48dd-91fa-4ad4d56dcc62","Type":"ContainerStarted","Data":"ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.515841 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"32c6ee95-6cdf-467a-b482-9d2e99db55dd","Type":"ContainerStarted","Data":"ee963aab50b4d088d78b89aa50e98d745fdfca8acce54aa1d53f666b857ba914"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.515871 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"32c6ee95-6cdf-467a-b482-9d2e99db55dd","Type":"ContainerStarted","Data":"a45d82f106c8612f5c31279258f0bea76d9dbd8c39152a5be8574a952556278d"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.522570 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5svt4" event={"ID":"11ad5fdc-a94e-46bb-b6d8-703862926e33","Type":"ContainerStarted","Data":"075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544"}
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.549300 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" podStartSLOduration=17.549282884 podStartE2EDuration="17.549282884s" podCreationTimestamp="2026-03-18 06:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:30.534903828 +0000 UTC m=+155.476058552" watchObservedRunningTime="2026-03-18 06:49:30.549282884 +0000 UTC m=+155.490437598"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.551160 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-plhlj"]
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.556807 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-plhlj"]
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.710125 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" podStartSLOduration=17.710101135 podStartE2EDuration="17.710101135s" podCreationTimestamp="2026-03-18 06:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:30.688861863 +0000 UTC m=+155.630016577" watchObservedRunningTime="2026-03-18 06:49:30.710101135 +0000 UTC m=+155.651255849"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.710953 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.710948243 podStartE2EDuration="2.710948243s" podCreationTimestamp="2026-03-18 06:49:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:30.708360635 +0000 UTC m=+155.649515349" watchObservedRunningTime="2026-03-18 06:49:30.710948243 +0000 UTC m=+155.652102957"
Mar 18 06:49:30 crc kubenswrapper[4917]: I0318 06:49:30.919257 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.548083 4917 generic.go:334] "Generic (PLEG): container finished" podID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerID="075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544" exitCode=0
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.548125 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5svt4" event={"ID":"11ad5fdc-a94e-46bb-b6d8-703862926e33","Type":"ContainerDied","Data":"075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544"}
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.553036 4917 generic.go:334] "Generic (PLEG): container finished" podID="52df4f75-850c-4266-b94d-909e90669389" containerID="da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa" exitCode=0
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.553095 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mhrt" event={"ID":"52df4f75-850c-4266-b94d-909e90669389","Type":"ContainerDied","Data":"da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa"}
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.556695 4917 generic.go:334] "Generic (PLEG): container finished" podID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerID="9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4" exitCode=0
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.556777 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8w6h" event={"ID":"7a48f389-623d-42a9-ad95-0e07c27b2eed","Type":"ContainerDied","Data":"9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4"}
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.567761 4917 generic.go:334] "Generic (PLEG): container finished" podID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerID="cc627489a692761da22691894bcc4af8346534e7a274fc61badeda2089b93f5a" exitCode=0
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.567834 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql22d" event={"ID":"9cfbe58b-49c6-429c-a32f-f92719d730b8","Type":"ContainerDied","Data":"cc627489a692761da22691894bcc4af8346534e7a274fc61badeda2089b93f5a"}
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.570938 4917 generic.go:334] "Generic (PLEG): container finished" podID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerID="ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f" exitCode=0
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.571008 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlftl" event={"ID":"edae9f71-365b-48dd-91fa-4ad4d56dcc62","Type":"ContainerDied","Data":"ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f"}
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.574204 4917 generic.go:334] "Generic (PLEG): container finished" podID="32c6ee95-6cdf-467a-b482-9d2e99db55dd" containerID="ee963aab50b4d088d78b89aa50e98d745fdfca8acce54aa1d53f666b857ba914" exitCode=0
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.574653 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"32c6ee95-6cdf-467a-b482-9d2e99db55dd","Type":"ContainerDied","Data":"ee963aab50b4d088d78b89aa50e98d745fdfca8acce54aa1d53f666b857ba914"}
Mar 18 06:49:31 crc kubenswrapper[4917]: I0318 06:49:31.792783 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" path="/var/lib/kubelet/pods/6dfc23fb-d9d6-432b-b3dd-7334451b2cfc/volumes"
Mar 18 06:49:32 crc kubenswrapper[4917]: I0318 06:49:32.585355 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9rcc" event={"ID":"552a2644-267a-4c40-8ee5-f91bd933e7f2","Type":"ContainerStarted","Data":"7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd"}
Mar 18 06:49:32 crc kubenswrapper[4917]: I0318 06:49:32.613942 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z9rcc" podStartSLOduration=3.545199642 podStartE2EDuration="39.613925303s" podCreationTimestamp="2026-03-18 06:48:53 +0000 UTC" firstStartedPulling="2026-03-18 06:48:55.794848426 +0000 UTC m=+120.736003140" lastFinishedPulling="2026-03-18 06:49:31.863574087 +0000 UTC m=+156.804728801" observedRunningTime="2026-03-18 06:49:32.610773622 +0000 UTC m=+157.551928336" watchObservedRunningTime="2026-03-18 06:49:32.613925303 +0000 UTC m=+157.555080017"
Mar 18 06:49:32 crc kubenswrapper[4917]: I0318 06:49:32.864181 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.037948 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kube-api-access\") pod \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\" (UID: \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\") "
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.038008 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kubelet-dir\") pod \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\" (UID: \"32c6ee95-6cdf-467a-b482-9d2e99db55dd\") "
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.038195 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "32c6ee95-6cdf-467a-b482-9d2e99db55dd" (UID: "32c6ee95-6cdf-467a-b482-9d2e99db55dd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.038360 4917 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.052557 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "32c6ee95-6cdf-467a-b482-9d2e99db55dd" (UID: "32c6ee95-6cdf-467a-b482-9d2e99db55dd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.070216 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"]
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.139567 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/32c6ee95-6cdf-467a-b482-9d2e99db55dd-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.155373 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"]
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.594656 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzdbq" event={"ID":"0b2760aa-6e2d-48ef-8ed6-a0225e59df24","Type":"ContainerStarted","Data":"b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b"}
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.599297 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"32c6ee95-6cdf-467a-b482-9d2e99db55dd","Type":"ContainerDied","Data":"a45d82f106c8612f5c31279258f0bea76d9dbd8c39152a5be8574a952556278d"}
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.599343 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.599353 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a45d82f106c8612f5c31279258f0bea76d9dbd8c39152a5be8574a952556278d"
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.601573 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmg2f" event={"ID":"528debb6-ed0b-4099-a21d-81d1da5ba9f6","Type":"ContainerStarted","Data":"ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9"}
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.601742 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" podUID="d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" containerName="controller-manager" containerID="cri-o://faa57d5677186a489f28d1ab5ab8f6d69f663ca3c16c332d4b7f9699544b5b44" gracePeriod=30
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.601941 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" podUID="efcd507f-70af-446b-93c3-8ac9ff8f5123" containerName="route-controller-manager" containerID="cri-o://bdca802062b0e6fe2b3528ef1f67dacfe8aa6ca54a337b95103a5048cd683687" gracePeriod=30
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.632842 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mmg2f" podStartSLOduration=4.201710391 podStartE2EDuration="38.63282435s" podCreationTimestamp="2026-03-18 06:48:55 +0000 UTC" firstStartedPulling="2026-03-18 06:48:57.884301147 +0000 UTC m=+122.825455861" lastFinishedPulling="2026-03-18 06:49:32.315415106 +0000 UTC m=+157.256569820" observedRunningTime="2026-03-18 06:49:33.626573779 +0000 UTC m=+158.567728493" watchObservedRunningTime="2026-03-18 06:49:33.63282435 +0000 UTC m=+158.573979064"
Mar 18 06:49:33 crc kubenswrapper[4917]: I0318 06:49:33.643015 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.363661 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z9rcc"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.364514 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z9rcc"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.618231 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql22d" event={"ID":"9cfbe58b-49c6-429c-a32f-f92719d730b8","Type":"ContainerStarted","Data":"8af9e3616e3ecab67885fc7e64d05fde4e1f66046ff00caecde40a8a5125f745"}
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.623202 4917 generic.go:334] "Generic (PLEG): container finished" podID="d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" containerID="faa57d5677186a489f28d1ab5ab8f6d69f663ca3c16c332d4b7f9699544b5b44" exitCode=0
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.623236 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" event={"ID":"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043","Type":"ContainerDied","Data":"faa57d5677186a489f28d1ab5ab8f6d69f663ca3c16c332d4b7f9699544b5b44"}
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.627155 4917 generic.go:334] "Generic (PLEG): container finished" podID="efcd507f-70af-446b-93c3-8ac9ff8f5123" containerID="bdca802062b0e6fe2b3528ef1f67dacfe8aa6ca54a337b95103a5048cd683687" exitCode=0
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.627264 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" event={"ID":"efcd507f-70af-446b-93c3-8ac9ff8f5123","Type":"ContainerDied","Data":"bdca802062b0e6fe2b3528ef1f67dacfe8aa6ca54a337b95103a5048cd683687"}
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.654832 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mzdbq" podStartSLOduration=3.298250576 podStartE2EDuration="38.654817626s" podCreationTimestamp="2026-03-18 06:48:56 +0000 UTC" firstStartedPulling="2026-03-18 06:48:57.882905665 +0000 UTC m=+122.824060369" lastFinishedPulling="2026-03-18 06:49:33.239472705 +0000 UTC m=+158.180627419" observedRunningTime="2026-03-18 06:49:34.653489796 +0000 UTC m=+159.594644510" watchObservedRunningTime="2026-03-18 06:49:34.654817626 +0000 UTC m=+159.595972340"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.854887 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.858965 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.883544 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr"]
Mar 18 06:49:34 crc kubenswrapper[4917]: E0318 06:49:34.883809 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" containerName="controller-manager"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.883825 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" containerName="controller-manager"
Mar 18 06:49:34 crc kubenswrapper[4917]: E0318 06:49:34.883839 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerName="kube-multus-additional-cni-plugins"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.883846 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerName="kube-multus-additional-cni-plugins"
Mar 18 06:49:34 crc kubenswrapper[4917]: E0318 06:49:34.883854 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c6ee95-6cdf-467a-b482-9d2e99db55dd" containerName="pruner"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.883861 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c6ee95-6cdf-467a-b482-9d2e99db55dd" containerName="pruner"
Mar 18 06:49:34 crc kubenswrapper[4917]: E0318 06:49:34.883869 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efcd507f-70af-446b-93c3-8ac9ff8f5123" containerName="route-controller-manager"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.883876 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="efcd507f-70af-446b-93c3-8ac9ff8f5123" containerName="route-controller-manager"
Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.883984 4917 
memory_manager.go:354] "RemoveStaleState removing state" podUID="32c6ee95-6cdf-467a-b482-9d2e99db55dd" containerName="pruner" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.883995 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dfc23fb-d9d6-432b-b3dd-7334451b2cfc" containerName="kube-multus-additional-cni-plugins" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.884008 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="efcd507f-70af-446b-93c3-8ac9ff8f5123" containerName="route-controller-manager" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.884017 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" containerName="controller-manager" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.884425 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.929316 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr"] Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.958815 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-client-ca\") pod \"efcd507f-70af-446b-93c3-8ac9ff8f5123\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.958878 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-client-ca\") pod \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.958910 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-proxy-ca-bundles\") pod \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.958945 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-config\") pod \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.958961 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxdt7\" (UniqueName: \"kubernetes.io/projected/efcd507f-70af-446b-93c3-8ac9ff8f5123-kube-api-access-gxdt7\") pod \"efcd507f-70af-446b-93c3-8ac9ff8f5123\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.959035 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c6r6\" (UniqueName: \"kubernetes.io/projected/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-kube-api-access-7c6r6\") pod \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.959057 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efcd507f-70af-446b-93c3-8ac9ff8f5123-serving-cert\") pod \"efcd507f-70af-446b-93c3-8ac9ff8f5123\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.959660 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" (UID: 
"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.959798 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-config" (OuterVolumeSpecName: "config") pod "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" (UID: "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960091 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-client-ca" (OuterVolumeSpecName: "client-ca") pod "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" (UID: "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960348 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-serving-cert\") pod \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\" (UID: \"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960396 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-config\") pod \"efcd507f-70af-446b-93c3-8ac9ff8f5123\" (UID: \"efcd507f-70af-446b-93c3-8ac9ff8f5123\") " Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960534 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-serving-cert\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: 
\"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960560 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-config\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960623 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-client-ca\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960673 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlcqg\" (UniqueName: \"kubernetes.io/projected/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-kube-api-access-vlcqg\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960713 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960725 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.960735 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.961091 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-config" (OuterVolumeSpecName: "config") pod "efcd507f-70af-446b-93c3-8ac9ff8f5123" (UID: "efcd507f-70af-446b-93c3-8ac9ff8f5123"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.961138 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-client-ca" (OuterVolumeSpecName: "client-ca") pod "efcd507f-70af-446b-93c3-8ac9ff8f5123" (UID: "efcd507f-70af-446b-93c3-8ac9ff8f5123"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.965081 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" (UID: "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.966284 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efcd507f-70af-446b-93c3-8ac9ff8f5123-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "efcd507f-70af-446b-93c3-8ac9ff8f5123" (UID: "efcd507f-70af-446b-93c3-8ac9ff8f5123"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.966429 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efcd507f-70af-446b-93c3-8ac9ff8f5123-kube-api-access-gxdt7" (OuterVolumeSpecName: "kube-api-access-gxdt7") pod "efcd507f-70af-446b-93c3-8ac9ff8f5123" (UID: "efcd507f-70af-446b-93c3-8ac9ff8f5123"). InnerVolumeSpecName "kube-api-access-gxdt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:34 crc kubenswrapper[4917]: I0318 06:49:34.971691 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-kube-api-access-7c6r6" (OuterVolumeSpecName: "kube-api-access-7c6r6") pod "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" (UID: "d530b22f-8e6c-4d2f-88a6-f1ec66bf0043"). InnerVolumeSpecName "kube-api-access-7c6r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.061729 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-client-ca\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.061797 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlcqg\" (UniqueName: \"kubernetes.io/projected/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-kube-api-access-vlcqg\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.061829 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-serving-cert\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.061846 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-config\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.062514 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.062534 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efcd507f-70af-446b-93c3-8ac9ff8f5123-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.062543 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxdt7\" (UniqueName: \"kubernetes.io/projected/efcd507f-70af-446b-93c3-8ac9ff8f5123-kube-api-access-gxdt7\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.062554 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efcd507f-70af-446b-93c3-8ac9ff8f5123-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.062563 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c6r6\" (UniqueName: 
\"kubernetes.io/projected/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-kube-api-access-7c6r6\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.062574 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.062947 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-client-ca\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.063083 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-config\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.070132 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-serving-cert\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.081846 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlcqg\" (UniqueName: \"kubernetes.io/projected/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-kube-api-access-vlcqg\") pod \"route-controller-manager-749c8b7d59-4mxrr\" (UID: 
\"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.198790 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.654371 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" event={"ID":"efcd507f-70af-446b-93c3-8ac9ff8f5123","Type":"ContainerDied","Data":"28a7324e9e2999fadd1c3e0544922b7905a36f1b42e1717d48d6a70983d34899"} Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.654636 4917 scope.go:117] "RemoveContainer" containerID="bdca802062b0e6fe2b3528ef1f67dacfe8aa6ca54a337b95103a5048cd683687" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.654737 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.700576 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.700677 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-b77d7bd8-vnwhd" event={"ID":"d530b22f-8e6c-4d2f-88a6-f1ec66bf0043","Type":"ContainerDied","Data":"bb0e8fb4bb5aa7f5ca979050cec73a7b704b82e8dc1eae359335aecc2cb21c40"} Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.717282 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"] Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.723931 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566bd6cb4c-pw8rf"] Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.732807 4917 scope.go:117] "RemoveContainer" containerID="faa57d5677186a489f28d1ab5ab8f6d69f663ca3c16c332d4b7f9699544b5b44" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.754649 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ql22d" podStartSLOduration=4.611581456 podStartE2EDuration="38.754633363s" podCreationTimestamp="2026-03-18 06:48:57 +0000 UTC" firstStartedPulling="2026-03-18 06:48:59.974621539 +0000 UTC m=+124.915776253" lastFinishedPulling="2026-03-18 06:49:34.117673446 +0000 UTC m=+159.058828160" observedRunningTime="2026-03-18 06:49:35.751413861 +0000 UTC m=+160.692568595" watchObservedRunningTime="2026-03-18 06:49:35.754633363 +0000 UTC m=+160.695788077" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.762917 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"] Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.766667 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-b77d7bd8-vnwhd"] Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.781414 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d530b22f-8e6c-4d2f-88a6-f1ec66bf0043" path="/var/lib/kubelet/pods/d530b22f-8e6c-4d2f-88a6-f1ec66bf0043/volumes" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.782337 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efcd507f-70af-446b-93c3-8ac9ff8f5123" path="/var/lib/kubelet/pods/efcd507f-70af-446b-93c3-8ac9ff8f5123/volumes" Mar 18 06:49:35 crc kubenswrapper[4917]: I0318 06:49:35.857814 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr"] Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 06:49:36.011269 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9xdsf"] Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 06:49:36.280090 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-z9rcc" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="registry-server" probeResult="failure" output=< Mar 18 06:49:36 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 06:49:36 crc kubenswrapper[4917]: > Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 06:49:36.312266 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 06:49:36.312315 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 06:49:36.704694 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 
06:49:36.704726 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 06:49:36.707539 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlftl" event={"ID":"edae9f71-365b-48dd-91fa-4ad4d56dcc62","Type":"ContainerStarted","Data":"80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5"} Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 06:49:36.709151 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" event={"ID":"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc","Type":"ContainerStarted","Data":"2c0cdc0655fd6ad00f6e9a7886e7dcee87d4ce3e347dced1ab0614a1d9ff0277"} Mar 18 06:49:36 crc kubenswrapper[4917]: I0318 06:49:36.729393 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jlftl" podStartSLOduration=3.242543244 podStartE2EDuration="39.729377671s" podCreationTimestamp="2026-03-18 06:48:57 +0000 UTC" firstStartedPulling="2026-03-18 06:48:58.922350607 +0000 UTC m=+123.863505321" lastFinishedPulling="2026-03-18 06:49:35.409185034 +0000 UTC m=+160.350339748" observedRunningTime="2026-03-18 06:49:36.727218611 +0000 UTC m=+161.668373315" watchObservedRunningTime="2026-03-18 06:49:36.729377671 +0000 UTC m=+161.670532385" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.352397 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mmg2f" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="registry-server" probeResult="failure" output=< Mar 18 06:49:37 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 06:49:37 crc kubenswrapper[4917]: > Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.482132 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc"] Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.482915 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.485845 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.486063 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.486092 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.486246 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.486337 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.486436 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.494226 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.496073 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-proxy-ca-bundles\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " 
pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.496108 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-client-ca\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.496159 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f24987a-21ec-4686-8efd-4115003e0103-serving-cert\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.496192 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9h5r\" (UniqueName: \"kubernetes.io/projected/6f24987a-21ec-4686-8efd-4115003e0103-kube-api-access-k9h5r\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.496222 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-config\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.498966 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc"] Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.515977 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.516024 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.597284 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f24987a-21ec-4686-8efd-4115003e0103-serving-cert\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.597356 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9h5r\" (UniqueName: \"kubernetes.io/projected/6f24987a-21ec-4686-8efd-4115003e0103-kube-api-access-k9h5r\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.597387 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-config\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.597405 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-proxy-ca-bundles\") pod 
\"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.597425 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-client-ca\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.598366 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-client-ca\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.598826 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-proxy-ca-bundles\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.600559 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-config\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.612344 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6f24987a-21ec-4686-8efd-4115003e0103-serving-cert\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.619440 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9h5r\" (UniqueName: \"kubernetes.io/projected/6f24987a-21ec-4686-8efd-4115003e0103-kube-api-access-k9h5r\") pod \"controller-manager-7d78ffc85c-dvjkc\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.721784 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" event={"ID":"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc","Type":"ContainerStarted","Data":"06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687"} Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.722239 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.724379 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5svt4" event={"ID":"11ad5fdc-a94e-46bb-b6d8-703862926e33","Type":"ContainerStarted","Data":"8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7"} Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.727294 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mhrt" event={"ID":"52df4f75-850c-4266-b94d-909e90669389","Type":"ContainerStarted","Data":"dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c"} Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.729202 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-m8w6h" event={"ID":"7a48f389-623d-42a9-ad95-0e07c27b2eed","Type":"ContainerStarted","Data":"3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d"} Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.741466 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" podStartSLOduration=4.741444451 podStartE2EDuration="4.741444451s" podCreationTimestamp="2026-03-18 06:49:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:37.73962366 +0000 UTC m=+162.680778374" watchObservedRunningTime="2026-03-18 06:49:37.741444451 +0000 UTC m=+162.682599175" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.748868 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-mzdbq" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="registry-server" probeResult="failure" output=< Mar 18 06:49:37 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 06:49:37 crc kubenswrapper[4917]: > Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.757666 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6mhrt" podStartSLOduration=2.623129106 podStartE2EDuration="43.757649458s" podCreationTimestamp="2026-03-18 06:48:54 +0000 UTC" firstStartedPulling="2026-03-18 06:48:55.814279985 +0000 UTC m=+120.755434699" lastFinishedPulling="2026-03-18 06:49:36.948800327 +0000 UTC m=+161.889955051" observedRunningTime="2026-03-18 06:49:37.756350269 +0000 UTC m=+162.697504983" watchObservedRunningTime="2026-03-18 06:49:37.757649458 +0000 UTC m=+162.698804172" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.784116 4917 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/certified-operators-5svt4" podStartSLOduration=2.758193714 podStartE2EDuration="43.784099807s" podCreationTimestamp="2026-03-18 06:48:54 +0000 UTC" firstStartedPulling="2026-03-18 06:48:55.797470015 +0000 UTC m=+120.738624739" lastFinishedPulling="2026-03-18 06:49:36.823376118 +0000 UTC m=+161.764530832" observedRunningTime="2026-03-18 06:49:37.782854529 +0000 UTC m=+162.724009253" watchObservedRunningTime="2026-03-18 06:49:37.784099807 +0000 UTC m=+162.725254511" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.800032 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.829764 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.841063 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m8w6h" podStartSLOduration=2.712337778 podStartE2EDuration="43.841042597s" podCreationTimestamp="2026-03-18 06:48:54 +0000 UTC" firstStartedPulling="2026-03-18 06:48:55.804769571 +0000 UTC m=+120.745924285" lastFinishedPulling="2026-03-18 06:49:36.93347439 +0000 UTC m=+161.874629104" observedRunningTime="2026-03-18 06:49:37.815530628 +0000 UTC m=+162.756685362" watchObservedRunningTime="2026-03-18 06:49:37.841042597 +0000 UTC m=+162.782197311" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.842598 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.843178 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.852254 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.859077 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.873830 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.918638 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:49:37 crc kubenswrapper[4917]: I0318 06:49:37.918680 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.002322 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-var-lock\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.003366 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f35e52c9-1886-464a-99ac-95e9803226db-kube-api-access\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.003454 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.067485 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc"] Mar 18 06:49:38 crc kubenswrapper[4917]: W0318 06:49:38.075699 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f24987a_21ec_4686_8efd_4115003e0103.slice/crio-f0c13597fa4edb97efc451ce0fe7362cd82c363a6928335f38f47d47a5a00e62 WatchSource:0}: Error finding container f0c13597fa4edb97efc451ce0fe7362cd82c363a6928335f38f47d47a5a00e62: Status 404 returned error can't find the container with id f0c13597fa4edb97efc451ce0fe7362cd82c363a6928335f38f47d47a5a00e62 Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.106205 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f35e52c9-1886-464a-99ac-95e9803226db-kube-api-access\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.106255 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.106301 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-var-lock\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.106391 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-var-lock\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.106403 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.128827 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f35e52c9-1886-464a-99ac-95e9803226db-kube-api-access\") pod \"installer-9-crc\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.162799 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.370755 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.560244 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jlftl" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="registry-server" probeResult="failure" output=< Mar 18 06:49:38 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 06:49:38 crc kubenswrapper[4917]: > Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.736251 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" event={"ID":"6f24987a-21ec-4686-8efd-4115003e0103","Type":"ContainerStarted","Data":"3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871"} Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.736301 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" event={"ID":"6f24987a-21ec-4686-8efd-4115003e0103","Type":"ContainerStarted","Data":"f0c13597fa4edb97efc451ce0fe7362cd82c363a6928335f38f47d47a5a00e62"} Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.736738 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.738862 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f35e52c9-1886-464a-99ac-95e9803226db","Type":"ContainerStarted","Data":"2dc38bfd7adefec982717bd1254001f68dc88008f4e1a31ff4e2570493f79200"} Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.738886 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f35e52c9-1886-464a-99ac-95e9803226db","Type":"ContainerStarted","Data":"00238203e6f710df2e88a3716d7979338e0e78a69980e32a54c2ceb233a4b275"} Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.742341 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.750776 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" podStartSLOduration=5.75075732 podStartE2EDuration="5.75075732s" podCreationTimestamp="2026-03-18 06:49:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:38.748676614 +0000 UTC m=+163.689831328" watchObservedRunningTime="2026-03-18 06:49:38.75075732 +0000 UTC m=+163.691912034" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.791630 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.791617246 podStartE2EDuration="1.791617246s" podCreationTimestamp="2026-03-18 06:49:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:38.789759923 +0000 UTC m=+163.730914637" watchObservedRunningTime="2026-03-18 06:49:38.791617246 +0000 UTC m=+163.732771960" Mar 18 06:49:38 crc kubenswrapper[4917]: I0318 06:49:38.956701 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ql22d" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="registry-server" probeResult="failure" output=< Mar 18 06:49:38 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 06:49:38 crc kubenswrapper[4917]: > Mar 18 06:49:44 
crc kubenswrapper[4917]: I0318 06:49:44.426920 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.471194 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.498942 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.499012 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.588294 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.725897 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.726027 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.780993 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.817343 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.913070 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.913180 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:49:44 crc kubenswrapper[4917]: I0318 06:49:44.955778 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:49:45 crc kubenswrapper[4917]: I0318 06:49:45.838079 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:49:45 crc kubenswrapper[4917]: I0318 06:49:45.840902 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:49:46 crc kubenswrapper[4917]: I0318 06:49:46.386478 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:49:46 crc kubenswrapper[4917]: I0318 06:49:46.448554 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:49:46 crc kubenswrapper[4917]: I0318 06:49:46.546388 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8w6h"] Mar 18 06:49:46 crc kubenswrapper[4917]: I0318 06:49:46.771984 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:49:46 crc kubenswrapper[4917]: I0318 06:49:46.831872 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:49:47 crc kubenswrapper[4917]: I0318 06:49:47.147391 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5svt4"] Mar 18 06:49:47 crc kubenswrapper[4917]: I0318 06:49:47.565044 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:49:47 crc kubenswrapper[4917]: I0318 
06:49:47.615152 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:49:47 crc kubenswrapper[4917]: I0318 06:49:47.794026 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5svt4" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerName="registry-server" containerID="cri-o://8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7" gracePeriod=2 Mar 18 06:49:47 crc kubenswrapper[4917]: I0318 06:49:47.796848 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m8w6h" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerName="registry-server" containerID="cri-o://3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d" gracePeriod=2 Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.251262 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.319270 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.715139 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.756300 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-catalog-content\") pod \"7a48f389-623d-42a9-ad95-0e07c27b2eed\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.768181 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.799557 4917 generic.go:334] "Generic (PLEG): container finished" podID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerID="3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d" exitCode=0 Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.799635 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8w6h" event={"ID":"7a48f389-623d-42a9-ad95-0e07c27b2eed","Type":"ContainerDied","Data":"3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d"} Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.799727 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m8w6h" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.799751 4917 scope.go:117] "RemoveContainer" containerID="3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.799729 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m8w6h" event={"ID":"7a48f389-623d-42a9-ad95-0e07c27b2eed","Type":"ContainerDied","Data":"27440f3578e16fac87b0153c3fe9596d9f90ab7a6bb2f6a272d0942a3c7b0b92"} Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.801510 4917 generic.go:334] "Generic (PLEG): container finished" podID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerID="8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7" exitCode=0 Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.801647 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5svt4" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.801745 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5svt4" event={"ID":"11ad5fdc-a94e-46bb-b6d8-703862926e33","Type":"ContainerDied","Data":"8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7"} Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.801818 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5svt4" event={"ID":"11ad5fdc-a94e-46bb-b6d8-703862926e33","Type":"ContainerDied","Data":"a2fbfeef147774529ccee50e65bb3cee5c22bbb40f64bc273503835d265b2af6"} Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.821399 4917 scope.go:117] "RemoveContainer" containerID="9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.831170 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a48f389-623d-42a9-ad95-0e07c27b2eed" (UID: "7a48f389-623d-42a9-ad95-0e07c27b2eed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.841244 4917 scope.go:117] "RemoveContainer" containerID="59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.857517 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-utilities\") pod \"7a48f389-623d-42a9-ad95-0e07c27b2eed\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.857565 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6jsk\" (UniqueName: \"kubernetes.io/projected/7a48f389-623d-42a9-ad95-0e07c27b2eed-kube-api-access-v6jsk\") pod \"7a48f389-623d-42a9-ad95-0e07c27b2eed\" (UID: \"7a48f389-623d-42a9-ad95-0e07c27b2eed\") " Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.858099 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.859246 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-utilities" (OuterVolumeSpecName: "utilities") pod "7a48f389-623d-42a9-ad95-0e07c27b2eed" (UID: "7a48f389-623d-42a9-ad95-0e07c27b2eed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.860093 4917 scope.go:117] "RemoveContainer" containerID="3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d" Mar 18 06:49:48 crc kubenswrapper[4917]: E0318 06:49:48.860507 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d\": container with ID starting with 3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d not found: ID does not exist" containerID="3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.860545 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d"} err="failed to get container status \"3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d\": rpc error: code = NotFound desc = could not find container \"3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d\": container with ID starting with 3e8d57d3c57159a75c28dd0efabe02776eef5e657467854b72c731cdd4c33e8d not found: ID does not exist" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.860568 4917 scope.go:117] "RemoveContainer" containerID="9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4" Mar 18 06:49:48 crc kubenswrapper[4917]: E0318 06:49:48.861066 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4\": container with ID starting with 9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4 not found: ID does not exist" containerID="9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.861100 
4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4"} err="failed to get container status \"9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4\": rpc error: code = NotFound desc = could not find container \"9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4\": container with ID starting with 9995822916c3969595203914d907cdc32c025a8a8ced96df7427efe3587f07e4 not found: ID does not exist" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.861122 4917 scope.go:117] "RemoveContainer" containerID="59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434" Mar 18 06:49:48 crc kubenswrapper[4917]: E0318 06:49:48.861560 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434\": container with ID starting with 59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434 not found: ID does not exist" containerID="59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.861684 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434"} err="failed to get container status \"59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434\": rpc error: code = NotFound desc = could not find container \"59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434\": container with ID starting with 59f301f3d9ea35126d29d9134529c17434689f0de0200898c5611e68b477f434 not found: ID does not exist" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.861715 4917 scope.go:117] "RemoveContainer" containerID="8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 
06:49:48.872463 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a48f389-623d-42a9-ad95-0e07c27b2eed-kube-api-access-v6jsk" (OuterVolumeSpecName: "kube-api-access-v6jsk") pod "7a48f389-623d-42a9-ad95-0e07c27b2eed" (UID: "7a48f389-623d-42a9-ad95-0e07c27b2eed"). InnerVolumeSpecName "kube-api-access-v6jsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.884173 4917 scope.go:117] "RemoveContainer" containerID="075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.897119 4917 scope.go:117] "RemoveContainer" containerID="2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.911314 4917 scope.go:117] "RemoveContainer" containerID="8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7" Mar 18 06:49:48 crc kubenswrapper[4917]: E0318 06:49:48.911779 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7\": container with ID starting with 8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7 not found: ID does not exist" containerID="8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.911807 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7"} err="failed to get container status \"8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7\": rpc error: code = NotFound desc = could not find container \"8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7\": container with ID starting with 8a0b0a82799e6f47b05b7ec41c2dda678009d52dd31b7a8973dd76e5a289d6b7 not found: ID does 
not exist" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.911828 4917 scope.go:117] "RemoveContainer" containerID="075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544" Mar 18 06:49:48 crc kubenswrapper[4917]: E0318 06:49:48.912008 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544\": container with ID starting with 075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544 not found: ID does not exist" containerID="075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.912036 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544"} err="failed to get container status \"075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544\": rpc error: code = NotFound desc = could not find container \"075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544\": container with ID starting with 075e1b8f0a210e929b203ae26191efd2b1ca57d96873c75ac1aa752e6e75d544 not found: ID does not exist" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.912048 4917 scope.go:117] "RemoveContainer" containerID="2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0" Mar 18 06:49:48 crc kubenswrapper[4917]: E0318 06:49:48.912213 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0\": container with ID starting with 2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0 not found: ID does not exist" containerID="2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.912232 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0"} err="failed to get container status \"2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0\": rpc error: code = NotFound desc = could not find container \"2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0\": container with ID starting with 2427f85eb83813373af1bfa6b3937590c6f7652d2652998b7feb1e6340b1e2a0 not found: ID does not exist" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.946906 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzdbq"] Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.947155 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mzdbq" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="registry-server" containerID="cri-o://b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b" gracePeriod=2 Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.958878 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqv5k\" (UniqueName: \"kubernetes.io/projected/11ad5fdc-a94e-46bb-b6d8-703862926e33-kube-api-access-wqv5k\") pod \"11ad5fdc-a94e-46bb-b6d8-703862926e33\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.960002 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-catalog-content\") pod \"11ad5fdc-a94e-46bb-b6d8-703862926e33\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.960060 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-utilities\") pod \"11ad5fdc-a94e-46bb-b6d8-703862926e33\" (UID: \"11ad5fdc-a94e-46bb-b6d8-703862926e33\") " Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.960287 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a48f389-623d-42a9-ad95-0e07c27b2eed-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.960311 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6jsk\" (UniqueName: \"kubernetes.io/projected/7a48f389-623d-42a9-ad95-0e07c27b2eed-kube-api-access-v6jsk\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.960957 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-utilities" (OuterVolumeSpecName: "utilities") pod "11ad5fdc-a94e-46bb-b6d8-703862926e33" (UID: "11ad5fdc-a94e-46bb-b6d8-703862926e33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:49:48 crc kubenswrapper[4917]: I0318 06:49:48.964065 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ad5fdc-a94e-46bb-b6d8-703862926e33-kube-api-access-wqv5k" (OuterVolumeSpecName: "kube-api-access-wqv5k") pod "11ad5fdc-a94e-46bb-b6d8-703862926e33" (UID: "11ad5fdc-a94e-46bb-b6d8-703862926e33"). InnerVolumeSpecName "kube-api-access-wqv5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.014269 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11ad5fdc-a94e-46bb-b6d8-703862926e33" (UID: "11ad5fdc-a94e-46bb-b6d8-703862926e33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.061822 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.061890 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11ad5fdc-a94e-46bb-b6d8-703862926e33-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.061923 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqv5k\" (UniqueName: \"kubernetes.io/projected/11ad5fdc-a94e-46bb-b6d8-703862926e33-kube-api-access-wqv5k\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.155758 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5svt4"] Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.174955 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5svt4"] Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.181799 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m8w6h"] Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.188418 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m8w6h"] Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.462438 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.570768 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gts6j\" (UniqueName: \"kubernetes.io/projected/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-kube-api-access-gts6j\") pod \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.570856 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-utilities\") pod \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.570900 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-catalog-content\") pod \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\" (UID: \"0b2760aa-6e2d-48ef-8ed6-a0225e59df24\") " Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.572063 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-utilities" (OuterVolumeSpecName: "utilities") pod "0b2760aa-6e2d-48ef-8ed6-a0225e59df24" (UID: "0b2760aa-6e2d-48ef-8ed6-a0225e59df24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.574345 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-kube-api-access-gts6j" (OuterVolumeSpecName: "kube-api-access-gts6j") pod "0b2760aa-6e2d-48ef-8ed6-a0225e59df24" (UID: "0b2760aa-6e2d-48ef-8ed6-a0225e59df24"). InnerVolumeSpecName "kube-api-access-gts6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.615335 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b2760aa-6e2d-48ef-8ed6-a0225e59df24" (UID: "0b2760aa-6e2d-48ef-8ed6-a0225e59df24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.672276 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gts6j\" (UniqueName: \"kubernetes.io/projected/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-kube-api-access-gts6j\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.672326 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.672342 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b2760aa-6e2d-48ef-8ed6-a0225e59df24-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.787855 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" path="/var/lib/kubelet/pods/11ad5fdc-a94e-46bb-b6d8-703862926e33/volumes" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.789194 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" path="/var/lib/kubelet/pods/7a48f389-623d-42a9-ad95-0e07c27b2eed/volumes" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.816813 4917 generic.go:334] "Generic (PLEG): container finished" podID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" 
containerID="b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b" exitCode=0 Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.816985 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mzdbq" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.817009 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzdbq" event={"ID":"0b2760aa-6e2d-48ef-8ed6-a0225e59df24","Type":"ContainerDied","Data":"b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b"} Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.817532 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mzdbq" event={"ID":"0b2760aa-6e2d-48ef-8ed6-a0225e59df24","Type":"ContainerDied","Data":"c38875a1a5a0df38cdad652ae5317afb6e50c2a952b8201c7c9778bb97e65295"} Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.817565 4917 scope.go:117] "RemoveContainer" containerID="b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.848305 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzdbq"] Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.852125 4917 scope.go:117] "RemoveContainer" containerID="0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.853658 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mzdbq"] Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.876316 4917 scope.go:117] "RemoveContainer" containerID="4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.899551 4917 scope.go:117] "RemoveContainer" containerID="b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b" Mar 18 
06:49:49 crc kubenswrapper[4917]: E0318 06:49:49.900348 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b\": container with ID starting with b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b not found: ID does not exist" containerID="b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.900405 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b"} err="failed to get container status \"b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b\": rpc error: code = NotFound desc = could not find container \"b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b\": container with ID starting with b6dabaa6138abc7e992d7d7eeb6744dcfce59f9096b67cf2e955b0e44f0c204b not found: ID does not exist" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.900442 4917 scope.go:117] "RemoveContainer" containerID="0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca" Mar 18 06:49:49 crc kubenswrapper[4917]: E0318 06:49:49.901141 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca\": container with ID starting with 0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca not found: ID does not exist" containerID="0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.901206 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca"} err="failed to get container status 
\"0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca\": rpc error: code = NotFound desc = could not find container \"0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca\": container with ID starting with 0096793528c9fda94cb04f4510054172b81b8daf2d60b66473a33d4edb0c60ca not found: ID does not exist" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.901253 4917 scope.go:117] "RemoveContainer" containerID="4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c" Mar 18 06:49:49 crc kubenswrapper[4917]: E0318 06:49:49.901874 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c\": container with ID starting with 4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c not found: ID does not exist" containerID="4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c" Mar 18 06:49:49 crc kubenswrapper[4917]: I0318 06:49:49.901926 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c"} err="failed to get container status \"4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c\": rpc error: code = NotFound desc = could not find container \"4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c\": container with ID starting with 4a1ef2a8ff633c8a9e79d7fafcb870a2dc9588d77c80f0dac55bcac7bb90404c not found: ID does not exist" Mar 18 06:49:51 crc kubenswrapper[4917]: I0318 06:49:51.552148 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ql22d"] Mar 18 06:49:51 crc kubenswrapper[4917]: I0318 06:49:51.552713 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ql22d" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="registry-server" 
containerID="cri-o://8af9e3616e3ecab67885fc7e64d05fde4e1f66046ff00caecde40a8a5125f745" gracePeriod=2 Mar 18 06:49:51 crc kubenswrapper[4917]: I0318 06:49:51.784640 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" path="/var/lib/kubelet/pods/0b2760aa-6e2d-48ef-8ed6-a0225e59df24/volumes" Mar 18 06:49:51 crc kubenswrapper[4917]: I0318 06:49:51.842010 4917 generic.go:334] "Generic (PLEG): container finished" podID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerID="8af9e3616e3ecab67885fc7e64d05fde4e1f66046ff00caecde40a8a5125f745" exitCode=0 Mar 18 06:49:51 crc kubenswrapper[4917]: I0318 06:49:51.842149 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql22d" event={"ID":"9cfbe58b-49c6-429c-a32f-f92719d730b8","Type":"ContainerDied","Data":"8af9e3616e3ecab67885fc7e64d05fde4e1f66046ff00caecde40a8a5125f745"} Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.189999 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.205559 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8r2l\" (UniqueName: \"kubernetes.io/projected/9cfbe58b-49c6-429c-a32f-f92719d730b8-kube-api-access-p8r2l\") pod \"9cfbe58b-49c6-429c-a32f-f92719d730b8\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.205664 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-catalog-content\") pod \"9cfbe58b-49c6-429c-a32f-f92719d730b8\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.205719 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-utilities\") pod \"9cfbe58b-49c6-429c-a32f-f92719d730b8\" (UID: \"9cfbe58b-49c6-429c-a32f-f92719d730b8\") " Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.206545 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-utilities" (OuterVolumeSpecName: "utilities") pod "9cfbe58b-49c6-429c-a32f-f92719d730b8" (UID: "9cfbe58b-49c6-429c-a32f-f92719d730b8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.222805 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cfbe58b-49c6-429c-a32f-f92719d730b8-kube-api-access-p8r2l" (OuterVolumeSpecName: "kube-api-access-p8r2l") pod "9cfbe58b-49c6-429c-a32f-f92719d730b8" (UID: "9cfbe58b-49c6-429c-a32f-f92719d730b8"). InnerVolumeSpecName "kube-api-access-p8r2l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.307636 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.307712 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8r2l\" (UniqueName: \"kubernetes.io/projected/9cfbe58b-49c6-429c-a32f-f92719d730b8-kube-api-access-p8r2l\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.398200 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cfbe58b-49c6-429c-a32f-f92719d730b8" (UID: "9cfbe58b-49c6-429c-a32f-f92719d730b8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.409406 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cfbe58b-49c6-429c-a32f-f92719d730b8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.861007 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ql22d" event={"ID":"9cfbe58b-49c6-429c-a32f-f92719d730b8","Type":"ContainerDied","Data":"38a5c2b22340ffd9f2fd365d586b385ab7480c108ed7b91d755a1e95f7ddbc87"} Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.861263 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ql22d" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.863289 4917 scope.go:117] "RemoveContainer" containerID="8af9e3616e3ecab67885fc7e64d05fde4e1f66046ff00caecde40a8a5125f745" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.893319 4917 scope.go:117] "RemoveContainer" containerID="cc627489a692761da22691894bcc4af8346534e7a274fc61badeda2089b93f5a" Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.914356 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ql22d"] Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.920565 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ql22d"] Mar 18 06:49:52 crc kubenswrapper[4917]: I0318 06:49:52.953865 4917 scope.go:117] "RemoveContainer" containerID="9735fb4ad291fe6ea14c9acc3392fd84e37a7d3fa85356ea99e4b102776705f8" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.071253 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc"] Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.071742 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" podUID="6f24987a-21ec-4686-8efd-4115003e0103" containerName="controller-manager" containerID="cri-o://3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871" gracePeriod=30 Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.096464 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr"] Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.096785 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" 
podUID="4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" containerName="route-controller-manager" containerID="cri-o://06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687" gracePeriod=30 Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.554441 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.559429 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623392 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-serving-cert\") pod \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623490 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f24987a-21ec-4686-8efd-4115003e0103-serving-cert\") pod \"6f24987a-21ec-4686-8efd-4115003e0103\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623535 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-client-ca\") pod \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623704 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-config\") pod \"6f24987a-21ec-4686-8efd-4115003e0103\" (UID: 
\"6f24987a-21ec-4686-8efd-4115003e0103\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623771 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlcqg\" (UniqueName: \"kubernetes.io/projected/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-kube-api-access-vlcqg\") pod \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623816 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-config\") pod \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\" (UID: \"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623877 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-proxy-ca-bundles\") pod \"6f24987a-21ec-4686-8efd-4115003e0103\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623910 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-client-ca\") pod \"6f24987a-21ec-4686-8efd-4115003e0103\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.623959 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9h5r\" (UniqueName: \"kubernetes.io/projected/6f24987a-21ec-4686-8efd-4115003e0103-kube-api-access-k9h5r\") pod \"6f24987a-21ec-4686-8efd-4115003e0103\" (UID: \"6f24987a-21ec-4686-8efd-4115003e0103\") " Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.624412 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" (UID: "4a3b8cb0-1cb3-4547-8910-268a6c7d31fc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.624903 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-config" (OuterVolumeSpecName: "config") pod "4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" (UID: "4a3b8cb0-1cb3-4547-8910-268a6c7d31fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.625434 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6f24987a-21ec-4686-8efd-4115003e0103" (UID: "6f24987a-21ec-4686-8efd-4115003e0103"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.625461 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-client-ca" (OuterVolumeSpecName: "client-ca") pod "6f24987a-21ec-4686-8efd-4115003e0103" (UID: "6f24987a-21ec-4686-8efd-4115003e0103"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.625715 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-config" (OuterVolumeSpecName: "config") pod "6f24987a-21ec-4686-8efd-4115003e0103" (UID: "6f24987a-21ec-4686-8efd-4115003e0103"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.628698 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f24987a-21ec-4686-8efd-4115003e0103-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6f24987a-21ec-4686-8efd-4115003e0103" (UID: "6f24987a-21ec-4686-8efd-4115003e0103"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.628764 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f24987a-21ec-4686-8efd-4115003e0103-kube-api-access-k9h5r" (OuterVolumeSpecName: "kube-api-access-k9h5r") pod "6f24987a-21ec-4686-8efd-4115003e0103" (UID: "6f24987a-21ec-4686-8efd-4115003e0103"). InnerVolumeSpecName "kube-api-access-k9h5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.628782 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" (UID: "4a3b8cb0-1cb3-4547-8910-268a6c7d31fc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.628839 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-kube-api-access-vlcqg" (OuterVolumeSpecName: "kube-api-access-vlcqg") pod "4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" (UID: "4a3b8cb0-1cb3-4547-8910-268a6c7d31fc"). InnerVolumeSpecName "kube-api-access-vlcqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725299 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725344 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f24987a-21ec-4686-8efd-4115003e0103-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725356 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725367 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725381 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlcqg\" (UniqueName: \"kubernetes.io/projected/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-kube-api-access-vlcqg\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725394 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725405 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725415 4917 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6f24987a-21ec-4686-8efd-4115003e0103-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.725426 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9h5r\" (UniqueName: \"kubernetes.io/projected/6f24987a-21ec-4686-8efd-4115003e0103-kube-api-access-k9h5r\") on node \"crc\" DevicePath \"\"" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.787131 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" path="/var/lib/kubelet/pods/9cfbe58b-49c6-429c-a32f-f92719d730b8/volumes" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.870379 4917 generic.go:334] "Generic (PLEG): container finished" podID="6f24987a-21ec-4686-8efd-4115003e0103" containerID="3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871" exitCode=0 Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.870521 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" event={"ID":"6f24987a-21ec-4686-8efd-4115003e0103","Type":"ContainerDied","Data":"3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871"} Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.870648 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" event={"ID":"6f24987a-21ec-4686-8efd-4115003e0103","Type":"ContainerDied","Data":"f0c13597fa4edb97efc451ce0fe7362cd82c363a6928335f38f47d47a5a00e62"} Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.870704 4917 scope.go:117] "RemoveContainer" containerID="3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.870920 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.875867 4917 generic.go:334] "Generic (PLEG): container finished" podID="4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" containerID="06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687" exitCode=0 Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.875954 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" event={"ID":"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc","Type":"ContainerDied","Data":"06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687"} Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.876020 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" event={"ID":"4a3b8cb0-1cb3-4547-8910-268a6c7d31fc","Type":"ContainerDied","Data":"2c0cdc0655fd6ad00f6e9a7886e7dcee87d4ce3e347dced1ab0614a1d9ff0277"} Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.876312 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.901004 4917 scope.go:117] "RemoveContainer" containerID="3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.902846 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc"] Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.907742 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d78ffc85c-dvjkc"] Mar 18 06:49:53 crc kubenswrapper[4917]: E0318 06:49:53.907916 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871\": container with ID starting with 3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871 not found: ID does not exist" containerID="3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.907963 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871"} err="failed to get container status \"3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871\": rpc error: code = NotFound desc = could not find container \"3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871\": container with ID starting with 3c3eed9cc68af36b2d79e8b41306862c2e1cacc2a424ebaffd212a051fe5b871 not found: ID does not exist" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.908001 4917 scope.go:117] "RemoveContainer" containerID="06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.915236 4917 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr"] Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.920110 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-749c8b7d59-4mxrr"] Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.944664 4917 scope.go:117] "RemoveContainer" containerID="06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687" Mar 18 06:49:53 crc kubenswrapper[4917]: E0318 06:49:53.945152 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687\": container with ID starting with 06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687 not found: ID does not exist" containerID="06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687" Mar 18 06:49:53 crc kubenswrapper[4917]: I0318 06:49:53.945205 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687"} err="failed to get container status \"06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687\": rpc error: code = NotFound desc = could not find container \"06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687\": container with ID starting with 06ad48b194b616f335143b4402667a61b176cdbe9ad232e8f1342c2da6f5c687 not found: ID does not exist" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.503733 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c"] Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504085 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerName="extract-utilities" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 
06:49:54.504108 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerName="extract-utilities" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504133 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f24987a-21ec-4686-8efd-4115003e0103" containerName="controller-manager" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504146 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f24987a-21ec-4686-8efd-4115003e0103" containerName="controller-manager" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504171 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="extract-utilities" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504184 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="extract-utilities" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504203 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="extract-content" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504215 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="extract-content" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504232 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504244 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504259 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerName="extract-utilities" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 
06:49:54.504273 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerName="extract-utilities" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504288 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" containerName="route-controller-manager" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504299 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" containerName="route-controller-manager" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504318 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504329 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504348 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerName="extract-content" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504361 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerName="extract-content" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504379 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504390 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504410 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: 
I0318 06:49:54.504422 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504439 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="extract-utilities" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504452 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="extract-utilities" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504471 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="extract-content" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504482 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="extract-content" Mar 18 06:49:54 crc kubenswrapper[4917]: E0318 06:49:54.504498 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerName="extract-content" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504509 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerName="extract-content" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504712 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" containerName="route-controller-manager" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504731 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b2760aa-6e2d-48ef-8ed6-a0225e59df24" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504752 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cfbe58b-49c6-429c-a32f-f92719d730b8" containerName="registry-server" Mar 18 06:49:54 crc 
kubenswrapper[4917]: I0318 06:49:54.504774 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a48f389-623d-42a9-ad95-0e07c27b2eed" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504789 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f24987a-21ec-4686-8efd-4115003e0103" containerName="controller-manager" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.504809 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ad5fdc-a94e-46bb-b6d8-703862926e33" containerName="registry-server" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.505413 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.507040 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-667b974577-z5sxz"] Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.507897 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.509881 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.510968 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.511952 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.513485 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.513524 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.513702 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.514162 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.515945 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.516701 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.518219 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 06:49:54 
crc kubenswrapper[4917]: I0318 06:49:54.518224 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.518545 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.526365 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-667b974577-z5sxz"] Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.535401 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-serving-cert\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.535529 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-config\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.535624 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-client-ca\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.535701 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-config\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.535754 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-client-ca\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.535894 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jxsj\" (UniqueName: \"kubernetes.io/projected/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-kube-api-access-2jxsj\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.535996 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf62f\" (UniqueName: \"kubernetes.io/projected/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-kube-api-access-pf62f\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.536055 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-serving-cert\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: 
\"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.536104 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-proxy-ca-bundles\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.538825 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c"] Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.540249 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637341 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-serving-cert\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637433 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-config\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637473 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-client-ca\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637519 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-config\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637552 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-client-ca\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637631 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jxsj\" (UniqueName: \"kubernetes.io/projected/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-kube-api-access-2jxsj\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637704 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf62f\" (UniqueName: \"kubernetes.io/projected/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-kube-api-access-pf62f\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 
06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637734 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-serving-cert\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.637769 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-proxy-ca-bundles\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.639868 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-proxy-ca-bundles\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.639917 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-client-ca\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.640865 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-config\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: 
\"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.641479 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-config\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.643358 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-client-ca\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.643898 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-serving-cert\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.645120 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-serving-cert\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.662081 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jxsj\" (UniqueName: 
\"kubernetes.io/projected/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-kube-api-access-2jxsj\") pod \"route-controller-manager-57df96565f-gm68c\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.669983 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf62f\" (UniqueName: \"kubernetes.io/projected/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-kube-api-access-pf62f\") pod \"controller-manager-667b974577-z5sxz\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.842904 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:54 crc kubenswrapper[4917]: I0318 06:49:54.857014 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.119396 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c"] Mar 18 06:49:55 crc kubenswrapper[4917]: W0318 06:49:55.128637 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd6bfa7b_7ba6_4f76_8fd4_512ffab413f9.slice/crio-15be528f720851e6d4bf5685ccc15f569b95523e55ffe4616296db8184d245d7 WatchSource:0}: Error finding container 15be528f720851e6d4bf5685ccc15f569b95523e55ffe4616296db8184d245d7: Status 404 returned error can't find the container with id 15be528f720851e6d4bf5685ccc15f569b95523e55ffe4616296db8184d245d7 Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.397501 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-667b974577-z5sxz"] Mar 18 06:49:55 crc kubenswrapper[4917]: W0318 06:49:55.407653 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac3a4fb0_3fa1_4ba7_aff1_7ca507d008e9.slice/crio-582779d97a141142a85654a46248ef39bf9ef71a1561f4199d7b0d44441bf13e WatchSource:0}: Error finding container 582779d97a141142a85654a46248ef39bf9ef71a1561f4199d7b0d44441bf13e: Status 404 returned error can't find the container with id 582779d97a141142a85654a46248ef39bf9ef71a1561f4199d7b0d44441bf13e Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.780435 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a3b8cb0-1cb3-4547-8910-268a6c7d31fc" path="/var/lib/kubelet/pods/4a3b8cb0-1cb3-4547-8910-268a6c7d31fc/volumes" Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.781051 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f24987a-21ec-4686-8efd-4115003e0103" 
path="/var/lib/kubelet/pods/6f24987a-21ec-4686-8efd-4115003e0103/volumes" Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.890252 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" event={"ID":"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9","Type":"ContainerStarted","Data":"8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07"} Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.890301 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" event={"ID":"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9","Type":"ContainerStarted","Data":"582779d97a141142a85654a46248ef39bf9ef71a1561f4199d7b0d44441bf13e"} Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.890547 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.891449 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" event={"ID":"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9","Type":"ContainerStarted","Data":"fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715"} Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.891487 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" event={"ID":"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9","Type":"ContainerStarted","Data":"15be528f720851e6d4bf5685ccc15f569b95523e55ffe4616296db8184d245d7"} Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.891750 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.895388 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.897119 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:49:55 crc kubenswrapper[4917]: I0318 06:49:55.912042 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" podStartSLOduration=2.912030251 podStartE2EDuration="2.912030251s" podCreationTimestamp="2026-03-18 06:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:55.908928251 +0000 UTC m=+180.850082975" watchObservedRunningTime="2026-03-18 06:49:55.912030251 +0000 UTC m=+180.853184965" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.137984 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" podStartSLOduration=7.137964099 podStartE2EDuration="7.137964099s" podCreationTimestamp="2026-03-18 06:49:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:49:55.959355103 +0000 UTC m=+180.900509837" watchObservedRunningTime="2026-03-18 06:50:00.137964099 +0000 UTC m=+185.079118833" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.144232 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563610-7pqdt"] Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.147543 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563610-7pqdt" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.152299 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563610-7pqdt"] Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.152421 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.152883 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.153271 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.222884 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wplmp\" (UniqueName: \"kubernetes.io/projected/c13e4c1a-96ea-4dad-9cc3-a850ee57f969-kube-api-access-wplmp\") pod \"auto-csr-approver-29563610-7pqdt\" (UID: \"c13e4c1a-96ea-4dad-9cc3-a850ee57f969\") " pod="openshift-infra/auto-csr-approver-29563610-7pqdt" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.323885 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wplmp\" (UniqueName: \"kubernetes.io/projected/c13e4c1a-96ea-4dad-9cc3-a850ee57f969-kube-api-access-wplmp\") pod \"auto-csr-approver-29563610-7pqdt\" (UID: \"c13e4c1a-96ea-4dad-9cc3-a850ee57f969\") " pod="openshift-infra/auto-csr-approver-29563610-7pqdt" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.346722 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wplmp\" (UniqueName: \"kubernetes.io/projected/c13e4c1a-96ea-4dad-9cc3-a850ee57f969-kube-api-access-wplmp\") pod \"auto-csr-approver-29563610-7pqdt\" (UID: \"c13e4c1a-96ea-4dad-9cc3-a850ee57f969\") " 
pod="openshift-infra/auto-csr-approver-29563610-7pqdt" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.465907 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563610-7pqdt" Mar 18 06:50:00 crc kubenswrapper[4917]: I0318 06:50:00.972917 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563610-7pqdt"] Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.036408 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" podUID="91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" containerName="oauth-openshift" containerID="cri-o://d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15" gracePeriod=15 Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.534643 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640509 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-ocp-branding-template\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640556 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-serving-cert\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640612 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-trusted-ca-bundle\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640635 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-login\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640670 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-session\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640754 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-router-certs\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640785 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-dir\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640829 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-idp-0-file-data\") pod 
\"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640862 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfzw4\" (UniqueName: \"kubernetes.io/projected/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-kube-api-access-cfzw4\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640884 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-policies\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640917 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-provider-selection\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640942 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-error\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640963 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-service-ca\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 
06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.640981 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-cliconfig\") pod \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\" (UID: \"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9\") " Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.641356 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.641762 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.642047 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.642291 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.642769 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.648537 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.648961 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.649386 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.649453 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.650351 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-kube-api-access-cfzw4" (OuterVolumeSpecName: "kube-api-access-cfzw4") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "kube-api-access-cfzw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.650362 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.650513 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.651731 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.653818 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" (UID: "91ba995a-4ac9-4b7e-856e-58ef5bb5bca9"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741654 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741697 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfzw4\" (UniqueName: \"kubernetes.io/projected/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-kube-api-access-cfzw4\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741714 4917 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741727 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741743 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741756 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741769 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741783 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741797 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741810 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741823 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741835 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.741847 4917 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 
06:50:01.741859 4917 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.935551 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563610-7pqdt" event={"ID":"c13e4c1a-96ea-4dad-9cc3-a850ee57f969","Type":"ContainerStarted","Data":"eb6b85fe4823fb39747d59ac40c722f81df93af06d91ca9839a6a1b38a405f8e"} Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.937462 4917 generic.go:334] "Generic (PLEG): container finished" podID="91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" containerID="d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15" exitCode=0 Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.937497 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" event={"ID":"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9","Type":"ContainerDied","Data":"d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15"} Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.937523 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" event={"ID":"91ba995a-4ac9-4b7e-856e-58ef5bb5bca9","Type":"ContainerDied","Data":"898a49a448d76e155307493e73ccf10bd1f76e76157c07e02ba1c6c054073f97"} Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.937544 4917 scope.go:117] "RemoveContainer" containerID="d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.937565 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-9xdsf" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.965301 4917 scope.go:117] "RemoveContainer" containerID="d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15" Mar 18 06:50:01 crc kubenswrapper[4917]: E0318 06:50:01.965794 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15\": container with ID starting with d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15 not found: ID does not exist" containerID="d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.965834 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15"} err="failed to get container status \"d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15\": rpc error: code = NotFound desc = could not find container \"d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15\": container with ID starting with d4e0c8c22a283d0be6869e277d0b370985f67b050f7c01e907dfc28854adcf15 not found: ID does not exist" Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.966032 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9xdsf"] Mar 18 06:50:01 crc kubenswrapper[4917]: I0318 06:50:01.969784 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-9xdsf"] Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.066847 4917 ???:1] "http: TLS handshake error from 192.168.126.11:58250: no serving certificate available for the kubelet" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.526815 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication/oauth-openshift-74fd85d944-r7bnc"] Mar 18 06:50:03 crc kubenswrapper[4917]: E0318 06:50:03.527354 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" containerName="oauth-openshift" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.527379 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" containerName="oauth-openshift" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.527857 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" containerName="oauth-openshift" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.528344 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.533022 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.533353 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.533377 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.533405 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.533486 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.533363 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 
06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.533518 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.533751 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.534135 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.534440 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.536073 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.536435 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.544869 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.545054 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.546271 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-74fd85d944-r7bnc"] Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.556411 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568253 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ae46459-392d-440d-8f7a-4cbfd92aaa91-audit-dir\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568300 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-service-ca\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568385 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrzjv\" (UniqueName: \"kubernetes.io/projected/2ae46459-392d-440d-8f7a-4cbfd92aaa91-kube-api-access-hrzjv\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568575 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568646 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568682 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-audit-policies\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568712 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-session\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568754 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568778 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-router-certs\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: 
\"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568874 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568922 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.568953 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-login\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.569001 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 
06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.569038 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-error\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.669901 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.669954 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-router-certs\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670013 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670048 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670080 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-login\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670119 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670149 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-error\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670185 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ae46459-392d-440d-8f7a-4cbfd92aaa91-audit-dir\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " 
pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670210 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-service-ca\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670240 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrzjv\" (UniqueName: \"kubernetes.io/projected/2ae46459-392d-440d-8f7a-4cbfd92aaa91-kube-api-access-hrzjv\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670254 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2ae46459-392d-440d-8f7a-4cbfd92aaa91-audit-dir\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670273 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670306 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670338 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-audit-policies\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670365 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-session\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670965 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.670988 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-cliconfig\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 
06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.671173 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-audit-policies\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.671331 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-service-ca\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.673703 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-session\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.674478 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-login\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.674893 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-error\") pod 
\"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.675084 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-router-certs\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.675340 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.677300 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.677688 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.678309 
4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ae46459-392d-440d-8f7a-4cbfd92aaa91-v4-0-config-system-serving-cert\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.685765 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrzjv\" (UniqueName: \"kubernetes.io/projected/2ae46459-392d-440d-8f7a-4cbfd92aaa91-kube-api-access-hrzjv\") pod \"oauth-openshift-74fd85d944-r7bnc\" (UID: \"2ae46459-392d-440d-8f7a-4cbfd92aaa91\") " pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.778091 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91ba995a-4ac9-4b7e-856e-58ef5bb5bca9" path="/var/lib/kubelet/pods/91ba995a-4ac9-4b7e-856e-58ef5bb5bca9/volumes" Mar 18 06:50:03 crc kubenswrapper[4917]: I0318 06:50:03.861789 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:08 crc kubenswrapper[4917]: I0318 06:50:08.255463 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-74fd85d944-r7bnc"] Mar 18 06:50:08 crc kubenswrapper[4917]: I0318 06:50:08.640105 4917 csr.go:261] certificate signing request csr-gqdph is approved, waiting to be issued Mar 18 06:50:08 crc kubenswrapper[4917]: I0318 06:50:08.646538 4917 csr.go:257] certificate signing request csr-gqdph is issued Mar 18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.006795 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" event={"ID":"2ae46459-392d-440d-8f7a-4cbfd92aaa91","Type":"ContainerStarted","Data":"a6f9dc1e54a8cc6c56898d263213182506c84f3bedd6d96f08418f05247d823a"} Mar 18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.006846 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" event={"ID":"2ae46459-392d-440d-8f7a-4cbfd92aaa91","Type":"ContainerStarted","Data":"d8394b6cc8e3a651bd5d4b49f4cf6dc41bd4c572dc8473b785134e817e3d5fae"} Mar 18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.008095 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.009675 4917 generic.go:334] "Generic (PLEG): container finished" podID="c13e4c1a-96ea-4dad-9cc3-a850ee57f969" containerID="68332a4a2ed53037c9224e4830b1e1364106cec82ac47c3ae4d48a8f94bd3da8" exitCode=0 Mar 18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.009708 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563610-7pqdt" event={"ID":"c13e4c1a-96ea-4dad-9cc3-a850ee57f969","Type":"ContainerDied","Data":"68332a4a2ed53037c9224e4830b1e1364106cec82ac47c3ae4d48a8f94bd3da8"} Mar 
18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.017167 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" Mar 18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.030056 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-74fd85d944-r7bnc" podStartSLOduration=33.03004098 podStartE2EDuration="33.03004098s" podCreationTimestamp="2026-03-18 06:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:50:09.028779641 +0000 UTC m=+193.969934355" watchObservedRunningTime="2026-03-18 06:50:09.03004098 +0000 UTC m=+193.971195684" Mar 18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.648065 4917 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-30 14:31:29.073625095 +0000 UTC Mar 18 06:50:09 crc kubenswrapper[4917]: I0318 06:50:09.648377 4917 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6175h41m19.425250257s for next certificate rotation Mar 18 06:50:10 crc kubenswrapper[4917]: I0318 06:50:10.433901 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563610-7pqdt" Mar 18 06:50:10 crc kubenswrapper[4917]: I0318 06:50:10.554725 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wplmp\" (UniqueName: \"kubernetes.io/projected/c13e4c1a-96ea-4dad-9cc3-a850ee57f969-kube-api-access-wplmp\") pod \"c13e4c1a-96ea-4dad-9cc3-a850ee57f969\" (UID: \"c13e4c1a-96ea-4dad-9cc3-a850ee57f969\") " Mar 18 06:50:10 crc kubenswrapper[4917]: I0318 06:50:10.567947 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13e4c1a-96ea-4dad-9cc3-a850ee57f969-kube-api-access-wplmp" (OuterVolumeSpecName: "kube-api-access-wplmp") pod "c13e4c1a-96ea-4dad-9cc3-a850ee57f969" (UID: "c13e4c1a-96ea-4dad-9cc3-a850ee57f969"). InnerVolumeSpecName "kube-api-access-wplmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:50:10 crc kubenswrapper[4917]: I0318 06:50:10.648687 4917 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-28 04:16:54.82061425 +0000 UTC Mar 18 06:50:10 crc kubenswrapper[4917]: I0318 06:50:10.648724 4917 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6117h26m44.171892896s for next certificate rotation Mar 18 06:50:10 crc kubenswrapper[4917]: I0318 06:50:10.656387 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wplmp\" (UniqueName: \"kubernetes.io/projected/c13e4c1a-96ea-4dad-9cc3-a850ee57f969-kube-api-access-wplmp\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:11 crc kubenswrapper[4917]: I0318 06:50:11.024649 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563610-7pqdt" event={"ID":"c13e4c1a-96ea-4dad-9cc3-a850ee57f969","Type":"ContainerDied","Data":"eb6b85fe4823fb39747d59ac40c722f81df93af06d91ca9839a6a1b38a405f8e"} Mar 18 06:50:11 crc kubenswrapper[4917]: I0318 
06:50:11.024738 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb6b85fe4823fb39747d59ac40c722f81df93af06d91ca9839a6a1b38a405f8e" Mar 18 06:50:11 crc kubenswrapper[4917]: I0318 06:50:11.024699 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563610-7pqdt" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.085333 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-667b974577-z5sxz"] Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.085569 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" podUID="ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" containerName="controller-manager" containerID="cri-o://8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07" gracePeriod=30 Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.190071 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c"] Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.190648 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" podUID="dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" containerName="route-controller-manager" containerID="cri-o://fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715" gracePeriod=30 Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.694406 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.702495 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.797708 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jxsj\" (UniqueName: \"kubernetes.io/projected/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-kube-api-access-2jxsj\") pod \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.797777 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-serving-cert\") pod \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.797819 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-client-ca\") pod \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.797872 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-config\") pod \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\" (UID: \"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.798821 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-client-ca" (OuterVolumeSpecName: "client-ca") pod "dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" (UID: "dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.798992 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-config" (OuterVolumeSpecName: "config") pod "dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" (UID: "dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.805079 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" (UID: "dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.805208 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-kube-api-access-2jxsj" (OuterVolumeSpecName: "kube-api-access-2jxsj") pod "dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" (UID: "dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9"). InnerVolumeSpecName "kube-api-access-2jxsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.899221 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf62f\" (UniqueName: \"kubernetes.io/projected/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-kube-api-access-pf62f\") pod \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.899374 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-serving-cert\") pod \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.899457 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-config\") pod \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.899571 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-client-ca\") pod \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.899649 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-proxy-ca-bundles\") pod \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\" (UID: \"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9\") " Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.900401 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" (UID: "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.900546 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" (UID: "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.901277 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-config" (OuterVolumeSpecName: "config") pod "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" (UID: "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.901574 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.901613 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.901660 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.901696 4917 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.901719 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jxsj\" (UniqueName: \"kubernetes.io/projected/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-kube-api-access-2jxsj\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.901739 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.901756 4917 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.904578 4917 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-kube-api-access-pf62f" (OuterVolumeSpecName: "kube-api-access-pf62f") pod "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" (UID: "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9"). InnerVolumeSpecName "kube-api-access-pf62f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:50:13 crc kubenswrapper[4917]: I0318 06:50:13.904958 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" (UID: "ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.003154 4917 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.003207 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf62f\" (UniqueName: \"kubernetes.io/projected/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9-kube-api-access-pf62f\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.070739 4917 generic.go:334] "Generic (PLEG): container finished" podID="ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" containerID="8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07" exitCode=0 Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.070851 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" event={"ID":"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9","Type":"ContainerDied","Data":"8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07"} Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.070851 
4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.071061 4917 scope.go:117] "RemoveContainer" containerID="8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.071098 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-667b974577-z5sxz" event={"ID":"ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9","Type":"ContainerDied","Data":"582779d97a141142a85654a46248ef39bf9ef71a1561f4199d7b0d44441bf13e"} Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.073299 4917 generic.go:334] "Generic (PLEG): container finished" podID="dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" containerID="fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715" exitCode=0 Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.073554 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" event={"ID":"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9","Type":"ContainerDied","Data":"fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715"} Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.073644 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" event={"ID":"dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9","Type":"ContainerDied","Data":"15be528f720851e6d4bf5685ccc15f569b95523e55ffe4616296db8184d245d7"} Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.073723 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.099353 4917 scope.go:117] "RemoveContainer" containerID="8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07" Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.099937 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07\": container with ID starting with 8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07 not found: ID does not exist" containerID="8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.100009 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07"} err="failed to get container status \"8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07\": rpc error: code = NotFound desc = could not find container \"8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07\": container with ID starting with 8f542054ec37316a96c4aa977e123e40b821e3fdefdb2696864359d34e907f07 not found: ID does not exist" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.100096 4917 scope.go:117] "RemoveContainer" containerID="fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.121013 4917 scope.go:117] "RemoveContainer" containerID="fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715" Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.122046 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715\": container with ID starting with 
fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715 not found: ID does not exist" containerID="fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.122103 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715"} err="failed to get container status \"fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715\": rpc error: code = NotFound desc = could not find container \"fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715\": container with ID starting with fc15b0a6ad15a1e072014485a88143c5fdcce385e43d469bfe427956fd62b715 not found: ID does not exist" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.129084 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c"] Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.137617 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57df96565f-gm68c"] Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.145053 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-667b974577-z5sxz"] Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.150519 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-667b974577-z5sxz"] Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.510088 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-667bd87887-l86db"] Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.510912 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" containerName="controller-manager" Mar 18 06:50:14 crc kubenswrapper[4917]: 
I0318 06:50:14.510948 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" containerName="controller-manager" Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.510979 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" containerName="route-controller-manager" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.510991 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" containerName="route-controller-manager" Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.511008 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13e4c1a-96ea-4dad-9cc3-a850ee57f969" containerName="oc" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.511021 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c13e4c1a-96ea-4dad-9cc3-a850ee57f969" containerName="oc" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.511183 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13e4c1a-96ea-4dad-9cc3-a850ee57f969" containerName="oc" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.511217 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" containerName="route-controller-manager" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.511230 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" containerName="controller-manager" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.511898 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.515117 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7749dc895c-jscn6"] Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.515980 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.516442 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.516459 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.517359 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 06:50:14 crc kubenswrapper[4917]: W0318 06:50:14.517435 4917 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.517469 4917 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 
06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.517556 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.518280 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 06:50:14 crc kubenswrapper[4917]: W0318 06:50:14.522582 4917 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.522651 4917 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 06:50:14 crc kubenswrapper[4917]: W0318 06:50:14.522756 4917 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 18 06:50:14 crc kubenswrapper[4917]: W0318 06:50:14.522755 4917 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace 
"openshift-controller-manager": no relationship found between node 'crc' and this object Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.522828 4917 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.522772 4917 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 06:50:14 crc kubenswrapper[4917]: W0318 06:50:14.523004 4917 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 18 06:50:14 crc kubenswrapper[4917]: W0318 06:50:14.523049 4917 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.523069 4917 reflector.go:158] "Unhandled 
Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.523037 4917 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.523158 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.524261 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7749dc895c-jscn6"] Mar 18 06:50:14 crc kubenswrapper[4917]: W0318 06:50:14.526334 4917 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Mar 18 06:50:14 crc kubenswrapper[4917]: E0318 06:50:14.526387 4917 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the 
namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.536490 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-667bd87887-l86db"] Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.608797 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfmdq\" (UniqueName: \"kubernetes.io/projected/dbd98a7e-199b-44ff-847d-b1cc15e93e59-kube-api-access-pfmdq\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.608927 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwfgw\" (UniqueName: \"kubernetes.io/projected/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-kube-api-access-qwfgw\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.609061 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-serving-cert\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.609104 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbd98a7e-199b-44ff-847d-b1cc15e93e59-client-ca\") pod 
\"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.609154 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-proxy-ca-bundles\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.609206 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-config\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.609299 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd98a7e-199b-44ff-847d-b1cc15e93e59-config\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.609385 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-client-ca\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.609464 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd98a7e-199b-44ff-847d-b1cc15e93e59-serving-cert\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710340 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfmdq\" (UniqueName: \"kubernetes.io/projected/dbd98a7e-199b-44ff-847d-b1cc15e93e59-kube-api-access-pfmdq\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710438 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwfgw\" (UniqueName: \"kubernetes.io/projected/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-kube-api-access-qwfgw\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710534 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-serving-cert\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710628 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbd98a7e-199b-44ff-847d-b1cc15e93e59-client-ca\") pod \"route-controller-manager-667bd87887-l86db\" (UID: 
\"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710687 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-proxy-ca-bundles\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710743 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-config\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710795 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd98a7e-199b-44ff-847d-b1cc15e93e59-config\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710863 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-client-ca\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.710939 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/dbd98a7e-199b-44ff-847d-b1cc15e93e59-serving-cert\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.712679 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/dbd98a7e-199b-44ff-847d-b1cc15e93e59-client-ca\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.712908 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dbd98a7e-199b-44ff-847d-b1cc15e93e59-config\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.723743 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dbd98a7e-199b-44ff-847d-b1cc15e93e59-serving-cert\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc kubenswrapper[4917]: I0318 06:50:14.739951 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfmdq\" (UniqueName: \"kubernetes.io/projected/dbd98a7e-199b-44ff-847d-b1cc15e93e59-kube-api-access-pfmdq\") pod \"route-controller-manager-667bd87887-l86db\" (UID: \"dbd98a7e-199b-44ff-847d-b1cc15e93e59\") " pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:14 crc 
kubenswrapper[4917]: I0318 06:50:14.832725 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.101677 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-667bd87887-l86db"] Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.343875 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.633409 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.691309 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.704523 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-config\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:15 crc kubenswrapper[4917]: E0318 06:50:15.711614 4917 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 18 06:50:15 crc kubenswrapper[4917]: E0318 06:50:15.711648 4917 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 18 06:50:15 crc kubenswrapper[4917]: E0318 06:50:15.711704 4917 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-client-ca podName:15b1a4db-c918-49e0-a7ed-71a20efeeb5d nodeName:}" failed. No retries permitted until 2026-03-18 06:50:16.211677022 +0000 UTC m=+201.152831766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-client-ca") pod "controller-manager-7749dc895c-jscn6" (UID: "15b1a4db-c918-49e0-a7ed-71a20efeeb5d") : failed to sync configmap cache: timed out waiting for the condition Mar 18 06:50:15 crc kubenswrapper[4917]: E0318 06:50:15.711731 4917 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 18 06:50:15 crc kubenswrapper[4917]: E0318 06:50:15.711748 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-proxy-ca-bundles podName:15b1a4db-c918-49e0-a7ed-71a20efeeb5d nodeName:}" failed. No retries permitted until 2026-03-18 06:50:16.211724633 +0000 UTC m=+201.152879467 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-proxy-ca-bundles") pod "controller-manager-7749dc895c-jscn6" (UID: "15b1a4db-c918-49e0-a7ed-71a20efeeb5d") : failed to sync configmap cache: timed out waiting for the condition Mar 18 06:50:15 crc kubenswrapper[4917]: E0318 06:50:15.711971 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-serving-cert podName:15b1a4db-c918-49e0-a7ed-71a20efeeb5d nodeName:}" failed. No retries permitted until 2026-03-18 06:50:16.211921718 +0000 UTC m=+201.153076432 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-serving-cert") pod "controller-manager-7749dc895c-jscn6" (UID: "15b1a4db-c918-49e0-a7ed-71a20efeeb5d") : failed to sync secret cache: timed out waiting for the condition Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.781096 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9" path="/var/lib/kubelet/pods/ac3a4fb0-3fa1-4ba7-aff1-7ca507d008e9/volumes" Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.782703 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9" path="/var/lib/kubelet/pods/dd6bfa7b-7ba6-4f76-8fd4-512ffab413f9/volumes" Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.935239 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 06:50:15 crc kubenswrapper[4917]: I0318 06:50:15.948477 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwfgw\" (UniqueName: \"kubernetes.io/projected/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-kube-api-access-qwfgw\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.018809 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.044943 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.110846 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 06:50:16 crc kubenswrapper[4917]: 
I0318 06:50:16.111376 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" event={"ID":"dbd98a7e-199b-44ff-847d-b1cc15e93e59","Type":"ContainerStarted","Data":"0db7ad740c11660a46e4b0c944d70a1aa4015ae764787c1c034bea4ed9a3e0bc"} Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.111452 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" event={"ID":"dbd98a7e-199b-44ff-847d-b1cc15e93e59","Type":"ContainerStarted","Data":"bc696ecc7924a4ad92d8da476b5b381676056f010f8441ec3b6acfa650fd2751"} Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.112125 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.123340 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.140057 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-667bd87887-l86db" podStartSLOduration=3.140031544 podStartE2EDuration="3.140031544s" podCreationTimestamp="2026-03-18 06:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:50:16.137996615 +0000 UTC m=+201.079151329" watchObservedRunningTime="2026-03-18 06:50:16.140031544 +0000 UTC m=+201.081186288" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.239987 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-proxy-ca-bundles\") pod 
\"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.240108 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-client-ca\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.240226 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-serving-cert\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.242080 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-client-ca\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.242209 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-proxy-ca-bundles\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.248348 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/15b1a4db-c918-49e0-a7ed-71a20efeeb5d-serving-cert\") pod \"controller-manager-7749dc895c-jscn6\" (UID: \"15b1a4db-c918-49e0-a7ed-71a20efeeb5d\") " pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.336260 4917 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.337354 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675" gracePeriod=15 Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.337467 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275" gracePeriod=15 Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.337480 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72" gracePeriod=15 Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.337538 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b" gracePeriod=15 Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.337606 4917 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c" gracePeriod=15 Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.341538 4917 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.341902 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.341930 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.341956 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.341971 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.341990 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342004 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.342023 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342036 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.342051 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342064 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.342082 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342095 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.342110 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342122 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.342143 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342157 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342310 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 
06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342326 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342347 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342367 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342383 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342398 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342415 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.342563 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342577 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342774 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.342953 4917 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.342969 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.343133 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.344564 4917 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.345355 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.351464 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.351671 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.436545 4917 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.442369 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.442609 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.442642 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.442938 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.442992 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.443032 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.443054 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.443168 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.544767 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.544885 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545631 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545662 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545689 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545696 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545726 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545740 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545765 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545784 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545806 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545810 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545828 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545859 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.545909 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.573969 4917 controller.go:195] 
"Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.574384 4917 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.574678 4917 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.575029 4917 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.575448 4917 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.575481 4917 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.575697 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="200ms" Mar 18 06:50:16 crc kubenswrapper[4917]: I0318 06:50:16.737119 4917 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:16 crc kubenswrapper[4917]: W0318 06:50:16.770173 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-dbcf663fe1897d65630eedcc8acbf1e7a56559a09129e54b7b9cd3296b56f83f WatchSource:0}: Error finding container dbcf663fe1897d65630eedcc8acbf1e7a56559a09129e54b7b9cd3296b56f83f: Status 404 returned error can't find the container with id dbcf663fe1897d65630eedcc8acbf1e7a56559a09129e54b7b9cd3296b56f83f Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.776140 4917 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 06:50:16 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2" Netns:"/var/run/netns/060199a5-2df0-40ca-848d-3807e581159e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:16 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:16 crc kubenswrapper[4917]: > Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.776221 4917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 06:50:16 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2" Netns:"/var/run/netns/060199a5-2df0-40ca-848d-3807e581159e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" 
ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:16 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:16 crc kubenswrapper[4917]: > pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.776253 4917 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 06:50:16 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2" 
Netns:"/var/run/netns/060199a5-2df0-40ca-848d-3807e581159e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:16 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:16 crc kubenswrapper[4917]: > pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.776347 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-7749dc895c-jscn6_openshift-controller-manager(15b1a4db-c918-49e0-a7ed-71a20efeeb5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"controller-manager-7749dc895c-jscn6_openshift-controller-manager(15b1a4db-c918-49e0-a7ed-71a20efeeb5d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2\\\" Netns:\\\"/var/run/netns/060199a5-2df0-40ca-848d-3807e581159e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s\\\": dial tcp 38.102.83.184:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" podUID="15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.776571 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="400ms" Mar 18 06:50:16 crc kubenswrapper[4917]: E0318 06:50:16.777112 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.184:6443: connect: connection refused" event=< Mar 18 06:50:16 crc kubenswrapper[4917]: &Event{ObjectMeta:{controller-manager-7749dc895c-jscn6.189ddcd09ba2938b openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-7749dc895c-jscn6,UID:15b1a4db-c918-49e0-a7ed-71a20efeeb5d,APIVersion:v1,ResourceVersion:29901,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2): error adding pod 
openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2" Netns:"/var/run/netns/060199a5-2df0-40ca-848d-3807e581159e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:16 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:50:16.776274827 +0000 UTC m=+201.717429571,LastTimestamp:2026-03-18 06:50:16.776274827 +0000 UTC 
m=+201.717429571,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:50:16 crc kubenswrapper[4917]: > Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.121141 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"dbcf663fe1897d65630eedcc8acbf1e7a56559a09129e54b7b9cd3296b56f83f"} Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.124772 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.127112 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.128702 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b" exitCode=0 Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.128937 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275" exitCode=0 Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.129201 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72" exitCode=0 Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.129632 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c" exitCode=2 Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.128956 4917 scope.go:117] "RemoveContainer" containerID="0ca58bb67e06f1e4f1bd7c1c1cfd31b7eab8c33ef326c8989b99ca8f35676b95" Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.132704 4917 generic.go:334] "Generic (PLEG): container finished" podID="f35e52c9-1886-464a-99ac-95e9803226db" containerID="2dc38bfd7adefec982717bd1254001f68dc88008f4e1a31ff4e2570493f79200" exitCode=0 Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.132956 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f35e52c9-1886-464a-99ac-95e9803226db","Type":"ContainerDied","Data":"2dc38bfd7adefec982717bd1254001f68dc88008f4e1a31ff4e2570493f79200"} Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.133152 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.134187 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:17 crc kubenswrapper[4917]: I0318 06:50:17.135237 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.178045 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="800ms" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.309219 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:50:17Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:50:17Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:50:17Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-18T06:50:17Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.309483 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.309721 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.310195 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.310878 4917 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.310919 4917 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.903905 4917 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 06:50:17 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to 
CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1" Netns:"/var/run/netns/515f5b39-8f31-40dd-912d-ac7ea678efcf" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:17 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:17 crc kubenswrapper[4917]: > Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.904384 4917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 06:50:17 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1" Netns:"/var/run/netns/515f5b39-8f31-40dd-912d-ac7ea678efcf" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:17 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:17 crc 
kubenswrapper[4917]: > pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.904421 4917 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 06:50:17 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1" Netns:"/var/run/netns/515f5b39-8f31-40dd-912d-ac7ea678efcf" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:17 crc kubenswrapper[4917]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:17 crc kubenswrapper[4917]: > pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.904521 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-7749dc895c-jscn6_openshift-controller-manager(15b1a4db-c918-49e0-a7ed-71a20efeeb5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-7749dc895c-jscn6_openshift-controller-manager(15b1a4db-c918-49e0-a7ed-71a20efeeb5d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1\\\" Netns:\\\"/var/run/netns/515f5b39-8f31-40dd-912d-ac7ea678efcf\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=88a3e8efccd9a0251f8a4328584a50fb5d094fccc9c269d0ef00b8df06dcd5d1;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: 
[openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s\\\": dial tcp 38.102.83.184:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" podUID="15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Mar 18 06:50:17 crc kubenswrapper[4917]: E0318 06:50:17.978667 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="1.6s" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.141540 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"615b876d547faf8a09a2cf381648695c9212b4c0b6f1ef0dfdb88339622c2862"} Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.142425 4917 status_manager.go:851] "Failed to get status for pod" 
podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:18 crc kubenswrapper[4917]: E0318 06:50:18.142478 4917 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.145066 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.540797 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.541440 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.576295 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-kubelet-dir\") pod \"f35e52c9-1886-464a-99ac-95e9803226db\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.576382 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-var-lock\") pod \"f35e52c9-1886-464a-99ac-95e9803226db\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.576482 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f35e52c9-1886-464a-99ac-95e9803226db-kube-api-access\") pod \"f35e52c9-1886-464a-99ac-95e9803226db\" (UID: \"f35e52c9-1886-464a-99ac-95e9803226db\") " Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.577602 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f35e52c9-1886-464a-99ac-95e9803226db" (UID: "f35e52c9-1886-464a-99ac-95e9803226db"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.577674 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-var-lock" (OuterVolumeSpecName: "var-lock") pod "f35e52c9-1886-464a-99ac-95e9803226db" (UID: "f35e52c9-1886-464a-99ac-95e9803226db"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.582221 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f35e52c9-1886-464a-99ac-95e9803226db-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f35e52c9-1886-464a-99ac-95e9803226db" (UID: "f35e52c9-1886-464a-99ac-95e9803226db"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.677796 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f35e52c9-1886-464a-99ac-95e9803226db-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.677850 4917 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.677869 4917 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f35e52c9-1886-464a-99ac-95e9803226db-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.721740 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.722746 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.723482 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.723913 4917 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779251 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779328 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779366 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779389 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779401 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779492 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779703 4917 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779720 4917 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:18 crc kubenswrapper[4917]: I0318 06:50:18.779731 4917 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.164092 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.165455 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675" exitCode=0 Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.165633 4917 scope.go:117] "RemoveContainer" containerID="4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.165685 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.169273 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.169888 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f35e52c9-1886-464a-99ac-95e9803226db","Type":"ContainerDied","Data":"00238203e6f710df2e88a3716d7979338e0e78a69980e32a54c2ceb233a4b275"} Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.169975 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00238203e6f710df2e88a3716d7979338e0e78a69980e32a54c2ceb233a4b275" Mar 18 06:50:19 crc kubenswrapper[4917]: E0318 06:50:19.170069 4917 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.191945 4917 scope.go:117] "RemoveContainer" containerID="ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.195722 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.196243 4917 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.201808 4917 status_manager.go:851] "Failed to get 
status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.202543 4917 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.214143 4917 scope.go:117] "RemoveContainer" containerID="d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.239147 4917 scope.go:117] "RemoveContainer" containerID="a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.264439 4917 scope.go:117] "RemoveContainer" containerID="67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.296024 4917 scope.go:117] "RemoveContainer" containerID="5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.332398 4917 scope.go:117] "RemoveContainer" containerID="4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b" Mar 18 06:50:19 crc kubenswrapper[4917]: E0318 06:50:19.332955 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b\": container with ID starting with 4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b not found: ID does not exist" 
containerID="4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.333005 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b"} err="failed to get container status \"4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b\": rpc error: code = NotFound desc = could not find container \"4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b\": container with ID starting with 4de0cb27037ad8694a0bc82e80c2bc9a0bfda19ea5b99d49adfc2e81304be74b not found: ID does not exist" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.333079 4917 scope.go:117] "RemoveContainer" containerID="ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275" Mar 18 06:50:19 crc kubenswrapper[4917]: E0318 06:50:19.333654 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275\": container with ID starting with ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275 not found: ID does not exist" containerID="ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.333701 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275"} err="failed to get container status \"ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275\": rpc error: code = NotFound desc = could not find container \"ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275\": container with ID starting with ab1bcab8203a7c1db81dc764e9bd202cc36e7ab046831a252d737d4088d77275 not found: ID does not exist" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.333730 4917 scope.go:117] 
"RemoveContainer" containerID="d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72" Mar 18 06:50:19 crc kubenswrapper[4917]: E0318 06:50:19.334131 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72\": container with ID starting with d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72 not found: ID does not exist" containerID="d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.334165 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72"} err="failed to get container status \"d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72\": rpc error: code = NotFound desc = could not find container \"d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72\": container with ID starting with d071ba8e822504a3d112d4b4cdac734c1a927c9580e1eb76f9f1717d0179bb72 not found: ID does not exist" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.334208 4917 scope.go:117] "RemoveContainer" containerID="a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c" Mar 18 06:50:19 crc kubenswrapper[4917]: E0318 06:50:19.335089 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c\": container with ID starting with a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c not found: ID does not exist" containerID="a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.335132 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c"} err="failed to get container status \"a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c\": rpc error: code = NotFound desc = could not find container \"a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c\": container with ID starting with a347848cd5bb3eefdcf1e499c463e7bc8f83b460677f7fd246a4b4576bb4164c not found: ID does not exist" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.335158 4917 scope.go:117] "RemoveContainer" containerID="67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675" Mar 18 06:50:19 crc kubenswrapper[4917]: E0318 06:50:19.335770 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675\": container with ID starting with 67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675 not found: ID does not exist" containerID="67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.335852 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675"} err="failed to get container status \"67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675\": rpc error: code = NotFound desc = could not find container \"67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675\": container with ID starting with 67d35b97dc7aec1d4db63cef948e8b6c582856f1baa420e129cf9d03f9fca675 not found: ID does not exist" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.335909 4917 scope.go:117] "RemoveContainer" containerID="5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475" Mar 18 06:50:19 crc kubenswrapper[4917]: E0318 06:50:19.336450 4917 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475\": container with ID starting with 5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475 not found: ID does not exist" containerID="5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.336489 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475"} err="failed to get container status \"5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475\": rpc error: code = NotFound desc = could not find container \"5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475\": container with ID starting with 5d56b41c387ae97a91378950db42e76abede5f55d91a6a42136546372a1dc475 not found: ID does not exist" Mar 18 06:50:19 crc kubenswrapper[4917]: E0318 06:50:19.579544 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="3.2s" Mar 18 06:50:19 crc kubenswrapper[4917]: I0318 06:50:19.792629 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 18 06:50:22 crc kubenswrapper[4917]: E0318 06:50:22.781084 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="6.4s" Mar 18 06:50:23 crc kubenswrapper[4917]: E0318 06:50:23.275352 4917 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.184:6443: connect: connection refused" event=< Mar 18 06:50:23 crc kubenswrapper[4917]: &Event{ObjectMeta:{controller-manager-7749dc895c-jscn6.189ddcd09ba2938b openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-7749dc895c-jscn6,UID:15b1a4db-c918-49e0-a7ed-71a20efeeb5d,APIVersion:v1,ResourceVersion:29901,FieldPath:,},Reason:FailedCreatePodSandBox,Message:Failed to create pod sandbox: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2" Netns:"/var/run/netns/060199a5-2df0-40ca-848d-3807e581159e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=f1b2e9f7848432b160b0beaf7ca8d83fa50b5f147059207c598ece3b80bdcff2;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of 
cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:23 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"},Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-18 06:50:16.776274827 +0000 UTC m=+201.717429571,LastTimestamp:2026-03-18 06:50:16.776274827 +0000 UTC m=+201.717429571,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 18 06:50:23 crc kubenswrapper[4917]: > Mar 18 06:50:25 crc kubenswrapper[4917]: I0318 06:50:25.777370 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:29 crc kubenswrapper[4917]: E0318 06:50:29.182493 4917 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.184:6443: connect: connection refused" interval="7s" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.254260 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.256206 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.256277 4917 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="77bb48c37e48b86055e70fba417ddf2721298d8809a29fad69a8eaaeb1af7298" exitCode=1 Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.256320 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"77bb48c37e48b86055e70fba417ddf2721298d8809a29fad69a8eaaeb1af7298"} Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.256956 4917 scope.go:117] "RemoveContainer" containerID="77bb48c37e48b86055e70fba417ddf2721298d8809a29fad69a8eaaeb1af7298" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.258185 4917 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.259138 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:30 crc 
kubenswrapper[4917]: I0318 06:50:30.772477 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.773249 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.772579 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.774501 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.774998 4917 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.853086 4917 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.853123 4917 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:30 crc kubenswrapper[4917]: E0318 06:50:30.853923 4917 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:30 crc kubenswrapper[4917]: I0318 06:50:30.855182 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:31 crc kubenswrapper[4917]: E0318 06:50:31.269931 4917 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 18 06:50:31 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1" Netns:"/var/run/netns/dc92b2af-8691-402c-8e6d-ff4d0108429e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:31 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:31 crc kubenswrapper[4917]: > Mar 18 06:50:31 crc kubenswrapper[4917]: E0318 06:50:31.270299 4917 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 18 06:50:31 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1" Netns:"/var/run/netns/dc92b2af-8691-402c-8e6d-ff4d0108429e" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:31 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:31 crc kubenswrapper[4917]: > pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:31 crc kubenswrapper[4917]: E0318 06:50:31.270336 4917 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 18 06:50:31 crc kubenswrapper[4917]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1" Netns:"/var/run/netns/dc92b2af-8691-402c-8e6d-ff4d0108429e" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s": dial tcp 38.102.83.184:6443: connect: connection refused Mar 18 06:50:31 crc kubenswrapper[4917]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 18 06:50:31 crc kubenswrapper[4917]: > pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:31 crc kubenswrapper[4917]: I0318 06:50:31.270384 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c8c069e5b7302c071ac40822ef03bc6f3562787fc7e2bc33cd57387fcfd38066"} Mar 18 06:50:31 crc kubenswrapper[4917]: E0318 06:50:31.270468 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"controller-manager-7749dc895c-jscn6_openshift-controller-manager(15b1a4db-c918-49e0-a7ed-71a20efeeb5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-7749dc895c-jscn6_openshift-controller-manager(15b1a4db-c918-49e0-a7ed-71a20efeeb5d)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-7749dc895c-jscn6_openshift-controller-manager_15b1a4db-c918-49e0-a7ed-71a20efeeb5d_0(ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1): error adding pod openshift-controller-manager_controller-manager-7749dc895c-jscn6 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1\\\" Netns:\\\"/var/run/netns/dc92b2af-8691-402c-8e6d-ff4d0108429e\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-7749dc895c-jscn6;K8S_POD_INFRA_CONTAINER_ID=ba0079be53508995c6f0178634d9454bbcfb817fcdba15f9f512ef8d9953afd1;K8S_POD_UID=15b1a4db-c918-49e0-a7ed-71a20efeeb5d\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-7749dc895c-jscn6] networking: Multus: [openshift-controller-manager/controller-manager-7749dc895c-jscn6/15b1a4db-c918-49e0-a7ed-71a20efeeb5d]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-7749dc895c-jscn6 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-7749dc895c-jscn6?timeout=1m0s\\\": dial tcp 38.102.83.184:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" podUID="15b1a4db-c918-49e0-a7ed-71a20efeeb5d" Mar 18 06:50:31 crc kubenswrapper[4917]: I0318 06:50:31.273226 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 18 06:50:31 crc kubenswrapper[4917]: I0318 06:50:31.274803 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 18 06:50:31 crc kubenswrapper[4917]: I0318 06:50:31.274889 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"334b33df8acd4fd5642450ba616337257e53223542aa0c3fc129e48bea9636ea"} Mar 18 06:50:31 crc kubenswrapper[4917]: I0318 06:50:31.276457 4917 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:31 crc kubenswrapper[4917]: I0318 06:50:31.276839 4917 status_manager.go:851] "Failed to get status for pod" 
podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.117108 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.123818 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.124685 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.125264 4917 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.285640 4917 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="be925b43c3e0baddc39716153eb24e8ee890621e30820bab60df0e7265c3e9e6" exitCode=0 Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.285742 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"be925b43c3e0baddc39716153eb24e8ee890621e30820bab60df0e7265c3e9e6"} Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.286042 4917 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.286079 4917 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:32 crc kubenswrapper[4917]: E0318 06:50:32.286761 4917 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.286843 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.287173 4917 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:32 crc kubenswrapper[4917]: I0318 06:50:32.287893 4917 status_manager.go:851] "Failed to get status for pod" podUID="f35e52c9-1886-464a-99ac-95e9803226db" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.184:6443: connect: connection refused" Mar 18 06:50:33 crc kubenswrapper[4917]: I0318 06:50:33.307413 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1094602b0e2ba05eab7d82af653c0965205a002776d08059c0de8f479da5e970"} Mar 18 06:50:33 crc kubenswrapper[4917]: I0318 06:50:33.307792 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77019d7f4853ef293aca0bd006369eb7b704ede3eee00b91bc33b7f16ce206fe"} Mar 18 06:50:33 crc kubenswrapper[4917]: I0318 06:50:33.307806 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1b757a2a7c4c88379e7f22fa91f14a430893929fa02d2614156dd776abc1f697"} Mar 18 06:50:34 crc kubenswrapper[4917]: I0318 06:50:34.314124 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5b6a734f50e223a1b5e1d27ba4039f6bca2db6f6f379804606a347e2a9b7b959"} Mar 18 06:50:34 crc kubenswrapper[4917]: I0318 06:50:34.314432 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ec0b35445b80ed7eeb2e5f37750dd151aced9bb704a9592bc7dbbdb819982a05"} Mar 18 06:50:34 crc kubenswrapper[4917]: I0318 06:50:34.314652 4917 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:34 crc kubenswrapper[4917]: I0318 06:50:34.314690 4917 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:35 crc kubenswrapper[4917]: I0318 06:50:35.856247 4917 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:35 crc kubenswrapper[4917]: I0318 06:50:35.856509 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:35 crc kubenswrapper[4917]: I0318 06:50:35.865836 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:39 crc kubenswrapper[4917]: I0318 06:50:39.437326 4917 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:39 crc kubenswrapper[4917]: I0318 06:50:39.522019 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eb475199-48cf-4d48-b9ca-858beea3d8f4" Mar 18 06:50:40 crc kubenswrapper[4917]: I0318 06:50:40.365510 4917 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:40 crc kubenswrapper[4917]: I0318 06:50:40.365539 4917 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:40 crc kubenswrapper[4917]: I0318 06:50:40.365562 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:40 crc kubenswrapper[4917]: I0318 06:50:40.369125 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eb475199-48cf-4d48-b9ca-858beea3d8f4" Mar 18 06:50:40 crc kubenswrapper[4917]: I0318 06:50:40.370806 4917 status_manager.go:308] "Container readiness changed before pod has synced" 
pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://1b757a2a7c4c88379e7f22fa91f14a430893929fa02d2614156dd776abc1f697" Mar 18 06:50:40 crc kubenswrapper[4917]: I0318 06:50:40.370824 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:41 crc kubenswrapper[4917]: I0318 06:50:41.373228 4917 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:41 crc kubenswrapper[4917]: I0318 06:50:41.373275 4917 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:41 crc kubenswrapper[4917]: I0318 06:50:41.378085 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eb475199-48cf-4d48-b9ca-858beea3d8f4" Mar 18 06:50:41 crc kubenswrapper[4917]: I0318 06:50:41.772407 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:41 crc kubenswrapper[4917]: I0318 06:50:41.773199 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:42 crc kubenswrapper[4917]: I0318 06:50:42.380989 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" event={"ID":"15b1a4db-c918-49e0-a7ed-71a20efeeb5d","Type":"ContainerStarted","Data":"e08bed3dccce7a291cf679a963103d24e251678ea4cd88995baaddc06df37f6a"} Mar 18 06:50:42 crc kubenswrapper[4917]: I0318 06:50:42.381655 4917 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:42 crc kubenswrapper[4917]: I0318 06:50:42.383232 4917 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:42 crc kubenswrapper[4917]: I0318 06:50:42.383193 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" event={"ID":"15b1a4db-c918-49e0-a7ed-71a20efeeb5d","Type":"ContainerStarted","Data":"563194af20843d29c4fcbf7787e9dac9b951544f2591fd158b056d288dad1490"} Mar 18 06:50:42 crc kubenswrapper[4917]: I0318 06:50:42.383620 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:42 crc kubenswrapper[4917]: I0318 06:50:42.386759 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="eb475199-48cf-4d48-b9ca-858beea3d8f4" Mar 18 06:50:42 crc kubenswrapper[4917]: I0318 06:50:42.394476 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" Mar 18 06:50:49 crc kubenswrapper[4917]: I0318 06:50:49.425359 4917 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 18 06:50:49 crc kubenswrapper[4917]: I0318 06:50:49.708831 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 18 06:50:50 crc kubenswrapper[4917]: I0318 06:50:50.064277 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 18 06:50:50 crc kubenswrapper[4917]: I0318 06:50:50.118960 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 18 06:50:50 crc kubenswrapper[4917]: I0318 06:50:50.280240 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 18 06:50:50 crc kubenswrapper[4917]: I0318 06:50:50.280247 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 18 06:50:50 crc kubenswrapper[4917]: I0318 06:50:50.459692 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 18 06:50:50 crc kubenswrapper[4917]: I0318 06:50:50.796193 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 18 06:50:50 crc kubenswrapper[4917]: I0318 06:50:50.986499 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.122088 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.196746 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.496666 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.515776 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.653004 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.692213 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.782210 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.812388 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.882738 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.883390 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 18 06:50:51 crc kubenswrapper[4917]: I0318 06:50:51.959820 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.022897 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 
06:50:52.287060 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.302540 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.422989 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.483925 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.486206 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.490914 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.515345 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.544130 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.645225 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.704088 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 18 06:50:52 crc kubenswrapper[4917]: I0318 06:50:52.919972 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.062230 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.062264 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.090381 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.221161 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.245533 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.292365 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.358801 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.406071 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.414435 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.471237 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 18 06:50:53 crc 
kubenswrapper[4917]: I0318 06:50:53.488326 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.523364 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.692579 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.716687 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.743411 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.815172 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.898995 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.946553 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.968403 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.976137 4917 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.984455 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.984566 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7749dc895c-jscn6" podStartSLOduration=40.984534317 podStartE2EDuration="40.984534317s" podCreationTimestamp="2026-03-18 06:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:50:42.412441057 +0000 UTC m=+227.353595841" watchObservedRunningTime="2026-03-18 06:50:53.984534317 +0000 UTC m=+238.925689101" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.988215 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.988304 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.988348 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7749dc895c-jscn6"] Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.989034 4917 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:53 crc kubenswrapper[4917]: I0318 06:50:53.989073 4917 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="da2936a8-51c8-42fe-88e6-739ca9192f46" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.014063 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.027254 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.0272017 
podStartE2EDuration="15.0272017s" podCreationTimestamp="2026-03-18 06:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:50:54.020496419 +0000 UTC m=+238.961651213" watchObservedRunningTime="2026-03-18 06:50:54.0272017 +0000 UTC m=+238.968356454" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.033139 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.046450 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.095633 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.200429 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.257958 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.330405 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.343046 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.351217 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.423314 4917 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.462792 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.565321 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.617246 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.667322 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.683226 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.900652 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 18 06:50:54 crc kubenswrapper[4917]: I0318 06:50:54.965734 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.077231 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.384870 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.402925 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.417712 4917 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.427373 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.499303 4917 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.537667 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.540275 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.580295 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.756116 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.767948 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.776545 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.949201 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 18 06:50:55 crc kubenswrapper[4917]: I0318 06:50:55.984774 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.011043 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.097138 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.157713 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.169512 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.195937 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.201172 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.284695 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.298033 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.342805 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.396560 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.446268 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"dns-default" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.468449 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.486568 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.512115 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.522906 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.555368 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.620017 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.656574 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.714847 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.715417 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.718477 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.728999 4917 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"serving-cert" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.849229 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.876849 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.941577 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 18 06:50:56 crc kubenswrapper[4917]: I0318 06:50:56.994509 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.018857 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.205616 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.424556 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.456567 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.560306 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.638371 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.656954 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"service-ca" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.660217 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.683953 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.784969 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.793802 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.842637 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.932727 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 18 06:50:57 crc kubenswrapper[4917]: I0318 06:50:57.958854 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.001772 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.023881 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.060675 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.071632 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"audit" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.141666 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.228674 4917 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.293978 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.300939 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.302843 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.356000 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.399026 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.500891 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.510286 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.637470 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.669801 
4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.687910 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.739037 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.774411 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.788206 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.793627 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.834840 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.927768 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 18 06:50:58 crc kubenswrapper[4917]: I0318 06:50:58.939227 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.022903 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.026405 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" 
Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.032646 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.067419 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.112827 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.359211 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.389411 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.400373 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.493572 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.520043 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.530553 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.545878 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.577882 4917 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.642291 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.657159 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.718384 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.745410 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.793089 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.878915 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.965751 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.985539 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 18 06:50:59 crc kubenswrapper[4917]: I0318 06:50:59.995454 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.019814 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.040143 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.058812 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.086994 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.171865 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.212906 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.319540 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.353931 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.369100 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.380515 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.517427 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 
06:51:00.595009 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.603695 4917 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.636711 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.661077 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.747470 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.781877 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.789954 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.874820 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.967011 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.995046 4917 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 18 06:51:00 crc kubenswrapper[4917]: I0318 06:51:00.995336 4917 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://615b876d547faf8a09a2cf381648695c9212b4c0b6f1ef0dfdb88339622c2862" gracePeriod=5 Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.075216 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.077331 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.110068 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.134552 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.175431 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.221068 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.223412 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.323756 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.459761 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 
06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.512932 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.592800 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.632297 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.672062 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.699992 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.759460 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.783708 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.805248 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.833801 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.888362 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.988215 4917 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 18 06:51:01 crc kubenswrapper[4917]: I0318 06:51:01.999668 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.253292 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.317528 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.356512 4917 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.409920 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.545460 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.560865 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.691677 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.837692 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.838918 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 
06:51:02.902969 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.929379 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.929450 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.974573 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 18 06:51:02 crc kubenswrapper[4917]: I0318 06:51:02.998831 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 18 06:51:03 crc kubenswrapper[4917]: I0318 06:51:03.110083 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 18 06:51:03 crc kubenswrapper[4917]: I0318 06:51:03.387622 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 18 06:51:03 crc kubenswrapper[4917]: I0318 06:51:03.498776 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 18 06:51:03 crc kubenswrapper[4917]: I0318 06:51:03.587801 4917 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Mar 18 06:51:03 crc kubenswrapper[4917]: I0318 06:51:03.649317 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 06:51:03 crc kubenswrapper[4917]: I0318 06:51:03.816561 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 18 06:51:03 crc kubenswrapper[4917]: I0318 06:51:03.974425 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.031860 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.049998 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.188708 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.263025 4917 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.312564 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.338460 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.404894 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.445458 4917 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.601669 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.620716 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.694142 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.777530 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.810981 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.828385 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 18 06:51:04 crc kubenswrapper[4917]: I0318 06:51:04.961459 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 18 06:51:05 crc kubenswrapper[4917]: I0318 06:51:05.560497 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.250990 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.548116 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.548226 4917 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="615b876d547faf8a09a2cf381648695c9212b4c0b6f1ef0dfdb88339622c2862" exitCode=137 Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.609107 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.609209 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.750998 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.751068 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.751114 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.751177 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.751251 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.751362 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.751363 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.751375 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.751492 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.752083 4917 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.752112 4917 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.752127 4917 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.752141 4917 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.763397 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:51:06 crc kubenswrapper[4917]: I0318 06:51:06.854209 4917 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 18 06:51:07 crc kubenswrapper[4917]: I0318 06:51:07.556039 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 18 06:51:07 crc kubenswrapper[4917]: I0318 06:51:07.556451 4917 scope.go:117] "RemoveContainer" containerID="615b876d547faf8a09a2cf381648695c9212b4c0b6f1ef0dfdb88339622c2862" Mar 18 06:51:07 crc kubenswrapper[4917]: I0318 06:51:07.556663 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 18 06:51:07 crc kubenswrapper[4917]: I0318 06:51:07.784843 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 18 06:51:18 crc kubenswrapper[4917]: I0318 06:51:18.412972 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 18 06:51:23 crc kubenswrapper[4917]: I0318 06:51:23.637511 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 18 06:51:24 crc kubenswrapper[4917]: I0318 06:51:24.114815 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 18 06:51:24 crc kubenswrapper[4917]: I0318 06:51:24.682786 4917 generic.go:334] "Generic (PLEG): container finished" podID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerID="5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb" 
exitCode=0 Mar 18 06:51:24 crc kubenswrapper[4917]: I0318 06:51:24.682887 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" event={"ID":"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26","Type":"ContainerDied","Data":"5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb"} Mar 18 06:51:24 crc kubenswrapper[4917]: I0318 06:51:24.683704 4917 scope.go:117] "RemoveContainer" containerID="5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb" Mar 18 06:51:25 crc kubenswrapper[4917]: I0318 06:51:25.695852 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" event={"ID":"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26","Type":"ContainerStarted","Data":"6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b"} Mar 18 06:51:25 crc kubenswrapper[4917]: I0318 06:51:25.698496 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:51:25 crc kubenswrapper[4917]: I0318 06:51:25.700890 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:51:27 crc kubenswrapper[4917]: I0318 06:51:27.243182 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 18 06:51:28 crc kubenswrapper[4917]: I0318 06:51:28.203885 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 18 06:51:29 crc kubenswrapper[4917]: I0318 06:51:29.414932 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 18 06:51:32 crc kubenswrapper[4917]: I0318 06:51:32.929306 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 06:51:32 crc kubenswrapper[4917]: I0318 06:51:32.930003 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:51:33 crc kubenswrapper[4917]: I0318 06:51:33.447885 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 18 06:51:38 crc kubenswrapper[4917]: I0318 06:51:38.355145 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.189604 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563612-whsn7"] Mar 18 06:52:00 crc kubenswrapper[4917]: E0318 06:52:00.190297 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.190311 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 06:52:00 crc kubenswrapper[4917]: E0318 06:52:00.190327 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f35e52c9-1886-464a-99ac-95e9803226db" containerName="installer" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.190337 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35e52c9-1886-464a-99ac-95e9803226db" containerName="installer" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.190440 4917 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="f35e52c9-1886-464a-99ac-95e9803226db" containerName="installer" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.190460 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.190880 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563612-whsn7" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.193975 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.194274 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.194514 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.212081 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563612-whsn7"] Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.224093 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w6bb\" (UniqueName: \"kubernetes.io/projected/9eda6fda-800a-456e-8f1a-13b9b31035d2-kube-api-access-2w6bb\") pod \"auto-csr-approver-29563612-whsn7\" (UID: \"9eda6fda-800a-456e-8f1a-13b9b31035d2\") " pod="openshift-infra/auto-csr-approver-29563612-whsn7" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.325548 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w6bb\" (UniqueName: \"kubernetes.io/projected/9eda6fda-800a-456e-8f1a-13b9b31035d2-kube-api-access-2w6bb\") pod \"auto-csr-approver-29563612-whsn7\" (UID: \"9eda6fda-800a-456e-8f1a-13b9b31035d2\") " 
pod="openshift-infra/auto-csr-approver-29563612-whsn7" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.346633 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w6bb\" (UniqueName: \"kubernetes.io/projected/9eda6fda-800a-456e-8f1a-13b9b31035d2-kube-api-access-2w6bb\") pod \"auto-csr-approver-29563612-whsn7\" (UID: \"9eda6fda-800a-456e-8f1a-13b9b31035d2\") " pod="openshift-infra/auto-csr-approver-29563612-whsn7" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.525201 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563612-whsn7" Mar 18 06:52:00 crc kubenswrapper[4917]: I0318 06:52:00.969763 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563612-whsn7"] Mar 18 06:52:01 crc kubenswrapper[4917]: I0318 06:52:01.981690 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563612-whsn7" event={"ID":"9eda6fda-800a-456e-8f1a-13b9b31035d2","Type":"ContainerStarted","Data":"603f80ae1ff9e2c720d0d5629d75a8a54ec206bfb970d54e6715d68ed7abc108"} Mar 18 06:52:02 crc kubenswrapper[4917]: I0318 06:52:02.928756 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 06:52:02 crc kubenswrapper[4917]: I0318 06:52:02.929411 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:52:02 crc kubenswrapper[4917]: I0318 06:52:02.929490 4917 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:52:02 crc kubenswrapper[4917]: I0318 06:52:02.931066 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bc780e4080b196acc164a6127722e66896b9364ad9dbfff119e4b793a72e986"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 06:52:02 crc kubenswrapper[4917]: I0318 06:52:02.931205 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://8bc780e4080b196acc164a6127722e66896b9364ad9dbfff119e4b793a72e986" gracePeriod=600 Mar 18 06:52:02 crc kubenswrapper[4917]: I0318 06:52:02.991919 4917 generic.go:334] "Generic (PLEG): container finished" podID="9eda6fda-800a-456e-8f1a-13b9b31035d2" containerID="275ac8a6c189a51c65dcd0c73f076e2c80b9530fdbecedc735ee1bf0ef3949e2" exitCode=0 Mar 18 06:52:02 crc kubenswrapper[4917]: I0318 06:52:02.992217 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563612-whsn7" event={"ID":"9eda6fda-800a-456e-8f1a-13b9b31035d2","Type":"ContainerDied","Data":"275ac8a6c189a51c65dcd0c73f076e2c80b9530fdbecedc735ee1bf0ef3949e2"} Mar 18 06:52:04 crc kubenswrapper[4917]: I0318 06:52:04.002423 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="8bc780e4080b196acc164a6127722e66896b9364ad9dbfff119e4b793a72e986" exitCode=0 Mar 18 06:52:04 crc kubenswrapper[4917]: I0318 06:52:04.002512 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"8bc780e4080b196acc164a6127722e66896b9364ad9dbfff119e4b793a72e986"} Mar 18 06:52:04 crc kubenswrapper[4917]: I0318 06:52:04.002871 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"55ec8b0401f2edcf38b2bcbd2dfc70f1d98070145f47acdaaf58224f01cca565"} Mar 18 06:52:04 crc kubenswrapper[4917]: I0318 06:52:04.348751 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563612-whsn7" Mar 18 06:52:04 crc kubenswrapper[4917]: I0318 06:52:04.394111 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w6bb\" (UniqueName: \"kubernetes.io/projected/9eda6fda-800a-456e-8f1a-13b9b31035d2-kube-api-access-2w6bb\") pod \"9eda6fda-800a-456e-8f1a-13b9b31035d2\" (UID: \"9eda6fda-800a-456e-8f1a-13b9b31035d2\") " Mar 18 06:52:04 crc kubenswrapper[4917]: I0318 06:52:04.403340 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eda6fda-800a-456e-8f1a-13b9b31035d2-kube-api-access-2w6bb" (OuterVolumeSpecName: "kube-api-access-2w6bb") pod "9eda6fda-800a-456e-8f1a-13b9b31035d2" (UID: "9eda6fda-800a-456e-8f1a-13b9b31035d2"). InnerVolumeSpecName "kube-api-access-2w6bb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:52:04 crc kubenswrapper[4917]: I0318 06:52:04.495637 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w6bb\" (UniqueName: \"kubernetes.io/projected/9eda6fda-800a-456e-8f1a-13b9b31035d2-kube-api-access-2w6bb\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:05 crc kubenswrapper[4917]: I0318 06:52:05.013218 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563612-whsn7" event={"ID":"9eda6fda-800a-456e-8f1a-13b9b31035d2","Type":"ContainerDied","Data":"603f80ae1ff9e2c720d0d5629d75a8a54ec206bfb970d54e6715d68ed7abc108"} Mar 18 06:52:05 crc kubenswrapper[4917]: I0318 06:52:05.013290 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="603f80ae1ff9e2c720d0d5629d75a8a54ec206bfb970d54e6715d68ed7abc108" Mar 18 06:52:05 crc kubenswrapper[4917]: I0318 06:52:05.013386 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563612-whsn7" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.041312 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-brr8d"] Mar 18 06:52:32 crc kubenswrapper[4917]: E0318 06:52:32.042386 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eda6fda-800a-456e-8f1a-13b9b31035d2" containerName="oc" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.042416 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eda6fda-800a-456e-8f1a-13b9b31035d2" containerName="oc" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.042755 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eda6fda-800a-456e-8f1a-13b9b31035d2" containerName="oc" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.043407 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.057987 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-brr8d"] Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.064925 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10d93d75-3468-4948-81e3-80182ce24715-ca-trust-extracted\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.064971 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10d93d75-3468-4948-81e3-80182ce24715-trusted-ca\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.065028 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.065091 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-registry-tls\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.065122 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10d93d75-3468-4948-81e3-80182ce24715-registry-certificates\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.065147 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-bound-sa-token\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.065176 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10d93d75-3468-4948-81e3-80182ce24715-installation-pull-secrets\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.065208 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc8h9\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-kube-api-access-gc8h9\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.114779 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.166238 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-registry-tls\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.166302 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10d93d75-3468-4948-81e3-80182ce24715-registry-certificates\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.166324 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-bound-sa-token\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.166346 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10d93d75-3468-4948-81e3-80182ce24715-installation-pull-secrets\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc 
kubenswrapper[4917]: I0318 06:52:32.166374 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc8h9\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-kube-api-access-gc8h9\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.166420 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10d93d75-3468-4948-81e3-80182ce24715-ca-trust-extracted\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.166440 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10d93d75-3468-4948-81e3-80182ce24715-trusted-ca\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.167271 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/10d93d75-3468-4948-81e3-80182ce24715-ca-trust-extracted\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.168268 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/10d93d75-3468-4948-81e3-80182ce24715-trusted-ca\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.168770 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/10d93d75-3468-4948-81e3-80182ce24715-registry-certificates\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.171615 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-registry-tls\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.175306 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/10d93d75-3468-4948-81e3-80182ce24715-installation-pull-secrets\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.185368 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc8h9\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-kube-api-access-gc8h9\") pod \"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.188363 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/10d93d75-3468-4948-81e3-80182ce24715-bound-sa-token\") pod 
\"image-registry-66df7c8f76-brr8d\" (UID: \"10d93d75-3468-4948-81e3-80182ce24715\") " pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.375321 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:32 crc kubenswrapper[4917]: I0318 06:52:32.700550 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-brr8d"] Mar 18 06:52:32 crc kubenswrapper[4917]: W0318 06:52:32.714039 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10d93d75_3468_4948_81e3_80182ce24715.slice/crio-b380276487f1bf3ad326b9047592931d09d3a50ce4e1a93f71fa1b7801bdc742 WatchSource:0}: Error finding container b380276487f1bf3ad326b9047592931d09d3a50ce4e1a93f71fa1b7801bdc742: Status 404 returned error can't find the container with id b380276487f1bf3ad326b9047592931d09d3a50ce4e1a93f71fa1b7801bdc742 Mar 18 06:52:33 crc kubenswrapper[4917]: I0318 06:52:33.198579 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" event={"ID":"10d93d75-3468-4948-81e3-80182ce24715","Type":"ContainerStarted","Data":"73ed133443fac37ce249456abb36a1bd2afd64aa97940f0ccc3fb78e5b02b4ca"} Mar 18 06:52:33 crc kubenswrapper[4917]: I0318 06:52:33.199112 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" event={"ID":"10d93d75-3468-4948-81e3-80182ce24715","Type":"ContainerStarted","Data":"b380276487f1bf3ad326b9047592931d09d3a50ce4e1a93f71fa1b7801bdc742"} Mar 18 06:52:33 crc kubenswrapper[4917]: I0318 06:52:33.199208 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:33 crc kubenswrapper[4917]: I0318 06:52:33.230224 
4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" podStartSLOduration=1.230202432 podStartE2EDuration="1.230202432s" podCreationTimestamp="2026-03-18 06:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:52:33.225490859 +0000 UTC m=+338.166645603" watchObservedRunningTime="2026-03-18 06:52:33.230202432 +0000 UTC m=+338.171357156" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.485510 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z9rcc"] Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.488817 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z9rcc" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="registry-server" containerID="cri-o://7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd" gracePeriod=30 Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.502455 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6mhrt"] Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.502708 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6mhrt" podUID="52df4f75-850c-4266-b94d-909e90669389" containerName="registry-server" containerID="cri-o://dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c" gracePeriod=30 Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.519945 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8v8rg"] Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.520300 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" 
podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" containerID="cri-o://6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b" gracePeriod=30 Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.537430 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmg2f"] Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.537842 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mmg2f" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="registry-server" containerID="cri-o://ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9" gracePeriod=30 Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.544943 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlftl"] Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.545241 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jlftl" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="registry-server" containerID="cri-o://80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5" gracePeriod=30 Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.548840 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rflln"] Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.549642 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.557468 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rflln"] Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.610821 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qgnb\" (UniqueName: \"kubernetes.io/projected/eddc4b08-6464-49cf-9b06-a482cbcbef5d-kube-api-access-4qgnb\") pod \"marketplace-operator-79b997595-rflln\" (UID: \"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.611351 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eddc4b08-6464-49cf-9b06-a482cbcbef5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rflln\" (UID: \"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.611469 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eddc4b08-6464-49cf-9b06-a482cbcbef5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rflln\" (UID: \"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.713054 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eddc4b08-6464-49cf-9b06-a482cbcbef5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rflln\" (UID: 
\"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.713371 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eddc4b08-6464-49cf-9b06-a482cbcbef5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rflln\" (UID: \"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.713424 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qgnb\" (UniqueName: \"kubernetes.io/projected/eddc4b08-6464-49cf-9b06-a482cbcbef5d-kube-api-access-4qgnb\") pod \"marketplace-operator-79b997595-rflln\" (UID: \"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.715399 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eddc4b08-6464-49cf-9b06-a482cbcbef5d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rflln\" (UID: \"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.730945 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qgnb\" (UniqueName: \"kubernetes.io/projected/eddc4b08-6464-49cf-9b06-a482cbcbef5d-kube-api-access-4qgnb\") pod \"marketplace-operator-79b997595-rflln\" (UID: \"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.731206 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" 
(UniqueName: \"kubernetes.io/secret/eddc4b08-6464-49cf-9b06-a482cbcbef5d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rflln\" (UID: \"eddc4b08-6464-49cf-9b06-a482cbcbef5d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.928418 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.933603 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.950546 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:52:40 crc kubenswrapper[4917]: I0318 06:52:40.957903 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.017856 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-operator-metrics\") pod \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.017914 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfdwm\" (UniqueName: \"kubernetes.io/projected/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-kube-api-access-bfdwm\") pod \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.017964 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-utilities\") pod \"52df4f75-850c-4266-b94d-909e90669389\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.017988 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-catalog-content\") pod \"52df4f75-850c-4266-b94d-909e90669389\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.018008 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x6m4\" (UniqueName: \"kubernetes.io/projected/52df4f75-850c-4266-b94d-909e90669389-kube-api-access-5x6m4\") pod \"52df4f75-850c-4266-b94d-909e90669389\" (UID: \"52df4f75-850c-4266-b94d-909e90669389\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.018053 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-trusted-ca\") pod \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\" (UID: \"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.018082 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-utilities\") pod \"552a2644-267a-4c40-8ee5-f91bd933e7f2\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.018109 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-catalog-content\") pod \"552a2644-267a-4c40-8ee5-f91bd933e7f2\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " Mar 18 06:52:41 crc 
kubenswrapper[4917]: I0318 06:52:41.018147 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcvls\" (UniqueName: \"kubernetes.io/projected/552a2644-267a-4c40-8ee5-f91bd933e7f2-kube-api-access-hcvls\") pod \"552a2644-267a-4c40-8ee5-f91bd933e7f2\" (UID: \"552a2644-267a-4c40-8ee5-f91bd933e7f2\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.019454 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-utilities" (OuterVolumeSpecName: "utilities") pod "52df4f75-850c-4266-b94d-909e90669389" (UID: "52df4f75-850c-4266-b94d-909e90669389"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.020434 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-utilities" (OuterVolumeSpecName: "utilities") pod "552a2644-267a-4c40-8ee5-f91bd933e7f2" (UID: "552a2644-267a-4c40-8ee5-f91bd933e7f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.020565 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" (UID: "3b578c2d-441d-4c19-9c2a-e42bd5b7bd26"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.027237 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52df4f75-850c-4266-b94d-909e90669389-kube-api-access-5x6m4" (OuterVolumeSpecName: "kube-api-access-5x6m4") pod "52df4f75-850c-4266-b94d-909e90669389" (UID: "52df4f75-850c-4266-b94d-909e90669389"). InnerVolumeSpecName "kube-api-access-5x6m4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.027278 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" (UID: "3b578c2d-441d-4c19-9c2a-e42bd5b7bd26"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.027300 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-kube-api-access-bfdwm" (OuterVolumeSpecName: "kube-api-access-bfdwm") pod "3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" (UID: "3b578c2d-441d-4c19-9c2a-e42bd5b7bd26"). InnerVolumeSpecName "kube-api-access-bfdwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.030166 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552a2644-267a-4c40-8ee5-f91bd933e7f2-kube-api-access-hcvls" (OuterVolumeSpecName: "kube-api-access-hcvls") pod "552a2644-267a-4c40-8ee5-f91bd933e7f2" (UID: "552a2644-267a-4c40-8ee5-f91bd933e7f2"). InnerVolumeSpecName "kube-api-access-hcvls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.034490 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.037748 4917 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.037778 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfdwm\" (UniqueName: \"kubernetes.io/projected/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-kube-api-access-bfdwm\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.037791 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.037803 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x6m4\" (UniqueName: \"kubernetes.io/projected/52df4f75-850c-4266-b94d-909e90669389-kube-api-access-5x6m4\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.037816 4917 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.037828 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.037841 4917 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcvls\" (UniqueName: \"kubernetes.io/projected/552a2644-267a-4c40-8ee5-f91bd933e7f2-kube-api-access-hcvls\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.044978 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.106811 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "552a2644-267a-4c40-8ee5-f91bd933e7f2" (UID: "552a2644-267a-4c40-8ee5-f91bd933e7f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.115571 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52df4f75-850c-4266-b94d-909e90669389" (UID: "52df4f75-850c-4266-b94d-909e90669389"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.139146 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-catalog-content\") pod \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.139184 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-utilities\") pod \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.139215 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjkr5\" (UniqueName: \"kubernetes.io/projected/edae9f71-365b-48dd-91fa-4ad4d56dcc62-kube-api-access-fjkr5\") pod \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.139235 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-catalog-content\") pod \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.139258 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xvhm\" (UniqueName: \"kubernetes.io/projected/528debb6-ed0b-4099-a21d-81d1da5ba9f6-kube-api-access-2xvhm\") pod \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\" (UID: \"528debb6-ed0b-4099-a21d-81d1da5ba9f6\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.139295 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-utilities\") pod \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\" (UID: \"edae9f71-365b-48dd-91fa-4ad4d56dcc62\") " Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.139534 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552a2644-267a-4c40-8ee5-f91bd933e7f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.139547 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52df4f75-850c-4266-b94d-909e90669389-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.140203 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-utilities" (OuterVolumeSpecName: "utilities") pod "528debb6-ed0b-4099-a21d-81d1da5ba9f6" (UID: "528debb6-ed0b-4099-a21d-81d1da5ba9f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.140248 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-utilities" (OuterVolumeSpecName: "utilities") pod "edae9f71-365b-48dd-91fa-4ad4d56dcc62" (UID: "edae9f71-365b-48dd-91fa-4ad4d56dcc62"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.142990 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edae9f71-365b-48dd-91fa-4ad4d56dcc62-kube-api-access-fjkr5" (OuterVolumeSpecName: "kube-api-access-fjkr5") pod "edae9f71-365b-48dd-91fa-4ad4d56dcc62" (UID: "edae9f71-365b-48dd-91fa-4ad4d56dcc62"). InnerVolumeSpecName "kube-api-access-fjkr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.143069 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/528debb6-ed0b-4099-a21d-81d1da5ba9f6-kube-api-access-2xvhm" (OuterVolumeSpecName: "kube-api-access-2xvhm") pod "528debb6-ed0b-4099-a21d-81d1da5ba9f6" (UID: "528debb6-ed0b-4099-a21d-81d1da5ba9f6"). InnerVolumeSpecName "kube-api-access-2xvhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.165308 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "528debb6-ed0b-4099-a21d-81d1da5ba9f6" (UID: "528debb6-ed0b-4099-a21d-81d1da5ba9f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.240926 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.240960 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.240971 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjkr5\" (UniqueName: \"kubernetes.io/projected/edae9f71-365b-48dd-91fa-4ad4d56dcc62-kube-api-access-fjkr5\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.240981 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/528debb6-ed0b-4099-a21d-81d1da5ba9f6-catalog-content\") 
on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.240989 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xvhm\" (UniqueName: \"kubernetes.io/projected/528debb6-ed0b-4099-a21d-81d1da5ba9f6-kube-api-access-2xvhm\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.254031 4917 generic.go:334] "Generic (PLEG): container finished" podID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerID="6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b" exitCode=0 Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.254079 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.254133 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" event={"ID":"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26","Type":"ContainerDied","Data":"6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.254183 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-8v8rg" event={"ID":"3b578c2d-441d-4c19-9c2a-e42bd5b7bd26","Type":"ContainerDied","Data":"2cb5a86b18e339bb747f77f1ea22db0d407ca548dc3d90d43a6d4b27b8e36189"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.254202 4917 scope.go:117] "RemoveContainer" containerID="6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.256049 4917 generic.go:334] "Generic (PLEG): container finished" podID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerID="ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9" exitCode=0 Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.256087 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-mmg2f" event={"ID":"528debb6-ed0b-4099-a21d-81d1da5ba9f6","Type":"ContainerDied","Data":"ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.256122 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mmg2f" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.256140 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mmg2f" event={"ID":"528debb6-ed0b-4099-a21d-81d1da5ba9f6","Type":"ContainerDied","Data":"c7903086137eafa2ddd8a8596a701bcf8cc924389310159ccdfdfd564d580227"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.258799 4917 generic.go:334] "Generic (PLEG): container finished" podID="52df4f75-850c-4266-b94d-909e90669389" containerID="dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c" exitCode=0 Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.258857 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6mhrt" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.258884 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mhrt" event={"ID":"52df4f75-850c-4266-b94d-909e90669389","Type":"ContainerDied","Data":"dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.258906 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6mhrt" event={"ID":"52df4f75-850c-4266-b94d-909e90669389","Type":"ContainerDied","Data":"d6e3a08a85af864915b0040a1d02ef50f44f2e15cf79d08a19597f3736af6186"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.265701 4917 generic.go:334] "Generic (PLEG): container finished" podID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerID="7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd" exitCode=0 Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.265745 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9rcc" event={"ID":"552a2644-267a-4c40-8ee5-f91bd933e7f2","Type":"ContainerDied","Data":"7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.265778 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z9rcc" event={"ID":"552a2644-267a-4c40-8ee5-f91bd933e7f2","Type":"ContainerDied","Data":"a28a0d25ad783182c4169cf0bb0a9935b4f68c637bf5bcb794238b3f93320e1d"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.265826 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z9rcc" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.273198 4917 generic.go:334] "Generic (PLEG): container finished" podID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerID="80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5" exitCode=0 Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.273228 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jlftl" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.273239 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlftl" event={"ID":"edae9f71-365b-48dd-91fa-4ad4d56dcc62","Type":"ContainerDied","Data":"80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.273243 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edae9f71-365b-48dd-91fa-4ad4d56dcc62" (UID: "edae9f71-365b-48dd-91fa-4ad4d56dcc62"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.273265 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jlftl" event={"ID":"edae9f71-365b-48dd-91fa-4ad4d56dcc62","Type":"ContainerDied","Data":"7f2a27a1e9dd39ace6ba7ef26f0f7a1025c1aa8a66318999e07e4791e6b86d7b"} Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.291301 4917 scope.go:117] "RemoveContainer" containerID="5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.302086 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8v8rg"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.307669 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-8v8rg"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.310260 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmg2f"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.313791 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mmg2f"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.319731 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z9rcc"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.324150 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z9rcc"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.326555 4917 scope.go:117] "RemoveContainer" containerID="6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.327161 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b\": container with ID starting with 6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b not found: ID does not exist" containerID="6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.327205 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b"} err="failed to get container status \"6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b\": rpc error: code = NotFound desc = could not find container \"6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b\": container with ID starting with 6c07d5ad06d989ab115d60e1527a8b7e1134f7f09444e6418201f6fe2fa5115b not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.327238 4917 scope.go:117] "RemoveContainer" containerID="5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.330801 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb\": container with ID starting with 5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb not found: ID does not exist" containerID="5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.330846 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb"} err="failed to get container status \"5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb\": rpc error: code = NotFound desc = could not find container \"5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb\": container with ID 
starting with 5e6741012ee2663299c7f497eb4d014fa52fba37d071204b971ca2a81e3cb6cb not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.330874 4917 scope.go:117] "RemoveContainer" containerID="ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.331538 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6mhrt"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.334389 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6mhrt"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.342841 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edae9f71-365b-48dd-91fa-4ad4d56dcc62-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.344094 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jlftl"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.346754 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jlftl"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.349543 4917 scope.go:117] "RemoveContainer" containerID="b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.393941 4917 scope.go:117] "RemoveContainer" containerID="06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.409660 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rflln"] Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.412910 4917 scope.go:117] "RemoveContainer" containerID="ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9" Mar 18 06:52:41 crc 
kubenswrapper[4917]: E0318 06:52:41.413410 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9\": container with ID starting with ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9 not found: ID does not exist" containerID="ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.413444 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9"} err="failed to get container status \"ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9\": rpc error: code = NotFound desc = could not find container \"ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9\": container with ID starting with ae854558dfc6aacaeae2c381cd8ab85515c41222568fa5b4a69e3914416770f9 not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.413468 4917 scope.go:117] "RemoveContainer" containerID="b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.413816 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b\": container with ID starting with b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b not found: ID does not exist" containerID="b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.413835 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b"} err="failed to get container status 
\"b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b\": rpc error: code = NotFound desc = could not find container \"b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b\": container with ID starting with b2e52c4c1717eaa73754029c3f8dc635e7e03f57d0be058579048cb2816ea80b not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.413848 4917 scope.go:117] "RemoveContainer" containerID="06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.414157 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0\": container with ID starting with 06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0 not found: ID does not exist" containerID="06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.414211 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0"} err="failed to get container status \"06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0\": rpc error: code = NotFound desc = could not find container \"06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0\": container with ID starting with 06f9f24683df7b87f595f18b606a93f119566bb80db57b08c3e1e899a9e8a8e0 not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.414249 4917 scope.go:117] "RemoveContainer" containerID="dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c" Mar 18 06:52:41 crc kubenswrapper[4917]: W0318 06:52:41.418239 4917 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeddc4b08_6464_49cf_9b06_a482cbcbef5d.slice/crio-db5e9d73c7b1b0f0542a1c1915cb36e3154053d61a788442b38c3c3125931015 WatchSource:0}: Error finding container db5e9d73c7b1b0f0542a1c1915cb36e3154053d61a788442b38c3c3125931015: Status 404 returned error can't find the container with id db5e9d73c7b1b0f0542a1c1915cb36e3154053d61a788442b38c3c3125931015 Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.427210 4917 scope.go:117] "RemoveContainer" containerID="da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.444810 4917 scope.go:117] "RemoveContainer" containerID="06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.455972 4917 scope.go:117] "RemoveContainer" containerID="dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.456365 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c\": container with ID starting with dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c not found: ID does not exist" containerID="dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.456423 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c"} err="failed to get container status \"dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c\": rpc error: code = NotFound desc = could not find container \"dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c\": container with ID starting with dd107dc26633188439d1b26429de4575ccc3387888047d6cf8c60372e96d0b0c not found: ID does not 
exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.456464 4917 scope.go:117] "RemoveContainer" containerID="da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.456814 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa\": container with ID starting with da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa not found: ID does not exist" containerID="da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.456857 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa"} err="failed to get container status \"da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa\": rpc error: code = NotFound desc = could not find container \"da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa\": container with ID starting with da559e9bcde6ae2f0876418a23a1b4986cd258a5014fc92996452f4b5abcb8fa not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.456889 4917 scope.go:117] "RemoveContainer" containerID="06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.457152 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354\": container with ID starting with 06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354 not found: ID does not exist" containerID="06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.457172 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354"} err="failed to get container status \"06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354\": rpc error: code = NotFound desc = could not find container \"06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354\": container with ID starting with 06d472af40db7b6c589b28c5c20f8362d2222ac35504ad5a8dde9b33c8d3d354 not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.457185 4917 scope.go:117] "RemoveContainer" containerID="7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.483349 4917 scope.go:117] "RemoveContainer" containerID="903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.496212 4917 scope.go:117] "RemoveContainer" containerID="187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.511254 4917 scope.go:117] "RemoveContainer" containerID="7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.511956 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd\": container with ID starting with 7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd not found: ID does not exist" containerID="7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.511996 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd"} err="failed to get container status 
\"7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd\": rpc error: code = NotFound desc = could not find container \"7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd\": container with ID starting with 7e805fbdbcaad6ea063b772a4b4080ae9fa3f5f679104a91a9a9e5cb474753fd not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.512023 4917 scope.go:117] "RemoveContainer" containerID="903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.512423 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4\": container with ID starting with 903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4 not found: ID does not exist" containerID="903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.512454 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4"} err="failed to get container status \"903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4\": rpc error: code = NotFound desc = could not find container \"903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4\": container with ID starting with 903aadf30f13ffab5f88e3041011600ff80a3d4e6f8076152414daeb5a00f4b4 not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.512474 4917 scope.go:117] "RemoveContainer" containerID="187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.512957 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a\": container with ID starting with 187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a not found: ID does not exist" containerID="187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.512979 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a"} err="failed to get container status \"187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a\": rpc error: code = NotFound desc = could not find container \"187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a\": container with ID starting with 187357079d5b3b6b124e783d102439a04b65810b20b093d1c4eed1bcd4d1dd8a not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.512991 4917 scope.go:117] "RemoveContainer" containerID="80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.529487 4917 scope.go:117] "RemoveContainer" containerID="ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.543184 4917 scope.go:117] "RemoveContainer" containerID="4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.563460 4917 scope.go:117] "RemoveContainer" containerID="80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.563988 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5\": container with ID starting with 80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5 not found: ID does not exist" 
containerID="80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.564014 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5"} err="failed to get container status \"80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5\": rpc error: code = NotFound desc = could not find container \"80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5\": container with ID starting with 80abe174a6508710085ca38121d4f229ed22852538c7182ca962bcc79bbf06a5 not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.564034 4917 scope.go:117] "RemoveContainer" containerID="ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.564422 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f\": container with ID starting with ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f not found: ID does not exist" containerID="ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.564436 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f"} err="failed to get container status \"ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f\": rpc error: code = NotFound desc = could not find container \"ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f\": container with ID starting with ea120a407782dd0c31db6e351352c9b0757d719a31ba48e1cdef32cc4b7e789f not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.564447 4917 scope.go:117] 
"RemoveContainer" containerID="4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5" Mar 18 06:52:41 crc kubenswrapper[4917]: E0318 06:52:41.564736 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5\": container with ID starting with 4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5 not found: ID does not exist" containerID="4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.564753 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5"} err="failed to get container status \"4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5\": rpc error: code = NotFound desc = could not find container \"4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5\": container with ID starting with 4dcc266f3f0ef2821a83b3734a78edefcc94eaa3a07ffb2c06832ba491c4eec5 not found: ID does not exist" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.778619 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" path="/var/lib/kubelet/pods/3b578c2d-441d-4c19-9c2a-e42bd5b7bd26/volumes" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.779319 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" path="/var/lib/kubelet/pods/528debb6-ed0b-4099-a21d-81d1da5ba9f6/volumes" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.779873 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52df4f75-850c-4266-b94d-909e90669389" path="/var/lib/kubelet/pods/52df4f75-850c-4266-b94d-909e90669389/volumes" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.780400 4917 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" path="/var/lib/kubelet/pods/552a2644-267a-4c40-8ee5-f91bd933e7f2/volumes" Mar 18 06:52:41 crc kubenswrapper[4917]: I0318 06:52:41.780960 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" path="/var/lib/kubelet/pods/edae9f71-365b-48dd-91fa-4ad4d56dcc62/volumes" Mar 18 06:52:42 crc kubenswrapper[4917]: I0318 06:52:42.284474 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rflln" event={"ID":"eddc4b08-6464-49cf-9b06-a482cbcbef5d","Type":"ContainerStarted","Data":"2beac2c1c557c1994d7e200dc6ffb3f5d3829302a7a6a968a03fd8c642c796ae"} Mar 18 06:52:42 crc kubenswrapper[4917]: I0318 06:52:42.284550 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rflln" event={"ID":"eddc4b08-6464-49cf-9b06-a482cbcbef5d","Type":"ContainerStarted","Data":"db5e9d73c7b1b0f0542a1c1915cb36e3154053d61a788442b38c3c3125931015"} Mar 18 06:52:42 crc kubenswrapper[4917]: I0318 06:52:42.284624 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:42 crc kubenswrapper[4917]: I0318 06:52:42.296106 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rflln" Mar 18 06:52:42 crc kubenswrapper[4917]: I0318 06:52:42.312793 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rflln" podStartSLOduration=2.312775557 podStartE2EDuration="2.312775557s" podCreationTimestamp="2026-03-18 06:52:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:52:42.309895807 +0000 UTC m=+347.251050571" 
watchObservedRunningTime="2026-03-18 06:52:42.312775557 +0000 UTC m=+347.253930291" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.723611 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6p978"] Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724254 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724293 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724309 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724317 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724328 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724337 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724345 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="extract-content" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724373 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="extract-content" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724385 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="52df4f75-850c-4266-b94d-909e90669389" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724393 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df4f75-850c-4266-b94d-909e90669389" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724405 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52df4f75-850c-4266-b94d-909e90669389" containerName="extract-utilities" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724467 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df4f75-850c-4266-b94d-909e90669389" containerName="extract-utilities" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724481 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="extract-content" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724489 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="extract-content" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724498 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724505 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724515 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="extract-utilities" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724522 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="extract-utilities" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724551 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="52df4f75-850c-4266-b94d-909e90669389" containerName="extract-content" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724560 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52df4f75-850c-4266-b94d-909e90669389" containerName="extract-content" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724573 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="extract-utilities" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724609 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="extract-utilities" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724625 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="extract-content" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724633 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="extract-content" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724641 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="extract-utilities" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724649 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="extract-utilities" Mar 18 06:52:47 crc kubenswrapper[4917]: E0318 06:52:47.724659 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724691 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724854 4917 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="552a2644-267a-4c40-8ee5-f91bd933e7f2" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724867 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="edae9f71-365b-48dd-91fa-4ad4d56dcc62" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724879 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724890 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="52df4f75-850c-4266-b94d-909e90669389" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724900 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b578c2d-441d-4c19-9c2a-e42bd5b7bd26" containerName="marketplace-operator" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.724936 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="528debb6-ed0b-4099-a21d-81d1da5ba9f6" containerName="registry-server" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.726151 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.737206 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.739763 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6p978"] Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.862551 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcba268d-27fe-4d66-863a-b94762a886b4-utilities\") pod \"redhat-operators-6p978\" (UID: \"dcba268d-27fe-4d66-863a-b94762a886b4\") " pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.862724 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrcv5\" (UniqueName: \"kubernetes.io/projected/dcba268d-27fe-4d66-863a-b94762a886b4-kube-api-access-rrcv5\") pod \"redhat-operators-6p978\" (UID: \"dcba268d-27fe-4d66-863a-b94762a886b4\") " pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.862964 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcba268d-27fe-4d66-863a-b94762a886b4-catalog-content\") pod \"redhat-operators-6p978\" (UID: \"dcba268d-27fe-4d66-863a-b94762a886b4\") " pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.964650 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcba268d-27fe-4d66-863a-b94762a886b4-catalog-content\") pod \"redhat-operators-6p978\" (UID: 
\"dcba268d-27fe-4d66-863a-b94762a886b4\") " pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.964776 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcba268d-27fe-4d66-863a-b94762a886b4-utilities\") pod \"redhat-operators-6p978\" (UID: \"dcba268d-27fe-4d66-863a-b94762a886b4\") " pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.964837 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrcv5\" (UniqueName: \"kubernetes.io/projected/dcba268d-27fe-4d66-863a-b94762a886b4-kube-api-access-rrcv5\") pod \"redhat-operators-6p978\" (UID: \"dcba268d-27fe-4d66-863a-b94762a886b4\") " pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.965523 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dcba268d-27fe-4d66-863a-b94762a886b4-catalog-content\") pod \"redhat-operators-6p978\" (UID: \"dcba268d-27fe-4d66-863a-b94762a886b4\") " pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.969022 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dcba268d-27fe-4d66-863a-b94762a886b4-utilities\") pod \"redhat-operators-6p978\" (UID: \"dcba268d-27fe-4d66-863a-b94762a886b4\") " pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:47 crc kubenswrapper[4917]: I0318 06:52:47.987855 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrcv5\" (UniqueName: \"kubernetes.io/projected/dcba268d-27fe-4d66-863a-b94762a886b4-kube-api-access-rrcv5\") pod \"redhat-operators-6p978\" (UID: \"dcba268d-27fe-4d66-863a-b94762a886b4\") " 
pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.055864 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.337104 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rqzh7"] Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.338023 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.341614 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.371453 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqzh7"] Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.472355 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-catalog-content\") pod \"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.472430 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx84h\" (UniqueName: \"kubernetes.io/projected/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-kube-api-access-jx84h\") pod \"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.472460 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-utilities\") pod \"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.517569 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6p978"] Mar 18 06:52:48 crc kubenswrapper[4917]: W0318 06:52:48.545020 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcba268d_27fe_4d66_863a_b94762a886b4.slice/crio-10307e7ff0e7c8fe73434a9720640304533ada3b53d0b98b4d427acff2a9e155 WatchSource:0}: Error finding container 10307e7ff0e7c8fe73434a9720640304533ada3b53d0b98b4d427acff2a9e155: Status 404 returned error can't find the container with id 10307e7ff0e7c8fe73434a9720640304533ada3b53d0b98b4d427acff2a9e155 Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.574729 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-utilities\") pod \"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.574832 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-catalog-content\") pod \"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.575039 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx84h\" (UniqueName: \"kubernetes.io/projected/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-kube-api-access-jx84h\") pod 
\"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.575706 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-utilities\") pod \"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.575759 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-catalog-content\") pod \"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.608684 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx84h\" (UniqueName: \"kubernetes.io/projected/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-kube-api-access-jx84h\") pod \"certified-operators-rqzh7\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:48 crc kubenswrapper[4917]: I0318 06:52:48.675725 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:49 crc kubenswrapper[4917]: I0318 06:52:49.217415 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rqzh7"] Mar 18 06:52:49 crc kubenswrapper[4917]: I0318 06:52:49.350703 4917 generic.go:334] "Generic (PLEG): container finished" podID="dcba268d-27fe-4d66-863a-b94762a886b4" containerID="40eeaac383248024fff9dc04a874175287a252ccb71c8e34440e169b61bc6d03" exitCode=0 Mar 18 06:52:49 crc kubenswrapper[4917]: I0318 06:52:49.350798 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p978" event={"ID":"dcba268d-27fe-4d66-863a-b94762a886b4","Type":"ContainerDied","Data":"40eeaac383248024fff9dc04a874175287a252ccb71c8e34440e169b61bc6d03"} Mar 18 06:52:49 crc kubenswrapper[4917]: I0318 06:52:49.350837 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p978" event={"ID":"dcba268d-27fe-4d66-863a-b94762a886b4","Type":"ContainerStarted","Data":"10307e7ff0e7c8fe73434a9720640304533ada3b53d0b98b4d427acff2a9e155"} Mar 18 06:52:49 crc kubenswrapper[4917]: I0318 06:52:49.354291 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqzh7" event={"ID":"e4bdf3cd-7fe6-4853-a75d-dfb580089f25","Type":"ContainerStarted","Data":"a527075f51924d0d1a0e971a5815ae48041d85ccddcb257d05b4e60fa85cfcf4"} Mar 18 06:52:49 crc kubenswrapper[4917]: E0318 06:52:49.754573 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4bdf3cd_7fe6_4853_a75d_dfb580089f25.slice/crio-conmon-34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0.scope\": RecentStats: unable to find data in memory cache]" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.139176 4917 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-lpwxl"] Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.141528 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.143642 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lpwxl"] Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.145074 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.231931 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-catalog-content\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.232012 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjswp\" (UniqueName: \"kubernetes.io/projected/51220ace-9206-4db4-86c5-752d63a97ae2-kube-api-access-vjswp\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.232088 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-utilities\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.333610 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-catalog-content\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.333663 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjswp\" (UniqueName: \"kubernetes.io/projected/51220ace-9206-4db4-86c5-752d63a97ae2-kube-api-access-vjswp\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.333703 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-utilities\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.334273 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-utilities\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.334684 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-catalog-content\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.362447 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerID="34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0" exitCode=0 Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.362498 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqzh7" event={"ID":"e4bdf3cd-7fe6-4853-a75d-dfb580089f25","Type":"ContainerDied","Data":"34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0"} Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.366274 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjswp\" (UniqueName: \"kubernetes.io/projected/51220ace-9206-4db4-86c5-752d63a97ae2-kube-api-access-vjswp\") pod \"community-operators-lpwxl\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.465261 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.725188 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9vhrw"] Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.726608 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.730128 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.744688 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vhrw"] Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.841338 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e197559-b623-42e3-bf29-dcf0c20a779d-utilities\") pod \"redhat-marketplace-9vhrw\" (UID: \"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.841422 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5pnn\" (UniqueName: \"kubernetes.io/projected/9e197559-b623-42e3-bf29-dcf0c20a779d-kube-api-access-q5pnn\") pod \"redhat-marketplace-9vhrw\" (UID: \"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.841457 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e197559-b623-42e3-bf29-dcf0c20a779d-catalog-content\") pod \"redhat-marketplace-9vhrw\" (UID: \"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.943001 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5pnn\" (UniqueName: \"kubernetes.io/projected/9e197559-b623-42e3-bf29-dcf0c20a779d-kube-api-access-q5pnn\") pod \"redhat-marketplace-9vhrw\" (UID: 
\"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.943055 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e197559-b623-42e3-bf29-dcf0c20a779d-catalog-content\") pod \"redhat-marketplace-9vhrw\" (UID: \"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.943205 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e197559-b623-42e3-bf29-dcf0c20a779d-utilities\") pod \"redhat-marketplace-9vhrw\" (UID: \"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.943798 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e197559-b623-42e3-bf29-dcf0c20a779d-utilities\") pod \"redhat-marketplace-9vhrw\" (UID: \"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.944358 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e197559-b623-42e3-bf29-dcf0c20a779d-catalog-content\") pod \"redhat-marketplace-9vhrw\" (UID: \"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.954487 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lpwxl"] Mar 18 06:52:50 crc kubenswrapper[4917]: W0318 06:52:50.960863 4917 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51220ace_9206_4db4_86c5_752d63a97ae2.slice/crio-126546a9bae03fb26b37b4011caef93b0e96e76e0c8d4a3024b9f582c035f5cc WatchSource:0}: Error finding container 126546a9bae03fb26b37b4011caef93b0e96e76e0c8d4a3024b9f582c035f5cc: Status 404 returned error can't find the container with id 126546a9bae03fb26b37b4011caef93b0e96e76e0c8d4a3024b9f582c035f5cc Mar 18 06:52:50 crc kubenswrapper[4917]: I0318 06:52:50.984854 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5pnn\" (UniqueName: \"kubernetes.io/projected/9e197559-b623-42e3-bf29-dcf0c20a779d-kube-api-access-q5pnn\") pod \"redhat-marketplace-9vhrw\" (UID: \"9e197559-b623-42e3-bf29-dcf0c20a779d\") " pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:51 crc kubenswrapper[4917]: I0318 06:52:51.041177 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:52:51 crc kubenswrapper[4917]: I0318 06:52:51.375812 4917 generic.go:334] "Generic (PLEG): container finished" podID="dcba268d-27fe-4d66-863a-b94762a886b4" containerID="d28b09e6fd9ab4cae4e61615db3213ea0785fe06a78f8aba5b8124c140259a5a" exitCode=0 Mar 18 06:52:51 crc kubenswrapper[4917]: I0318 06:52:51.375964 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p978" event={"ID":"dcba268d-27fe-4d66-863a-b94762a886b4","Type":"ContainerDied","Data":"d28b09e6fd9ab4cae4e61615db3213ea0785fe06a78f8aba5b8124c140259a5a"} Mar 18 06:52:51 crc kubenswrapper[4917]: I0318 06:52:51.379541 4917 generic.go:334] "Generic (PLEG): container finished" podID="51220ace-9206-4db4-86c5-752d63a97ae2" containerID="ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90" exitCode=0 Mar 18 06:52:51 crc kubenswrapper[4917]: I0318 06:52:51.379655 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-lpwxl" event={"ID":"51220ace-9206-4db4-86c5-752d63a97ae2","Type":"ContainerDied","Data":"ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90"} Mar 18 06:52:51 crc kubenswrapper[4917]: I0318 06:52:51.379707 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lpwxl" event={"ID":"51220ace-9206-4db4-86c5-752d63a97ae2","Type":"ContainerStarted","Data":"126546a9bae03fb26b37b4011caef93b0e96e76e0c8d4a3024b9f582c035f5cc"} Mar 18 06:52:51 crc kubenswrapper[4917]: I0318 06:52:51.523207 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vhrw"] Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.383216 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-brr8d" Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.392624 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vhrw" event={"ID":"9e197559-b623-42e3-bf29-dcf0c20a779d","Type":"ContainerDied","Data":"97a3a81c235fc53d2080cf14f43f103e19e3df754acc643def3641c5e98a2973"} Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.392622 4917 generic.go:334] "Generic (PLEG): container finished" podID="9e197559-b623-42e3-bf29-dcf0c20a779d" containerID="97a3a81c235fc53d2080cf14f43f103e19e3df754acc643def3641c5e98a2973" exitCode=0 Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.393884 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vhrw" event={"ID":"9e197559-b623-42e3-bf29-dcf0c20a779d","Type":"ContainerStarted","Data":"04b5f071261bf00912c9e6a0c8814b783bf773db755f26e38663e05b9e09a4eb"} Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.399928 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p978" 
event={"ID":"dcba268d-27fe-4d66-863a-b94762a886b4","Type":"ContainerStarted","Data":"46b5a6e55d2736c38a75b97b6632b5a2c4137a126669a1796267ecd46fde4a09"} Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.407448 4917 generic.go:334] "Generic (PLEG): container finished" podID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerID="f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95" exitCode=0 Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.407534 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqzh7" event={"ID":"e4bdf3cd-7fe6-4853-a75d-dfb580089f25","Type":"ContainerDied","Data":"f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95"} Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.424218 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lpwxl" event={"ID":"51220ace-9206-4db4-86c5-752d63a97ae2","Type":"ContainerStarted","Data":"c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793"} Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.456906 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hfll9"] Mar 18 06:52:52 crc kubenswrapper[4917]: I0318 06:52:52.481087 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6p978" podStartSLOduration=3.039789114 podStartE2EDuration="5.481067696s" podCreationTimestamp="2026-03-18 06:52:47 +0000 UTC" firstStartedPulling="2026-03-18 06:52:49.352561197 +0000 UTC m=+354.293715941" lastFinishedPulling="2026-03-18 06:52:51.793839769 +0000 UTC m=+356.734994523" observedRunningTime="2026-03-18 06:52:52.477336826 +0000 UTC m=+357.418491570" watchObservedRunningTime="2026-03-18 06:52:52.481067696 +0000 UTC m=+357.422222420" Mar 18 06:52:53 crc kubenswrapper[4917]: I0318 06:52:53.432108 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="9e197559-b623-42e3-bf29-dcf0c20a779d" containerID="3c692432c87a77a8e1802afda8c9a962ed7b7b8a204b620adec9612ce93be7c8" exitCode=0 Mar 18 06:52:53 crc kubenswrapper[4917]: I0318 06:52:53.432159 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vhrw" event={"ID":"9e197559-b623-42e3-bf29-dcf0c20a779d","Type":"ContainerDied","Data":"3c692432c87a77a8e1802afda8c9a962ed7b7b8a204b620adec9612ce93be7c8"} Mar 18 06:52:53 crc kubenswrapper[4917]: I0318 06:52:53.435496 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqzh7" event={"ID":"e4bdf3cd-7fe6-4853-a75d-dfb580089f25","Type":"ContainerStarted","Data":"4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a"} Mar 18 06:52:53 crc kubenswrapper[4917]: I0318 06:52:53.439452 4917 generic.go:334] "Generic (PLEG): container finished" podID="51220ace-9206-4db4-86c5-752d63a97ae2" containerID="c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793" exitCode=0 Mar 18 06:52:53 crc kubenswrapper[4917]: I0318 06:52:53.439493 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lpwxl" event={"ID":"51220ace-9206-4db4-86c5-752d63a97ae2","Type":"ContainerDied","Data":"c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793"} Mar 18 06:52:53 crc kubenswrapper[4917]: I0318 06:52:53.439521 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lpwxl" event={"ID":"51220ace-9206-4db4-86c5-752d63a97ae2","Type":"ContainerStarted","Data":"6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2"} Mar 18 06:52:53 crc kubenswrapper[4917]: I0318 06:52:53.485125 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lpwxl" podStartSLOduration=1.900794469 podStartE2EDuration="3.485093093s" podCreationTimestamp="2026-03-18 06:52:50 +0000 UTC" 
firstStartedPulling="2026-03-18 06:52:51.381495347 +0000 UTC m=+356.322650101" lastFinishedPulling="2026-03-18 06:52:52.965794001 +0000 UTC m=+357.906948725" observedRunningTime="2026-03-18 06:52:53.478899834 +0000 UTC m=+358.420054558" watchObservedRunningTime="2026-03-18 06:52:53.485093093 +0000 UTC m=+358.426247837" Mar 18 06:52:53 crc kubenswrapper[4917]: I0318 06:52:53.494958 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rqzh7" podStartSLOduration=3.009245494 podStartE2EDuration="5.494934802s" podCreationTimestamp="2026-03-18 06:52:48 +0000 UTC" firstStartedPulling="2026-03-18 06:52:50.364166227 +0000 UTC m=+355.305320981" lastFinishedPulling="2026-03-18 06:52:52.849855535 +0000 UTC m=+357.791010289" observedRunningTime="2026-03-18 06:52:53.494286976 +0000 UTC m=+358.435441700" watchObservedRunningTime="2026-03-18 06:52:53.494934802 +0000 UTC m=+358.436089546" Mar 18 06:52:54 crc kubenswrapper[4917]: I0318 06:52:54.448946 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vhrw" event={"ID":"9e197559-b623-42e3-bf29-dcf0c20a779d","Type":"ContainerStarted","Data":"fbe8ba03895958843fe7b0dc0e781286215cb8aaff029301219caaee14873c8a"} Mar 18 06:52:54 crc kubenswrapper[4917]: I0318 06:52:54.477413 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9vhrw" podStartSLOduration=2.99749258 podStartE2EDuration="4.477387037s" podCreationTimestamp="2026-03-18 06:52:50 +0000 UTC" firstStartedPulling="2026-03-18 06:52:52.396152061 +0000 UTC m=+357.337306785" lastFinishedPulling="2026-03-18 06:52:53.876046518 +0000 UTC m=+358.817201242" observedRunningTime="2026-03-18 06:52:54.471649488 +0000 UTC m=+359.412804202" watchObservedRunningTime="2026-03-18 06:52:54.477387037 +0000 UTC m=+359.418541781" Mar 18 06:52:58 crc kubenswrapper[4917]: I0318 06:52:58.057084 4917 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:58 crc kubenswrapper[4917]: I0318 06:52:58.057439 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:52:58 crc kubenswrapper[4917]: I0318 06:52:58.675977 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:58 crc kubenswrapper[4917]: I0318 06:52:58.676057 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:58 crc kubenswrapper[4917]: I0318 06:52:58.745485 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:52:59 crc kubenswrapper[4917]: I0318 06:52:59.128399 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6p978" podUID="dcba268d-27fe-4d66-863a-b94762a886b4" containerName="registry-server" probeResult="failure" output=< Mar 18 06:52:59 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 06:52:59 crc kubenswrapper[4917]: > Mar 18 06:52:59 crc kubenswrapper[4917]: I0318 06:52:59.560990 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 06:53:00 crc kubenswrapper[4917]: I0318 06:53:00.465833 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:53:00 crc kubenswrapper[4917]: I0318 06:53:00.466367 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:53:00 crc kubenswrapper[4917]: I0318 06:53:00.544876 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:53:00 crc kubenswrapper[4917]: I0318 06:53:00.583603 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lpwxl" Mar 18 06:53:01 crc kubenswrapper[4917]: I0318 06:53:01.041997 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:53:01 crc kubenswrapper[4917]: I0318 06:53:01.042062 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:53:01 crc kubenswrapper[4917]: I0318 06:53:01.108027 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:53:01 crc kubenswrapper[4917]: I0318 06:53:01.555698 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9vhrw" Mar 18 06:53:08 crc kubenswrapper[4917]: I0318 06:53:08.119989 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:53:08 crc kubenswrapper[4917]: I0318 06:53:08.166814 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6p978" Mar 18 06:53:17 crc kubenswrapper[4917]: I0318 06:53:17.515478 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" podUID="d9512feb-84a2-46b7-8df1-a672b069d7bc" containerName="registry" containerID="cri-o://a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96" gracePeriod=30 Mar 18 06:53:17 crc kubenswrapper[4917]: I0318 06:53:17.958419 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.044920 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9512feb-84a2-46b7-8df1-a672b069d7bc-installation-pull-secrets\") pod \"d9512feb-84a2-46b7-8df1-a672b069d7bc\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.045022 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9512feb-84a2-46b7-8df1-a672b069d7bc-ca-trust-extracted\") pod \"d9512feb-84a2-46b7-8df1-a672b069d7bc\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.045081 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-trusted-ca\") pod \"d9512feb-84a2-46b7-8df1-a672b069d7bc\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.045138 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd52w\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-kube-api-access-fd52w\") pod \"d9512feb-84a2-46b7-8df1-a672b069d7bc\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.045466 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d9512feb-84a2-46b7-8df1-a672b069d7bc\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.045516 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-tls\") pod \"d9512feb-84a2-46b7-8df1-a672b069d7bc\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.045558 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-bound-sa-token\") pod \"d9512feb-84a2-46b7-8df1-a672b069d7bc\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.045660 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-certificates\") pod \"d9512feb-84a2-46b7-8df1-a672b069d7bc\" (UID: \"d9512feb-84a2-46b7-8df1-a672b069d7bc\") " Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.045868 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d9512feb-84a2-46b7-8df1-a672b069d7bc" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.046017 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.046896 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d9512feb-84a2-46b7-8df1-a672b069d7bc" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.051282 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9512feb-84a2-46b7-8df1-a672b069d7bc-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d9512feb-84a2-46b7-8df1-a672b069d7bc" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.054737 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-kube-api-access-fd52w" (OuterVolumeSpecName: "kube-api-access-fd52w") pod "d9512feb-84a2-46b7-8df1-a672b069d7bc" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc"). InnerVolumeSpecName "kube-api-access-fd52w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.060228 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d9512feb-84a2-46b7-8df1-a672b069d7bc" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.060513 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d9512feb-84a2-46b7-8df1-a672b069d7bc" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.065437 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d9512feb-84a2-46b7-8df1-a672b069d7bc" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.073199 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9512feb-84a2-46b7-8df1-a672b069d7bc-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d9512feb-84a2-46b7-8df1-a672b069d7bc" (UID: "d9512feb-84a2-46b7-8df1-a672b069d7bc"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.147480 4917 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.147507 4917 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d9512feb-84a2-46b7-8df1-a672b069d7bc-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.147517 4917 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d9512feb-84a2-46b7-8df1-a672b069d7bc-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.147526 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd52w\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-kube-api-access-fd52w\") on node \"crc\" DevicePath \"\"" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.147535 4917 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.147544 4917 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d9512feb-84a2-46b7-8df1-a672b069d7bc-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.606676 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.606701 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" event={"ID":"d9512feb-84a2-46b7-8df1-a672b069d7bc","Type":"ContainerDied","Data":"a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96"} Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.606760 4917 scope.go:117] "RemoveContainer" containerID="a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.606678 4917 generic.go:334] "Generic (PLEG): container finished" podID="d9512feb-84a2-46b7-8df1-a672b069d7bc" containerID="a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96" exitCode=0 Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.606873 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-hfll9" event={"ID":"d9512feb-84a2-46b7-8df1-a672b069d7bc","Type":"ContainerDied","Data":"58f464e54bad617c117f04ada1bc7b03514007489e20247928c1a7d1acb09cb9"} Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.637826 4917 scope.go:117] "RemoveContainer" containerID="a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96" Mar 18 06:53:18 crc kubenswrapper[4917]: E0318 06:53:18.638262 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96\": container with ID starting with a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96 not found: ID does not exist" containerID="a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.638290 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96"} err="failed to get container status \"a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96\": rpc error: code = NotFound desc = could not find container \"a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96\": container with ID starting with a2eb97d077c03ebf0f78f200587f8ae8453a8f17ff102947dac7ffdfebd33a96 not found: ID does not exist" Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.640732 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hfll9"] Mar 18 06:53:18 crc kubenswrapper[4917]: I0318 06:53:18.644972 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-hfll9"] Mar 18 06:53:19 crc kubenswrapper[4917]: I0318 06:53:19.780270 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9512feb-84a2-46b7-8df1-a672b069d7bc" path="/var/lib/kubelet/pods/d9512feb-84a2-46b7-8df1-a672b069d7bc/volumes" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.146190 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563614-sqr96"] Mar 18 06:54:00 crc kubenswrapper[4917]: E0318 06:54:00.147126 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9512feb-84a2-46b7-8df1-a672b069d7bc" containerName="registry" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.147148 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9512feb-84a2-46b7-8df1-a672b069d7bc" containerName="registry" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.147338 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9512feb-84a2-46b7-8df1-a672b069d7bc" containerName="registry" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.147954 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563614-sqr96" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.150617 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.154248 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.155307 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.160561 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563614-sqr96"] Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.268121 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqqdm\" (UniqueName: \"kubernetes.io/projected/28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb-kube-api-access-wqqdm\") pod \"auto-csr-approver-29563614-sqr96\" (UID: \"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb\") " pod="openshift-infra/auto-csr-approver-29563614-sqr96" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.369120 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqqdm\" (UniqueName: \"kubernetes.io/projected/28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb-kube-api-access-wqqdm\") pod \"auto-csr-approver-29563614-sqr96\" (UID: \"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb\") " pod="openshift-infra/auto-csr-approver-29563614-sqr96" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.401714 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqqdm\" (UniqueName: \"kubernetes.io/projected/28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb-kube-api-access-wqqdm\") pod \"auto-csr-approver-29563614-sqr96\" (UID: \"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb\") " 
pod="openshift-infra/auto-csr-approver-29563614-sqr96" Mar 18 06:54:00 crc kubenswrapper[4917]: I0318 06:54:00.468192 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563614-sqr96" Mar 18 06:54:01 crc kubenswrapper[4917]: I0318 06:54:00.727413 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563614-sqr96"] Mar 18 06:54:01 crc kubenswrapper[4917]: I0318 06:54:00.741834 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 06:54:01 crc kubenswrapper[4917]: I0318 06:54:00.893294 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563614-sqr96" event={"ID":"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb","Type":"ContainerStarted","Data":"8bfab9cc6e4dff6916f4be4089f58d051005d58e111d9c4b2694d72a33f7cbf0"} Mar 18 06:54:01 crc kubenswrapper[4917]: I0318 06:54:01.899731 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563614-sqr96" event={"ID":"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb","Type":"ContainerStarted","Data":"45a19ad97dc7499fc8bfe60f402a74e4cb1aa8c2dce01795fd7e1a5398e0e3ca"} Mar 18 06:54:01 crc kubenswrapper[4917]: I0318 06:54:01.918743 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563614-sqr96" podStartSLOduration=1.057912501 podStartE2EDuration="1.918718781s" podCreationTimestamp="2026-03-18 06:54:00 +0000 UTC" firstStartedPulling="2026-03-18 06:54:00.741506891 +0000 UTC m=+425.682661615" lastFinishedPulling="2026-03-18 06:54:01.602313181 +0000 UTC m=+426.543467895" observedRunningTime="2026-03-18 06:54:01.913541088 +0000 UTC m=+426.854695802" watchObservedRunningTime="2026-03-18 06:54:01.918718781 +0000 UTC m=+426.859873525" Mar 18 06:54:02 crc kubenswrapper[4917]: I0318 06:54:02.910148 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb" containerID="45a19ad97dc7499fc8bfe60f402a74e4cb1aa8c2dce01795fd7e1a5398e0e3ca" exitCode=0 Mar 18 06:54:02 crc kubenswrapper[4917]: I0318 06:54:02.910226 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563614-sqr96" event={"ID":"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb","Type":"ContainerDied","Data":"45a19ad97dc7499fc8bfe60f402a74e4cb1aa8c2dce01795fd7e1a5398e0e3ca"} Mar 18 06:54:04 crc kubenswrapper[4917]: I0318 06:54:04.198794 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563614-sqr96" Mar 18 06:54:04 crc kubenswrapper[4917]: I0318 06:54:04.324472 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqqdm\" (UniqueName: \"kubernetes.io/projected/28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb-kube-api-access-wqqdm\") pod \"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb\" (UID: \"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb\") " Mar 18 06:54:04 crc kubenswrapper[4917]: I0318 06:54:04.329717 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb-kube-api-access-wqqdm" (OuterVolumeSpecName: "kube-api-access-wqqdm") pod "28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb" (UID: "28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb"). InnerVolumeSpecName "kube-api-access-wqqdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:54:04 crc kubenswrapper[4917]: I0318 06:54:04.425836 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqqdm\" (UniqueName: \"kubernetes.io/projected/28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb-kube-api-access-wqqdm\") on node \"crc\" DevicePath \"\"" Mar 18 06:54:04 crc kubenswrapper[4917]: I0318 06:54:04.923758 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563614-sqr96" event={"ID":"28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb","Type":"ContainerDied","Data":"8bfab9cc6e4dff6916f4be4089f58d051005d58e111d9c4b2694d72a33f7cbf0"} Mar 18 06:54:04 crc kubenswrapper[4917]: I0318 06:54:04.923812 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bfab9cc6e4dff6916f4be4089f58d051005d58e111d9c4b2694d72a33f7cbf0" Mar 18 06:54:04 crc kubenswrapper[4917]: I0318 06:54:04.923893 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563614-sqr96" Mar 18 06:54:32 crc kubenswrapper[4917]: I0318 06:54:32.929796 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 06:54:32 crc kubenswrapper[4917]: I0318 06:54:32.930669 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:55:02 crc kubenswrapper[4917]: I0318 06:55:02.929145 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 06:55:02 crc kubenswrapper[4917]: I0318 06:55:02.929903 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:55:32 crc kubenswrapper[4917]: I0318 06:55:32.929098 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 06:55:32 crc kubenswrapper[4917]: I0318 06:55:32.929805 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:55:32 crc kubenswrapper[4917]: I0318 06:55:32.929896 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:55:32 crc kubenswrapper[4917]: I0318 06:55:32.931134 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"55ec8b0401f2edcf38b2bcbd2dfc70f1d98070145f47acdaaf58224f01cca565"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 06:55:32 crc 
kubenswrapper[4917]: I0318 06:55:32.931241 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://55ec8b0401f2edcf38b2bcbd2dfc70f1d98070145f47acdaaf58224f01cca565" gracePeriod=600 Mar 18 06:55:33 crc kubenswrapper[4917]: I0318 06:55:33.527239 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="55ec8b0401f2edcf38b2bcbd2dfc70f1d98070145f47acdaaf58224f01cca565" exitCode=0 Mar 18 06:55:33 crc kubenswrapper[4917]: I0318 06:55:33.527531 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"55ec8b0401f2edcf38b2bcbd2dfc70f1d98070145f47acdaaf58224f01cca565"} Mar 18 06:55:33 crc kubenswrapper[4917]: I0318 06:55:33.527664 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"db29f517217de042b6ac5da88be9d3c468c408a7daa895e65fa34100384451a5"} Mar 18 06:55:33 crc kubenswrapper[4917]: I0318 06:55:33.527705 4917 scope.go:117] "RemoveContainer" containerID="8bc780e4080b196acc164a6127722e66896b9364ad9dbfff119e4b793a72e986" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.151769 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563616-nx5gm"] Mar 18 06:56:00 crc kubenswrapper[4917]: E0318 06:56:00.153133 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb" containerName="oc" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.153157 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb" containerName="oc" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.153378 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb" containerName="oc" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.154021 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563616-nx5gm" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.158431 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563616-nx5gm"] Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.160256 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.160297 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.160734 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.294494 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmt4b\" (UniqueName: \"kubernetes.io/projected/0dde11ad-4301-47f1-a433-9a1d37b7f482-kube-api-access-vmt4b\") pod \"auto-csr-approver-29563616-nx5gm\" (UID: \"0dde11ad-4301-47f1-a433-9a1d37b7f482\") " pod="openshift-infra/auto-csr-approver-29563616-nx5gm" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.396202 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmt4b\" (UniqueName: \"kubernetes.io/projected/0dde11ad-4301-47f1-a433-9a1d37b7f482-kube-api-access-vmt4b\") pod \"auto-csr-approver-29563616-nx5gm\" (UID: \"0dde11ad-4301-47f1-a433-9a1d37b7f482\") " 
pod="openshift-infra/auto-csr-approver-29563616-nx5gm" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.431043 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmt4b\" (UniqueName: \"kubernetes.io/projected/0dde11ad-4301-47f1-a433-9a1d37b7f482-kube-api-access-vmt4b\") pod \"auto-csr-approver-29563616-nx5gm\" (UID: \"0dde11ad-4301-47f1-a433-9a1d37b7f482\") " pod="openshift-infra/auto-csr-approver-29563616-nx5gm" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.482371 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563616-nx5gm" Mar 18 06:56:00 crc kubenswrapper[4917]: I0318 06:56:00.714745 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563616-nx5gm"] Mar 18 06:56:01 crc kubenswrapper[4917]: I0318 06:56:01.738162 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563616-nx5gm" event={"ID":"0dde11ad-4301-47f1-a433-9a1d37b7f482","Type":"ContainerStarted","Data":"3db8b1052b66b35504a9582988319ca9da7aa3fc32d6033a031a2f767ab03868"} Mar 18 06:56:02 crc kubenswrapper[4917]: I0318 06:56:02.750768 4917 generic.go:334] "Generic (PLEG): container finished" podID="0dde11ad-4301-47f1-a433-9a1d37b7f482" containerID="5009656d9ccd2927642c0a8f3bd120b0a5772d75d6a890c507113c93b9aa92d0" exitCode=0 Mar 18 06:56:02 crc kubenswrapper[4917]: I0318 06:56:02.750921 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563616-nx5gm" event={"ID":"0dde11ad-4301-47f1-a433-9a1d37b7f482","Type":"ContainerDied","Data":"5009656d9ccd2927642c0a8f3bd120b0a5772d75d6a890c507113c93b9aa92d0"} Mar 18 06:56:04 crc kubenswrapper[4917]: I0318 06:56:04.081860 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563616-nx5gm" Mar 18 06:56:04 crc kubenswrapper[4917]: I0318 06:56:04.253220 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmt4b\" (UniqueName: \"kubernetes.io/projected/0dde11ad-4301-47f1-a433-9a1d37b7f482-kube-api-access-vmt4b\") pod \"0dde11ad-4301-47f1-a433-9a1d37b7f482\" (UID: \"0dde11ad-4301-47f1-a433-9a1d37b7f482\") " Mar 18 06:56:04 crc kubenswrapper[4917]: I0318 06:56:04.262811 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dde11ad-4301-47f1-a433-9a1d37b7f482-kube-api-access-vmt4b" (OuterVolumeSpecName: "kube-api-access-vmt4b") pod "0dde11ad-4301-47f1-a433-9a1d37b7f482" (UID: "0dde11ad-4301-47f1-a433-9a1d37b7f482"). InnerVolumeSpecName "kube-api-access-vmt4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:56:04 crc kubenswrapper[4917]: I0318 06:56:04.355237 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmt4b\" (UniqueName: \"kubernetes.io/projected/0dde11ad-4301-47f1-a433-9a1d37b7f482-kube-api-access-vmt4b\") on node \"crc\" DevicePath \"\"" Mar 18 06:56:04 crc kubenswrapper[4917]: I0318 06:56:04.769179 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563616-nx5gm" event={"ID":"0dde11ad-4301-47f1-a433-9a1d37b7f482","Type":"ContainerDied","Data":"3db8b1052b66b35504a9582988319ca9da7aa3fc32d6033a031a2f767ab03868"} Mar 18 06:56:04 crc kubenswrapper[4917]: I0318 06:56:04.769253 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3db8b1052b66b35504a9582988319ca9da7aa3fc32d6033a031a2f767ab03868" Mar 18 06:56:04 crc kubenswrapper[4917]: I0318 06:56:04.769256 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563616-nx5gm" Mar 18 06:56:05 crc kubenswrapper[4917]: I0318 06:56:05.151101 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563610-7pqdt"] Mar 18 06:56:05 crc kubenswrapper[4917]: I0318 06:56:05.154376 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563610-7pqdt"] Mar 18 06:56:05 crc kubenswrapper[4917]: I0318 06:56:05.782174 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c13e4c1a-96ea-4dad-9cc3-a850ee57f969" path="/var/lib/kubelet/pods/c13e4c1a-96ea-4dad-9cc3-a850ee57f969/volumes" Mar 18 06:56:56 crc kubenswrapper[4917]: I0318 06:56:56.172148 4917 scope.go:117] "RemoveContainer" containerID="68332a4a2ed53037c9224e4830b1e1364106cec82ac47c3ae4d48a8f94bd3da8" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.144487 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563618-78xd9"] Mar 18 06:58:00 crc kubenswrapper[4917]: E0318 06:58:00.145320 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dde11ad-4301-47f1-a433-9a1d37b7f482" containerName="oc" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.145335 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dde11ad-4301-47f1-a433-9a1d37b7f482" containerName="oc" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.145469 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dde11ad-4301-47f1-a433-9a1d37b7f482" containerName="oc" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.145895 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563618-78xd9" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.149741 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.149934 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.150235 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.167670 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563618-78xd9"] Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.196686 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkfqc\" (UniqueName: \"kubernetes.io/projected/f96a1728-b2c0-42da-b1a0-8794cd50c63c-kube-api-access-tkfqc\") pod \"auto-csr-approver-29563618-78xd9\" (UID: \"f96a1728-b2c0-42da-b1a0-8794cd50c63c\") " pod="openshift-infra/auto-csr-approver-29563618-78xd9" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.297757 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkfqc\" (UniqueName: \"kubernetes.io/projected/f96a1728-b2c0-42da-b1a0-8794cd50c63c-kube-api-access-tkfqc\") pod \"auto-csr-approver-29563618-78xd9\" (UID: \"f96a1728-b2c0-42da-b1a0-8794cd50c63c\") " pod="openshift-infra/auto-csr-approver-29563618-78xd9" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.320472 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkfqc\" (UniqueName: \"kubernetes.io/projected/f96a1728-b2c0-42da-b1a0-8794cd50c63c-kube-api-access-tkfqc\") pod \"auto-csr-approver-29563618-78xd9\" (UID: \"f96a1728-b2c0-42da-b1a0-8794cd50c63c\") " 
pod="openshift-infra/auto-csr-approver-29563618-78xd9" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.466042 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563618-78xd9" Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.739820 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563618-78xd9"] Mar 18 06:58:00 crc kubenswrapper[4917]: I0318 06:58:00.832549 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563618-78xd9" event={"ID":"f96a1728-b2c0-42da-b1a0-8794cd50c63c","Type":"ContainerStarted","Data":"5c465f92e41729ff78fea41bd48b2e5030410c15e80b6e96504552a282455cae"} Mar 18 06:58:02 crc kubenswrapper[4917]: I0318 06:58:02.853107 4917 generic.go:334] "Generic (PLEG): container finished" podID="f96a1728-b2c0-42da-b1a0-8794cd50c63c" containerID="e987278e5355ddd1c2438c79c9c14123be9ee59a9274536d40cc2fb35b569400" exitCode=0 Mar 18 06:58:02 crc kubenswrapper[4917]: I0318 06:58:02.853197 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563618-78xd9" event={"ID":"f96a1728-b2c0-42da-b1a0-8794cd50c63c","Type":"ContainerDied","Data":"e987278e5355ddd1c2438c79c9c14123be9ee59a9274536d40cc2fb35b569400"} Mar 18 06:58:02 crc kubenswrapper[4917]: I0318 06:58:02.929259 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 06:58:02 crc kubenswrapper[4917]: I0318 06:58:02.929343 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:58:04 crc kubenswrapper[4917]: I0318 06:58:04.213204 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563618-78xd9" Mar 18 06:58:04 crc kubenswrapper[4917]: I0318 06:58:04.257326 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkfqc\" (UniqueName: \"kubernetes.io/projected/f96a1728-b2c0-42da-b1a0-8794cd50c63c-kube-api-access-tkfqc\") pod \"f96a1728-b2c0-42da-b1a0-8794cd50c63c\" (UID: \"f96a1728-b2c0-42da-b1a0-8794cd50c63c\") " Mar 18 06:58:04 crc kubenswrapper[4917]: I0318 06:58:04.263021 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f96a1728-b2c0-42da-b1a0-8794cd50c63c-kube-api-access-tkfqc" (OuterVolumeSpecName: "kube-api-access-tkfqc") pod "f96a1728-b2c0-42da-b1a0-8794cd50c63c" (UID: "f96a1728-b2c0-42da-b1a0-8794cd50c63c"). InnerVolumeSpecName "kube-api-access-tkfqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:58:04 crc kubenswrapper[4917]: I0318 06:58:04.358688 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkfqc\" (UniqueName: \"kubernetes.io/projected/f96a1728-b2c0-42da-b1a0-8794cd50c63c-kube-api-access-tkfqc\") on node \"crc\" DevicePath \"\"" Mar 18 06:58:04 crc kubenswrapper[4917]: I0318 06:58:04.870037 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563618-78xd9" event={"ID":"f96a1728-b2c0-42da-b1a0-8794cd50c63c","Type":"ContainerDied","Data":"5c465f92e41729ff78fea41bd48b2e5030410c15e80b6e96504552a282455cae"} Mar 18 06:58:04 crc kubenswrapper[4917]: I0318 06:58:04.870096 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c465f92e41729ff78fea41bd48b2e5030410c15e80b6e96504552a282455cae" Mar 18 06:58:04 crc kubenswrapper[4917]: I0318 06:58:04.870148 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563618-78xd9" Mar 18 06:58:05 crc kubenswrapper[4917]: I0318 06:58:05.282149 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563612-whsn7"] Mar 18 06:58:05 crc kubenswrapper[4917]: I0318 06:58:05.288841 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563612-whsn7"] Mar 18 06:58:05 crc kubenswrapper[4917]: I0318 06:58:05.784231 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9eda6fda-800a-456e-8f1a-13b9b31035d2" path="/var/lib/kubelet/pods/9eda6fda-800a-456e-8f1a-13b9b31035d2/volumes" Mar 18 06:58:32 crc kubenswrapper[4917]: I0318 06:58:32.929212 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 06:58:32 crc kubenswrapper[4917]: I0318 06:58:32.930028 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:58:56 crc kubenswrapper[4917]: I0318 06:58:56.263640 4917 scope.go:117] "RemoveContainer" containerID="275ac8a6c189a51c65dcd0c73f076e2c80b9530fdbecedc735ee1bf0ef3949e2" Mar 18 06:59:01 crc kubenswrapper[4917]: I0318 06:59:01.239857 4917 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 18 06:59:02 crc kubenswrapper[4917]: I0318 06:59:02.929441 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 06:59:02 crc kubenswrapper[4917]: I0318 06:59:02.929529 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 06:59:02 crc kubenswrapper[4917]: I0318 06:59:02.929640 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 06:59:02 crc kubenswrapper[4917]: I0318 06:59:02.930527 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"db29f517217de042b6ac5da88be9d3c468c408a7daa895e65fa34100384451a5"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 06:59:02 crc kubenswrapper[4917]: I0318 06:59:02.930692 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://db29f517217de042b6ac5da88be9d3c468c408a7daa895e65fa34100384451a5" gracePeriod=600 Mar 18 06:59:03 crc kubenswrapper[4917]: I0318 06:59:03.313907 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="db29f517217de042b6ac5da88be9d3c468c408a7daa895e65fa34100384451a5" exitCode=0 Mar 18 06:59:03 crc kubenswrapper[4917]: I0318 06:59:03.314013 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"db29f517217de042b6ac5da88be9d3c468c408a7daa895e65fa34100384451a5"} Mar 18 06:59:03 crc kubenswrapper[4917]: I0318 06:59:03.314291 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"d1fa455e15b1a756345723f1e179413cfc4b43062d137c44fce060567289753f"} Mar 18 06:59:03 crc kubenswrapper[4917]: I0318 06:59:03.314317 4917 scope.go:117] "RemoveContainer" containerID="55ec8b0401f2edcf38b2bcbd2dfc70f1d98070145f47acdaaf58224f01cca565" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.779528 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dws25"] Mar 18 06:59:22 crc kubenswrapper[4917]: E0318 
06:59:22.780792 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f96a1728-b2c0-42da-b1a0-8794cd50c63c" containerName="oc" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.780819 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f96a1728-b2c0-42da-b1a0-8794cd50c63c" containerName="oc" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.781017 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f96a1728-b2c0-42da-b1a0-8794cd50c63c" containerName="oc" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.782370 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.791428 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-catalog-content\") pod \"certified-operators-dws25\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.792240 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-utilities\") pod \"certified-operators-dws25\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.792362 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whszb\" (UniqueName: \"kubernetes.io/projected/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-kube-api-access-whszb\") pod \"certified-operators-dws25\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: 
I0318 06:59:22.798055 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dws25"] Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.893197 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-utilities\") pod \"certified-operators-dws25\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.893247 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whszb\" (UniqueName: \"kubernetes.io/projected/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-kube-api-access-whszb\") pod \"certified-operators-dws25\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.893279 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-catalog-content\") pod \"certified-operators-dws25\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.893989 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-catalog-content\") pod \"certified-operators-dws25\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.894369 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-utilities\") pod \"certified-operators-dws25\" (UID: 
\"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:22 crc kubenswrapper[4917]: I0318 06:59:22.916783 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whszb\" (UniqueName: \"kubernetes.io/projected/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-kube-api-access-whszb\") pod \"certified-operators-dws25\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.108409 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.355662 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dws25"] Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.452358 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dws25" event={"ID":"d92de4fc-3dbb-450d-8ab8-2182ec4f435c","Type":"ContainerStarted","Data":"b43c62ef5bae855387e82856ffcf2d0551499b8094dcd97cd8514c94d74f000a"} Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.475281 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-bd8lr"] Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.475934 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.479984 4917 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-p9ph2" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.480171 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.480287 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.480412 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.480700 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bd8lr"] Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.599979 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcdtr\" (UniqueName: \"kubernetes.io/projected/984ea079-192f-436b-bde0-0b8bc6df9fe8-kube-api-access-tcdtr\") pod \"crc-storage-crc-bd8lr\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.600053 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/984ea079-192f-436b-bde0-0b8bc6df9fe8-node-mnt\") pod \"crc-storage-crc-bd8lr\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.600074 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/984ea079-192f-436b-bde0-0b8bc6df9fe8-crc-storage\") pod \"crc-storage-crc-bd8lr\" (UID: 
\"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.702381 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcdtr\" (UniqueName: \"kubernetes.io/projected/984ea079-192f-436b-bde0-0b8bc6df9fe8-kube-api-access-tcdtr\") pod \"crc-storage-crc-bd8lr\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.702524 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/984ea079-192f-436b-bde0-0b8bc6df9fe8-node-mnt\") pod \"crc-storage-crc-bd8lr\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.702568 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/984ea079-192f-436b-bde0-0b8bc6df9fe8-crc-storage\") pod \"crc-storage-crc-bd8lr\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.703037 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/984ea079-192f-436b-bde0-0b8bc6df9fe8-node-mnt\") pod \"crc-storage-crc-bd8lr\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.704347 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/984ea079-192f-436b-bde0-0b8bc6df9fe8-crc-storage\") pod \"crc-storage-crc-bd8lr\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.738827 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcdtr\" (UniqueName: \"kubernetes.io/projected/984ea079-192f-436b-bde0-0b8bc6df9fe8-kube-api-access-tcdtr\") pod \"crc-storage-crc-bd8lr\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:23 crc kubenswrapper[4917]: I0318 06:59:23.795920 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.114589 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-bd8lr"] Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.138933 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.139932 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f8k4j"] Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.140326 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovn-controller" containerID="cri-o://cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1" gracePeriod=30 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.140762 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kube-rbac-proxy-node" containerID="cri-o://9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" gracePeriod=30 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.140779 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" 
containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" gracePeriod=30 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.140821 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovn-acl-logging" containerID="cri-o://066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a" gracePeriod=30 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.140898 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="northd" containerID="cri-o://be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" gracePeriod=30 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.140942 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="sbdb" containerID="cri-o://556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" gracePeriod=30 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.140907 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="nbdb" containerID="cri-o://55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" gracePeriod=30 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.174080 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovnkube-controller" containerID="cri-o://fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" gracePeriod=30 Mar 18 06:59:24 crc kubenswrapper[4917]: 
I0318 06:59:24.414157 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f8k4j_5dd0f3cd-77e6-44b6-92e3-50740ab1fffa/ovn-acl-logging/0.log" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.414877 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f8k4j_5dd0f3cd-77e6-44b6-92e3-50740ab1fffa/ovn-controller/0.log" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.415258 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464148 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fm4f4"] Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.464332 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kubecfg-setup" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464344 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kubecfg-setup" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.464353 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="sbdb" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464360 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="sbdb" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.464370 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovn-acl-logging" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464376 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovn-acl-logging" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 
06:59:24.464384 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464390 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.464399 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="northd" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464405 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="northd" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.464415 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kube-rbac-proxy-node" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464421 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kube-rbac-proxy-node" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.464428 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="nbdb" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464433 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="nbdb" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.464443 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovn-controller" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464448 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovn-controller" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.464461 4917 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovnkube-controller" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464467 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovnkube-controller" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464545 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="sbdb" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464553 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kube-rbac-proxy-node" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464561 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovn-acl-logging" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464568 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="kube-rbac-proxy-ovn-metrics" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464574 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovnkube-controller" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464584 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="ovn-controller" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464607 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="nbdb" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.464616 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerName="northd" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 
06:59:24.465872 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f8k4j_5dd0f3cd-77e6-44b6-92e3-50740ab1fffa/ovn-acl-logging/0.log" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466188 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466311 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f8k4j_5dd0f3cd-77e6-44b6-92e3-50740ab1fffa/ovn-controller/0.log" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466660 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" exitCode=0 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466682 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" exitCode=0 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466690 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" exitCode=0 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466699 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" exitCode=0 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466705 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" exitCode=0 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466711 4917 generic.go:334] "Generic (PLEG): container 
finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" exitCode=0 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466717 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a" exitCode=143 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466723 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" containerID="cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1" exitCode=143 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466767 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466785 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466806 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466816 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466828 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" 
event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466837 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466846 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466855 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466865 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466871 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466877 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466885 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466890 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466896 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466901 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466906 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466910 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466917 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466922 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466928 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466935 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466943 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466949 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466954 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466959 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466963 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466968 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466973 4917 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466977 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466982 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466988 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f8k4j" event={"ID":"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa","Type":"ContainerDied","Data":"87d91530680849fe3d74c26838d79a41ba50f8ae6cdcebce96233e3593a8a4ad"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.466995 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467000 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467007 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467011 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} Mar 18 
06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467017 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467023 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467027 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467032 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467037 4917 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.467059 4917 scope.go:117] "RemoveContainer" containerID="fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.469608 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pp4xr_4aaf9352-3715-40a0-876a-09f4f27a41c2/kube-multus/0.log" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.469652 4917 generic.go:334] "Generic (PLEG): container finished" podID="4aaf9352-3715-40a0-876a-09f4f27a41c2" containerID="15cf267d6e201b9eed42ce3305396a9e1c1634cdcab53dc137bfd3a40c8a776b" exitCode=2 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.469711 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-multus/multus-pp4xr" event={"ID":"4aaf9352-3715-40a0-876a-09f4f27a41c2","Type":"ContainerDied","Data":"15cf267d6e201b9eed42ce3305396a9e1c1634cdcab53dc137bfd3a40c8a776b"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.470082 4917 scope.go:117] "RemoveContainer" containerID="15cf267d6e201b9eed42ce3305396a9e1c1634cdcab53dc137bfd3a40c8a776b" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.472107 4917 generic.go:334] "Generic (PLEG): container finished" podID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerID="6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f" exitCode=0 Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.472178 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dws25" event={"ID":"d92de4fc-3dbb-450d-8ab8-2182ec4f435c","Type":"ContainerDied","Data":"6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.474050 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bd8lr" event={"ID":"984ea079-192f-436b-bde0-0b8bc6df9fe8","Type":"ContainerStarted","Data":"fa3cb97d344557da827f808596709aa9652c0c3fdebd33ccff7fb746970c7442"} Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.498080 4917 scope.go:117] "RemoveContainer" containerID="556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.530940 4917 scope.go:117] "RemoveContainer" containerID="55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.560491 4917 scope.go:117] "RemoveContainer" containerID="be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.574734 4917 scope.go:117] "RemoveContainer" containerID="722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" Mar 18 06:59:24 crc 
kubenswrapper[4917]: I0318 06:59:24.586760 4917 scope.go:117] "RemoveContainer" containerID="9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.597526 4917 scope.go:117] "RemoveContainer" containerID="066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.611788 4917 scope.go:117] "RemoveContainer" containerID="cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.614858 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-etc-openvswitch\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.614907 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-config\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.614961 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovn-node-metrics-cert\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.614985 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-openvswitch\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 
06:59:24.615027 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-var-lib-openvswitch\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615043 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-node-log\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615061 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-systemd-units\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615115 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-env-overrides\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615131 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2tv7\" (UniqueName: \"kubernetes.io/projected/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-kube-api-access-d2tv7\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615148 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-var-lib-cni-networks-ovn-kubernetes\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615225 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-netd\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615315 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-log-socket\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615351 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-systemd\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615366 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-ovn-kubernetes\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615382 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-slash\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc 
kubenswrapper[4917]: I0318 06:59:24.615397 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-kubelet\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615416 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-script-lib\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615429 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-bin\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615451 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-netns\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615473 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-ovn\") pod \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\" (UID: \"5dd0f3cd-77e6-44b6-92e3-50740ab1fffa\") " Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615683 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-ovnkube-script-lib\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615703 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-ovnkube-config\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615711 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615722 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-var-lib-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615717 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615746 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-log-socket\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615774 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615798 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-log-socket" (OuterVolumeSpecName: "log-socket") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615813 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-systemd\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615841 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-node-log" (OuterVolumeSpecName: "node-log") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615862 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616145 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616173 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616203 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616209 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-slash" (OuterVolumeSpecName: "host-slash") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616236 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616251 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616279 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616285 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.615742 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616384 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616588 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616697 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-cni-bin\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616735 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-kubelet\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616757 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-env-overrides\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616775 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-ovn\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616805 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616827 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-node-log\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616843 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-run-netns\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616858 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da9219b4-ac36-4c9d-a457-0461ff261481-ovn-node-metrics-cert\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616873 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-cni-netd\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.616957 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-slash\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-systemd-units\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617492 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-run-ovn-kubernetes\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617552 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-etc-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617569 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rx7v\" (UniqueName: \"kubernetes.io/projected/da9219b4-ac36-4c9d-a457-0461ff261481-kube-api-access-9rx7v\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617598 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617638 4917 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617648 4917 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617657 4917 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617666 4917 
reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617676 4917 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-node-log\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617685 4917 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617694 4917 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617702 4917 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617711 4917 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617720 4917 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617729 4917 reconciler_common.go:293] "Volume detached for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-log-socket\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617737 4917 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-slash\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617746 4917 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617755 4917 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617763 4917 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617771 4917 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.617779 4917 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.621338 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovn-node-metrics-cert" (OuterVolumeSpecName: 
"ovn-node-metrics-cert") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.621616 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-kube-api-access-d2tv7" (OuterVolumeSpecName: "kube-api-access-d2tv7") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "kube-api-access-d2tv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.626750 4917 scope.go:117] "RemoveContainer" containerID="77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.628182 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" (UID: "5dd0f3cd-77e6-44b6-92e3-50740ab1fffa"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.642097 4917 scope.go:117] "RemoveContainer" containerID="fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.642636 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": container with ID starting with fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d not found: ID does not exist" containerID="fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.642681 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} err="failed to get container status \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": rpc error: code = NotFound desc = could not find container \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": container with ID starting with fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.642711 4917 scope.go:117] "RemoveContainer" containerID="556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.642979 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": container with ID starting with 556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e not found: ID does not exist" containerID="556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.643000 
4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} err="failed to get container status \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": rpc error: code = NotFound desc = could not find container \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": container with ID starting with 556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.643015 4917 scope.go:117] "RemoveContainer" containerID="55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.643326 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": container with ID starting with 55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681 not found: ID does not exist" containerID="55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.643355 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} err="failed to get container status \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": rpc error: code = NotFound desc = could not find container \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": container with ID starting with 55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.643372 4917 scope.go:117] "RemoveContainer" containerID="be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 
06:59:24.643642 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": container with ID starting with be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1 not found: ID does not exist" containerID="be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.643675 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} err="failed to get container status \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": rpc error: code = NotFound desc = could not find container \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": container with ID starting with be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.643694 4917 scope.go:117] "RemoveContainer" containerID="722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.644013 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": container with ID starting with 722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44 not found: ID does not exist" containerID="722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.644041 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} err="failed to get container status \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": rpc 
error: code = NotFound desc = could not find container \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": container with ID starting with 722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.644060 4917 scope.go:117] "RemoveContainer" containerID="9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.644253 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": container with ID starting with 9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b not found: ID does not exist" containerID="9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.644275 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} err="failed to get container status \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": rpc error: code = NotFound desc = could not find container \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": container with ID starting with 9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.644288 4917 scope.go:117] "RemoveContainer" containerID="066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.644569 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": container with ID starting with 
066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a not found: ID does not exist" containerID="066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.644608 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} err="failed to get container status \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": rpc error: code = NotFound desc = could not find container \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": container with ID starting with 066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.644621 4917 scope.go:117] "RemoveContainer" containerID="cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.645098 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": container with ID starting with cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1 not found: ID does not exist" containerID="cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645119 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} err="failed to get container status \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": rpc error: code = NotFound desc = could not find container \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": container with ID starting with cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1 not found: ID does not 
exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645131 4917 scope.go:117] "RemoveContainer" containerID="77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd" Mar 18 06:59:24 crc kubenswrapper[4917]: E0318 06:59:24.645366 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": container with ID starting with 77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd not found: ID does not exist" containerID="77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645387 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} err="failed to get container status \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": rpc error: code = NotFound desc = could not find container \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": container with ID starting with 77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645399 4917 scope.go:117] "RemoveContainer" containerID="fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645614 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} err="failed to get container status \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": rpc error: code = NotFound desc = could not find container \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": container with ID starting with fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d not found: ID 
does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645632 4917 scope.go:117] "RemoveContainer" containerID="556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645806 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} err="failed to get container status \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": rpc error: code = NotFound desc = could not find container \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": container with ID starting with 556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645818 4917 scope.go:117] "RemoveContainer" containerID="55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645980 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} err="failed to get container status \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": rpc error: code = NotFound desc = could not find container \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": container with ID starting with 55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.645996 4917 scope.go:117] "RemoveContainer" containerID="be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646168 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} err="failed to get container 
status \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": rpc error: code = NotFound desc = could not find container \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": container with ID starting with be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646185 4917 scope.go:117] "RemoveContainer" containerID="722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646357 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} err="failed to get container status \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": rpc error: code = NotFound desc = could not find container \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": container with ID starting with 722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646374 4917 scope.go:117] "RemoveContainer" containerID="9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646527 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} err="failed to get container status \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": rpc error: code = NotFound desc = could not find container \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": container with ID starting with 9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646543 4917 scope.go:117] "RemoveContainer" 
containerID="066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646734 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} err="failed to get container status \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": rpc error: code = NotFound desc = could not find container \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": container with ID starting with 066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646752 4917 scope.go:117] "RemoveContainer" containerID="cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646969 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} err="failed to get container status \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": rpc error: code = NotFound desc = could not find container \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": container with ID starting with cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.646987 4917 scope.go:117] "RemoveContainer" containerID="77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.647150 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} err="failed to get container status \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": rpc error: code = NotFound desc = could 
not find container \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": container with ID starting with 77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.647166 4917 scope.go:117] "RemoveContainer" containerID="fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.647464 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} err="failed to get container status \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": rpc error: code = NotFound desc = could not find container \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": container with ID starting with fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.647482 4917 scope.go:117] "RemoveContainer" containerID="556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.647778 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} err="failed to get container status \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": rpc error: code = NotFound desc = could not find container \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": container with ID starting with 556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.647800 4917 scope.go:117] "RemoveContainer" containerID="55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 
06:59:24.648016 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} err="failed to get container status \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": rpc error: code = NotFound desc = could not find container \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": container with ID starting with 55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648032 4917 scope.go:117] "RemoveContainer" containerID="be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648242 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} err="failed to get container status \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": rpc error: code = NotFound desc = could not find container \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": container with ID starting with be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648261 4917 scope.go:117] "RemoveContainer" containerID="722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648433 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} err="failed to get container status \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": rpc error: code = NotFound desc = could not find container \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": container with ID starting with 
722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648449 4917 scope.go:117] "RemoveContainer" containerID="9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648668 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} err="failed to get container status \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": rpc error: code = NotFound desc = could not find container \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": container with ID starting with 9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648684 4917 scope.go:117] "RemoveContainer" containerID="066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648876 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} err="failed to get container status \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": rpc error: code = NotFound desc = could not find container \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": container with ID starting with 066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.648898 4917 scope.go:117] "RemoveContainer" containerID="cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649110 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} err="failed to get container status \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": rpc error: code = NotFound desc = could not find container \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": container with ID starting with cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649136 4917 scope.go:117] "RemoveContainer" containerID="77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649317 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} err="failed to get container status \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": rpc error: code = NotFound desc = could not find container \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": container with ID starting with 77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649343 4917 scope.go:117] "RemoveContainer" containerID="fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649521 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} err="failed to get container status \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": rpc error: code = NotFound desc = could not find container \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": container with ID starting with fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d not found: ID does not 
exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649543 4917 scope.go:117] "RemoveContainer" containerID="556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649761 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} err="failed to get container status \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": rpc error: code = NotFound desc = could not find container \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": container with ID starting with 556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649784 4917 scope.go:117] "RemoveContainer" containerID="55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649958 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} err="failed to get container status \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": rpc error: code = NotFound desc = could not find container \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": container with ID starting with 55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.649979 4917 scope.go:117] "RemoveContainer" containerID="be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.650203 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} err="failed to get container status 
\"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": rpc error: code = NotFound desc = could not find container \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": container with ID starting with be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.650227 4917 scope.go:117] "RemoveContainer" containerID="722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.650405 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} err="failed to get container status \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": rpc error: code = NotFound desc = could not find container \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": container with ID starting with 722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.650427 4917 scope.go:117] "RemoveContainer" containerID="9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.650739 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} err="failed to get container status \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": rpc error: code = NotFound desc = could not find container \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": container with ID starting with 9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.650764 4917 scope.go:117] "RemoveContainer" 
containerID="066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.650984 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a"} err="failed to get container status \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": rpc error: code = NotFound desc = could not find container \"066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a\": container with ID starting with 066de2cc521118c94679a0908931e14f2bb31acd3afc86e0a26d2b40cb89055a not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.651008 4917 scope.go:117] "RemoveContainer" containerID="cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.651186 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1"} err="failed to get container status \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": rpc error: code = NotFound desc = could not find container \"cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1\": container with ID starting with cc758e60a3cacb70e6158c78964122a1168446f8fe18090483dea5a04826e3f1 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.651208 4917 scope.go:117] "RemoveContainer" containerID="77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.657119 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd"} err="failed to get container status \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": rpc error: code = NotFound desc = could 
not find container \"77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd\": container with ID starting with 77a89cb539fbadda8e545b63cb14e9db069a0973a7bca758c9066a4f831b68fd not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.657151 4917 scope.go:117] "RemoveContainer" containerID="fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.657616 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d"} err="failed to get container status \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": rpc error: code = NotFound desc = could not find container \"fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d\": container with ID starting with fe7af4ed943de9f2c0dcf07bf498f964e4304b50bc93cc573cb04c30623ffc3d not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.657642 4917 scope.go:117] "RemoveContainer" containerID="556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.657860 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e"} err="failed to get container status \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": rpc error: code = NotFound desc = could not find container \"556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e\": container with ID starting with 556210a25706931190b1f3bb9b61dd09b405771951bc02827dd44cf085e2e37e not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.657882 4917 scope.go:117] "RemoveContainer" containerID="55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 
06:59:24.658154 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681"} err="failed to get container status \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": rpc error: code = NotFound desc = could not find container \"55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681\": container with ID starting with 55305b7ccb6826126bca16b51ecd3541257381690e576b8fc5df193adc827681 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.658202 4917 scope.go:117] "RemoveContainer" containerID="be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.658503 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1"} err="failed to get container status \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": rpc error: code = NotFound desc = could not find container \"be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1\": container with ID starting with be35c6b7462c4c1432b64716b8103ecc8c091e6012add8f6bdccb2b523ef73c1 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.658520 4917 scope.go:117] "RemoveContainer" containerID="722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.659553 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44"} err="failed to get container status \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": rpc error: code = NotFound desc = could not find container \"722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44\": container with ID starting with 
722b6cc7fd655dcd6fcaf50edc0ebb0e370d8a2808e5e538af75a551794d8b44 not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.659606 4917 scope.go:117] "RemoveContainer" containerID="9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.659973 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b"} err="failed to get container status \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": rpc error: code = NotFound desc = could not find container \"9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b\": container with ID starting with 9c40820bf69ffcffed79adce39e3ebb6076257d83cf8d3184dfc270d435fa74b not found: ID does not exist" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719052 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-ovnkube-script-lib\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719106 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-ovnkube-config\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719124 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-var-lib-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719147 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-log-socket\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719164 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-systemd\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719182 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-cni-bin\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719205 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-kubelet\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719225 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-env-overrides\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 
06:59:24.719242 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-ovn\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719261 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719282 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-node-log\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719298 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-run-netns\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719312 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da9219b4-ac36-4c9d-a457-0461ff261481-ovn-node-metrics-cert\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719328 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-slash\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719335 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-log-socket\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719366 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-cni-netd\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719393 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-systemd\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719344 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-cni-netd\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719413 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-cni-bin\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719432 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-kubelet\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719428 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-systemd-units\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719455 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-run-ovn-kubernetes\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719481 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-etc-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719496 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rx7v\" (UniqueName: 
\"kubernetes.io/projected/da9219b4-ac36-4c9d-a457-0461ff261481-kube-api-access-9rx7v\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719510 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719543 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719555 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2tv7\" (UniqueName: \"kubernetes.io/projected/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-kube-api-access-d2tv7\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719564 4917 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719587 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720156 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-env-overrides\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720198 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-run-ovn\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720220 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720241 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-node-log\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720261 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-run-netns\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720427 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-ovnkube-script-lib\") pod 
\"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720684 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-etc-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720730 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-slash\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720754 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-host-run-ovn-kubernetes\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.719453 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-systemd-units\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720825 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/da9219b4-ac36-4c9d-a457-0461ff261481-var-lib-openvswitch\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.720973 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/da9219b4-ac36-4c9d-a457-0461ff261481-ovnkube-config\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.733450 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/da9219b4-ac36-4c9d-a457-0461ff261481-ovn-node-metrics-cert\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.736446 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rx7v\" (UniqueName: \"kubernetes.io/projected/da9219b4-ac36-4c9d-a457-0461ff261481-kube-api-access-9rx7v\") pod \"ovnkube-node-fm4f4\" (UID: \"da9219b4-ac36-4c9d-a457-0461ff261481\") " pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.779717 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.804494 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f8k4j"] Mar 18 06:59:24 crc kubenswrapper[4917]: I0318 06:59:24.804540 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f8k4j"] Mar 18 06:59:24 crc kubenswrapper[4917]: W0318 06:59:24.824649 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda9219b4_ac36_4c9d_a457_0461ff261481.slice/crio-fcdd6c821e045b021aceb3c8864200935f539eb2f1feabf364e21087961d8222 WatchSource:0}: Error finding container fcdd6c821e045b021aceb3c8864200935f539eb2f1feabf364e21087961d8222: Status 404 returned error can't find the container with id fcdd6c821e045b021aceb3c8864200935f539eb2f1feabf364e21087961d8222 Mar 18 06:59:25 crc kubenswrapper[4917]: I0318 06:59:25.486019 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pp4xr_4aaf9352-3715-40a0-876a-09f4f27a41c2/kube-multus/0.log" Mar 18 06:59:25 crc kubenswrapper[4917]: I0318 06:59:25.486466 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pp4xr" event={"ID":"4aaf9352-3715-40a0-876a-09f4f27a41c2","Type":"ContainerStarted","Data":"fe6d63c95e65af89c3d2b77d791137c11cf477f82bfa251b561ceae4ac129e85"} Mar 18 06:59:25 crc kubenswrapper[4917]: I0318 06:59:25.489498 4917 generic.go:334] "Generic (PLEG): container finished" podID="da9219b4-ac36-4c9d-a457-0461ff261481" containerID="e1732d025c7837b49efa7470cca6bbd816683c8b61a71b3d6c3faaf2550e4e17" exitCode=0 Mar 18 06:59:25 crc kubenswrapper[4917]: I0318 06:59:25.489547 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" 
event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerDied","Data":"e1732d025c7837b49efa7470cca6bbd816683c8b61a71b3d6c3faaf2550e4e17"} Mar 18 06:59:25 crc kubenswrapper[4917]: I0318 06:59:25.489672 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"fcdd6c821e045b021aceb3c8864200935f539eb2f1feabf364e21087961d8222"} Mar 18 06:59:25 crc kubenswrapper[4917]: I0318 06:59:25.780724 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dd0f3cd-77e6-44b6-92e3-50740ab1fffa" path="/var/lib/kubelet/pods/5dd0f3cd-77e6-44b6-92e3-50740ab1fffa/volumes" Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.504408 4917 generic.go:334] "Generic (PLEG): container finished" podID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerID="38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca" exitCode=0 Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.504701 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dws25" event={"ID":"d92de4fc-3dbb-450d-8ab8-2182ec4f435c","Type":"ContainerDied","Data":"38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca"} Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.513458 4917 generic.go:334] "Generic (PLEG): container finished" podID="984ea079-192f-436b-bde0-0b8bc6df9fe8" containerID="f670e74d63a6066383ea159d4a2a66749784be47b4bc831fa5b46e8221d75c25" exitCode=0 Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.513655 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bd8lr" event={"ID":"984ea079-192f-436b-bde0-0b8bc6df9fe8","Type":"ContainerDied","Data":"f670e74d63a6066383ea159d4a2a66749784be47b4bc831fa5b46e8221d75c25"} Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.518924 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"07546b22f88350ad34165f8d594df90badbaf6c6f63d7b3fdb25e38e819e691b"} Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.519096 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"e51fc4828d32446b3105783ff2b0b75b634adc99fd1beb067dd448d517573e93"} Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.519265 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"80de9392ab5d7ac3b28d628ccabc98093170c1d8b632674b591a999bfc12c9cf"} Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.519418 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"db45f2f3ff406126fdc078af33c05c50759e37612911bdc8e5fe9c5447b9beda"} Mar 18 06:59:26 crc kubenswrapper[4917]: I0318 06:59:26.519520 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"018d73dafe4c38ffd158495b852beaca212256170a138cf136b042cdc453cace"} Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.530472 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"d06ca79a6d958f1aac1ed7b0dd167115c8521b8c887748298fa0e5919bcd548a"} Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.535287 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dws25" 
event={"ID":"d92de4fc-3dbb-450d-8ab8-2182ec4f435c","Type":"ContainerStarted","Data":"ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512"} Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.573432 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dws25" podStartSLOduration=3.050696249 podStartE2EDuration="5.573412784s" podCreationTimestamp="2026-03-18 06:59:22 +0000 UTC" firstStartedPulling="2026-03-18 06:59:24.474086346 +0000 UTC m=+749.415241060" lastFinishedPulling="2026-03-18 06:59:26.996802841 +0000 UTC m=+751.937957595" observedRunningTime="2026-03-18 06:59:27.565505814 +0000 UTC m=+752.506660558" watchObservedRunningTime="2026-03-18 06:59:27.573412784 +0000 UTC m=+752.514567508" Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.643848 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.761552 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/984ea079-192f-436b-bde0-0b8bc6df9fe8-node-mnt\") pod \"984ea079-192f-436b-bde0-0b8bc6df9fe8\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.761640 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/984ea079-192f-436b-bde0-0b8bc6df9fe8-crc-storage\") pod \"984ea079-192f-436b-bde0-0b8bc6df9fe8\" (UID: \"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.761673 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcdtr\" (UniqueName: \"kubernetes.io/projected/984ea079-192f-436b-bde0-0b8bc6df9fe8-kube-api-access-tcdtr\") pod \"984ea079-192f-436b-bde0-0b8bc6df9fe8\" (UID: 
\"984ea079-192f-436b-bde0-0b8bc6df9fe8\") " Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.762261 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/984ea079-192f-436b-bde0-0b8bc6df9fe8-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "984ea079-192f-436b-bde0-0b8bc6df9fe8" (UID: "984ea079-192f-436b-bde0-0b8bc6df9fe8"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.767763 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/984ea079-192f-436b-bde0-0b8bc6df9fe8-kube-api-access-tcdtr" (OuterVolumeSpecName: "kube-api-access-tcdtr") pod "984ea079-192f-436b-bde0-0b8bc6df9fe8" (UID: "984ea079-192f-436b-bde0-0b8bc6df9fe8"). InnerVolumeSpecName "kube-api-access-tcdtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.779613 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/984ea079-192f-436b-bde0-0b8bc6df9fe8-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "984ea079-192f-436b-bde0-0b8bc6df9fe8" (UID: "984ea079-192f-436b-bde0-0b8bc6df9fe8"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.863703 4917 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/984ea079-192f-436b-bde0-0b8bc6df9fe8-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.863742 4917 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/984ea079-192f-436b-bde0-0b8bc6df9fe8-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:27 crc kubenswrapper[4917]: I0318 06:59:27.863755 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcdtr\" (UniqueName: \"kubernetes.io/projected/984ea079-192f-436b-bde0-0b8bc6df9fe8-kube-api-access-tcdtr\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:28 crc kubenswrapper[4917]: I0318 06:59:28.545906 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-bd8lr" event={"ID":"984ea079-192f-436b-bde0-0b8bc6df9fe8","Type":"ContainerDied","Data":"fa3cb97d344557da827f808596709aa9652c0c3fdebd33ccff7fb746970c7442"} Mar 18 06:59:28 crc kubenswrapper[4917]: I0318 06:59:28.546340 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa3cb97d344557da827f808596709aa9652c0c3fdebd33ccff7fb746970c7442" Mar 18 06:59:28 crc kubenswrapper[4917]: I0318 06:59:28.545950 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-bd8lr" Mar 18 06:59:29 crc kubenswrapper[4917]: I0318 06:59:29.560045 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"53f0e550e9f21d3cb2d353a41376ce6242dc11025053cd3ae59100c136ba4235"} Mar 18 06:59:31 crc kubenswrapper[4917]: I0318 06:59:31.576843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" event={"ID":"da9219b4-ac36-4c9d-a457-0461ff261481","Type":"ContainerStarted","Data":"288d6059c1dc6207af165b9b80f3f77b0f0b013d9eb8c42c5284b5503e60770c"} Mar 18 06:59:31 crc kubenswrapper[4917]: I0318 06:59:31.577485 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:31 crc kubenswrapper[4917]: I0318 06:59:31.577578 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:31 crc kubenswrapper[4917]: I0318 06:59:31.577805 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:31 crc kubenswrapper[4917]: I0318 06:59:31.607212 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:31 crc kubenswrapper[4917]: I0318 06:59:31.612973 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" podStartSLOduration=7.61295838 podStartE2EDuration="7.61295838s" podCreationTimestamp="2026-03-18 06:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:59:31.60781064 +0000 UTC m=+756.548965384" watchObservedRunningTime="2026-03-18 06:59:31.61295838 +0000 UTC 
m=+756.554113094" Mar 18 06:59:31 crc kubenswrapper[4917]: I0318 06:59:31.621995 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:33 crc kubenswrapper[4917]: I0318 06:59:33.108837 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:33 crc kubenswrapper[4917]: I0318 06:59:33.109207 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:33 crc kubenswrapper[4917]: I0318 06:59:33.152530 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:33 crc kubenswrapper[4917]: I0318 06:59:33.637635 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:33 crc kubenswrapper[4917]: I0318 06:59:33.679801 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dws25"] Mar 18 06:59:35 crc kubenswrapper[4917]: I0318 06:59:35.601228 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dws25" podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerName="registry-server" containerID="cri-o://ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512" gracePeriod=2 Mar 18 06:59:35 crc kubenswrapper[4917]: I0318 06:59:35.975563 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:35 crc kubenswrapper[4917]: I0318 06:59:35.985705 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whszb\" (UniqueName: \"kubernetes.io/projected/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-kube-api-access-whszb\") pod \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " Mar 18 06:59:35 crc kubenswrapper[4917]: I0318 06:59:35.985920 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-catalog-content\") pod \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " Mar 18 06:59:35 crc kubenswrapper[4917]: I0318 06:59:35.986006 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-utilities\") pod \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\" (UID: \"d92de4fc-3dbb-450d-8ab8-2182ec4f435c\") " Mar 18 06:59:35 crc kubenswrapper[4917]: I0318 06:59:35.987705 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-utilities" (OuterVolumeSpecName: "utilities") pod "d92de4fc-3dbb-450d-8ab8-2182ec4f435c" (UID: "d92de4fc-3dbb-450d-8ab8-2182ec4f435c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:59:35 crc kubenswrapper[4917]: I0318 06:59:35.992471 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-kube-api-access-whszb" (OuterVolumeSpecName: "kube-api-access-whszb") pod "d92de4fc-3dbb-450d-8ab8-2182ec4f435c" (UID: "d92de4fc-3dbb-450d-8ab8-2182ec4f435c"). InnerVolumeSpecName "kube-api-access-whszb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.054865 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm"] Mar 18 06:59:36 crc kubenswrapper[4917]: E0318 06:59:36.056769 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerName="extract-content" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.056791 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerName="extract-content" Mar 18 06:59:36 crc kubenswrapper[4917]: E0318 06:59:36.056808 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerName="extract-utilities" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.056817 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerName="extract-utilities" Mar 18 06:59:36 crc kubenswrapper[4917]: E0318 06:59:36.056831 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="984ea079-192f-436b-bde0-0b8bc6df9fe8" containerName="storage" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.056839 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="984ea079-192f-436b-bde0-0b8bc6df9fe8" containerName="storage" Mar 18 06:59:36 crc kubenswrapper[4917]: E0318 06:59:36.056855 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerName="registry-server" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.056863 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerName="registry-server" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.056970 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerName="registry-server" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.056982 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="984ea079-192f-436b-bde0-0b8bc6df9fe8" containerName="storage" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.058228 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.061053 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.062233 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm"] Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.086148 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d92de4fc-3dbb-450d-8ab8-2182ec4f435c" (UID: "d92de4fc-3dbb-450d-8ab8-2182ec4f435c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.086895 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g2hg\" (UniqueName: \"kubernetes.io/projected/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-kube-api-access-8g2hg\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.086984 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.087010 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.087062 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whszb\" (UniqueName: \"kubernetes.io/projected/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-kube-api-access-whszb\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.087074 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.087083 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92de4fc-3dbb-450d-8ab8-2182ec4f435c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.187811 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.187857 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.187894 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g2hg\" (UniqueName: \"kubernetes.io/projected/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-kube-api-access-8g2hg\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.188317 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-bundle\") 
pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.188673 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.203010 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g2hg\" (UniqueName: \"kubernetes.io/projected/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-kube-api-access-8g2hg\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.376391 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.609405 4917 generic.go:334] "Generic (PLEG): container finished" podID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" containerID="ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512" exitCode=0 Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.609701 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dws25" event={"ID":"d92de4fc-3dbb-450d-8ab8-2182ec4f435c","Type":"ContainerDied","Data":"ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512"} Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.609727 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dws25" event={"ID":"d92de4fc-3dbb-450d-8ab8-2182ec4f435c","Type":"ContainerDied","Data":"b43c62ef5bae855387e82856ffcf2d0551499b8094dcd97cd8514c94d74f000a"} Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.609772 4917 scope.go:117] "RemoveContainer" containerID="ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.609884 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dws25" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.632780 4917 scope.go:117] "RemoveContainer" containerID="38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.645742 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dws25"] Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.651051 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dws25"] Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.662659 4917 scope.go:117] "RemoveContainer" containerID="6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.682089 4917 scope.go:117] "RemoveContainer" containerID="ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512" Mar 18 06:59:36 crc kubenswrapper[4917]: E0318 06:59:36.682674 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512\": container with ID starting with ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512 not found: ID does not exist" containerID="ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.682707 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512"} err="failed to get container status \"ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512\": rpc error: code = NotFound desc = could not find container \"ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512\": container with ID starting with ea594ee51587ecf993a06a508cba02686787ba914d2f1aad1bce7a466822d512 not 
found: ID does not exist" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.682732 4917 scope.go:117] "RemoveContainer" containerID="38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca" Mar 18 06:59:36 crc kubenswrapper[4917]: E0318 06:59:36.683024 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca\": container with ID starting with 38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca not found: ID does not exist" containerID="38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.683124 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca"} err="failed to get container status \"38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca\": rpc error: code = NotFound desc = could not find container \"38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca\": container with ID starting with 38a2e05e54a4a00fba0cf7cbf55a8baec91533af41487f7c7449c3718818dcca not found: ID does not exist" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.683141 4917 scope.go:117] "RemoveContainer" containerID="6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f" Mar 18 06:59:36 crc kubenswrapper[4917]: E0318 06:59:36.683418 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f\": container with ID starting with 6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f not found: ID does not exist" containerID="6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.683442 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f"} err="failed to get container status \"6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f\": rpc error: code = NotFound desc = could not find container \"6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f\": container with ID starting with 6325247ecc87542e6a2adcb1f37f49b935e178d1b3f7bb35727c8eb6f7e0f55f not found: ID does not exist" Mar 18 06:59:36 crc kubenswrapper[4917]: I0318 06:59:36.834894 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm"] Mar 18 06:59:36 crc kubenswrapper[4917]: W0318 06:59:36.840991 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04e01d3a_3bf7_4904_9bd7_131a3f9ca046.slice/crio-d29b5fd0573cf4ba34c45a5dd5cae6e323af259e067694722ce77f2178c1ed1a WatchSource:0}: Error finding container d29b5fd0573cf4ba34c45a5dd5cae6e323af259e067694722ce77f2178c1ed1a: Status 404 returned error can't find the container with id d29b5fd0573cf4ba34c45a5dd5cae6e323af259e067694722ce77f2178c1ed1a Mar 18 06:59:37 crc kubenswrapper[4917]: I0318 06:59:37.620764 4917 generic.go:334] "Generic (PLEG): container finished" podID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerID="219721b397d7af344cabaf53fdeb20e377634ea9c64a1684242e293169272540" exitCode=0 Mar 18 06:59:37 crc kubenswrapper[4917]: I0318 06:59:37.621541 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" event={"ID":"04e01d3a-3bf7-4904-9bd7-131a3f9ca046","Type":"ContainerDied","Data":"219721b397d7af344cabaf53fdeb20e377634ea9c64a1684242e293169272540"} Mar 18 06:59:37 crc kubenswrapper[4917]: I0318 06:59:37.621694 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" event={"ID":"04e01d3a-3bf7-4904-9bd7-131a3f9ca046","Type":"ContainerStarted","Data":"d29b5fd0573cf4ba34c45a5dd5cae6e323af259e067694722ce77f2178c1ed1a"} Mar 18 06:59:37 crc kubenswrapper[4917]: I0318 06:59:37.785765 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92de4fc-3dbb-450d-8ab8-2182ec4f435c" path="/var/lib/kubelet/pods/d92de4fc-3dbb-450d-8ab8-2182ec4f435c/volumes" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.397658 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9nhxf"] Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.399439 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.423189 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9nhxf"] Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.532011 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-catalog-content\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.532071 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ltq\" (UniqueName: \"kubernetes.io/projected/b6dc2e1b-ede2-408d-b990-2787b951fcc6-kube-api-access-p6ltq\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.532228 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-utilities\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.633170 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-catalog-content\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.633251 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ltq\" (UniqueName: \"kubernetes.io/projected/b6dc2e1b-ede2-408d-b990-2787b951fcc6-kube-api-access-p6ltq\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.633303 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-utilities\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.633840 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-utilities\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.633840 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-catalog-content\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.638334 4917 generic.go:334] "Generic (PLEG): container finished" podID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerID="f4032cd1ecaf8993ef0bfff8132da3c8aaeedd960c642b8f9db891c5ebca4444" exitCode=0 Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.638386 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" event={"ID":"04e01d3a-3bf7-4904-9bd7-131a3f9ca046","Type":"ContainerDied","Data":"f4032cd1ecaf8993ef0bfff8132da3c8aaeedd960c642b8f9db891c5ebca4444"} Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.657829 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ltq\" (UniqueName: \"kubernetes.io/projected/b6dc2e1b-ede2-408d-b990-2787b951fcc6-kube-api-access-p6ltq\") pod \"redhat-operators-9nhxf\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.746261 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:39 crc kubenswrapper[4917]: I0318 06:59:39.963363 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9nhxf"] Mar 18 06:59:39 crc kubenswrapper[4917]: W0318 06:59:39.972024 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dc2e1b_ede2_408d_b990_2787b951fcc6.slice/crio-60cf2f821da466525d8f2ff0e60446d5fa5c93fbccf7dac92a81fcf74a661d08 WatchSource:0}: Error finding container 60cf2f821da466525d8f2ff0e60446d5fa5c93fbccf7dac92a81fcf74a661d08: Status 404 returned error can't find the container with id 60cf2f821da466525d8f2ff0e60446d5fa5c93fbccf7dac92a81fcf74a661d08 Mar 18 06:59:40 crc kubenswrapper[4917]: I0318 06:59:40.644152 4917 generic.go:334] "Generic (PLEG): container finished" podID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerID="2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e" exitCode=0 Mar 18 06:59:40 crc kubenswrapper[4917]: I0318 06:59:40.644260 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9nhxf" event={"ID":"b6dc2e1b-ede2-408d-b990-2787b951fcc6","Type":"ContainerDied","Data":"2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e"} Mar 18 06:59:40 crc kubenswrapper[4917]: I0318 06:59:40.644527 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9nhxf" event={"ID":"b6dc2e1b-ede2-408d-b990-2787b951fcc6","Type":"ContainerStarted","Data":"60cf2f821da466525d8f2ff0e60446d5fa5c93fbccf7dac92a81fcf74a661d08"} Mar 18 06:59:40 crc kubenswrapper[4917]: I0318 06:59:40.647341 4917 generic.go:334] "Generic (PLEG): container finished" podID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerID="79a19be2c8f9ade9a4d14755214921f69a9fe6a597d92002ec3c575bb6aa4401" exitCode=0 Mar 18 06:59:40 crc kubenswrapper[4917]: I0318 06:59:40.647418 
4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" event={"ID":"04e01d3a-3bf7-4904-9bd7-131a3f9ca046","Type":"ContainerDied","Data":"79a19be2c8f9ade9a4d14755214921f69a9fe6a597d92002ec3c575bb6aa4401"} Mar 18 06:59:41 crc kubenswrapper[4917]: I0318 06:59:41.656752 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9nhxf" event={"ID":"b6dc2e1b-ede2-408d-b990-2787b951fcc6","Type":"ContainerStarted","Data":"ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6"} Mar 18 06:59:41 crc kubenswrapper[4917]: I0318 06:59:41.927754 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:41 crc kubenswrapper[4917]: I0318 06:59:41.966547 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-bundle\") pod \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " Mar 18 06:59:41 crc kubenswrapper[4917]: I0318 06:59:41.966771 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-util\") pod \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " Mar 18 06:59:41 crc kubenswrapper[4917]: I0318 06:59:41.966837 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g2hg\" (UniqueName: \"kubernetes.io/projected/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-kube-api-access-8g2hg\") pod \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\" (UID: \"04e01d3a-3bf7-4904-9bd7-131a3f9ca046\") " Mar 18 06:59:41 crc kubenswrapper[4917]: I0318 06:59:41.967491 4917 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-bundle" (OuterVolumeSpecName: "bundle") pod "04e01d3a-3bf7-4904-9bd7-131a3f9ca046" (UID: "04e01d3a-3bf7-4904-9bd7-131a3f9ca046"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:59:41 crc kubenswrapper[4917]: I0318 06:59:41.973790 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-kube-api-access-8g2hg" (OuterVolumeSpecName: "kube-api-access-8g2hg") pod "04e01d3a-3bf7-4904-9bd7-131a3f9ca046" (UID: "04e01d3a-3bf7-4904-9bd7-131a3f9ca046"). InnerVolumeSpecName "kube-api-access-8g2hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 06:59:41 crc kubenswrapper[4917]: I0318 06:59:41.996039 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-util" (OuterVolumeSpecName: "util") pod "04e01d3a-3bf7-4904-9bd7-131a3f9ca046" (UID: "04e01d3a-3bf7-4904-9bd7-131a3f9ca046"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 06:59:42 crc kubenswrapper[4917]: I0318 06:59:42.068837 4917 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-util\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:42 crc kubenswrapper[4917]: I0318 06:59:42.069073 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g2hg\" (UniqueName: \"kubernetes.io/projected/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-kube-api-access-8g2hg\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:42 crc kubenswrapper[4917]: I0318 06:59:42.069166 4917 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/04e01d3a-3bf7-4904-9bd7-131a3f9ca046-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 06:59:42 crc kubenswrapper[4917]: I0318 06:59:42.667765 4917 generic.go:334] "Generic (PLEG): container finished" podID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerID="ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6" exitCode=0 Mar 18 06:59:42 crc kubenswrapper[4917]: I0318 06:59:42.667843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9nhxf" event={"ID":"b6dc2e1b-ede2-408d-b990-2787b951fcc6","Type":"ContainerDied","Data":"ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6"} Mar 18 06:59:42 crc kubenswrapper[4917]: I0318 06:59:42.673683 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" event={"ID":"04e01d3a-3bf7-4904-9bd7-131a3f9ca046","Type":"ContainerDied","Data":"d29b5fd0573cf4ba34c45a5dd5cae6e323af259e067694722ce77f2178c1ed1a"} Mar 18 06:59:42 crc kubenswrapper[4917]: I0318 06:59:42.673728 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d29b5fd0573cf4ba34c45a5dd5cae6e323af259e067694722ce77f2178c1ed1a" Mar 18 
06:59:42 crc kubenswrapper[4917]: I0318 06:59:42.673840 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm" Mar 18 06:59:43 crc kubenswrapper[4917]: I0318 06:59:43.683793 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9nhxf" event={"ID":"b6dc2e1b-ede2-408d-b990-2787b951fcc6","Type":"ContainerStarted","Data":"89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795"} Mar 18 06:59:43 crc kubenswrapper[4917]: I0318 06:59:43.706063 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9nhxf" podStartSLOduration=2.244002019 podStartE2EDuration="4.706040215s" podCreationTimestamp="2026-03-18 06:59:39 +0000 UTC" firstStartedPulling="2026-03-18 06:59:40.645969786 +0000 UTC m=+765.587124500" lastFinishedPulling="2026-03-18 06:59:43.108007932 +0000 UTC m=+768.049162696" observedRunningTime="2026-03-18 06:59:43.704998129 +0000 UTC m=+768.646152883" watchObservedRunningTime="2026-03-18 06:59:43.706040215 +0000 UTC m=+768.647194939" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.912158 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4446x"] Mar 18 06:59:45 crc kubenswrapper[4917]: E0318 06:59:45.912389 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerName="util" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.912403 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerName="util" Mar 18 06:59:45 crc kubenswrapper[4917]: E0318 06:59:45.912418 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerName="pull" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.912426 4917 
state_mem.go:107] "Deleted CPUSet assignment" podUID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerName="pull" Mar 18 06:59:45 crc kubenswrapper[4917]: E0318 06:59:45.912449 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerName="extract" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.912457 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerName="extract" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.912573 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e01d3a-3bf7-4904-9bd7-131a3f9ca046" containerName="extract" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.913068 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4446x" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.916257 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.916318 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-n2rdk" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.919045 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 18 06:59:45 crc kubenswrapper[4917]: I0318 06:59:45.924277 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4446x"] Mar 18 06:59:46 crc kubenswrapper[4917]: I0318 06:59:46.021640 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wr6\" (UniqueName: \"kubernetes.io/projected/5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb-kube-api-access-88wr6\") pod \"nmstate-operator-796d4cfff4-4446x\" (UID: \"5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb\") " 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-4446x" Mar 18 06:59:46 crc kubenswrapper[4917]: I0318 06:59:46.123505 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88wr6\" (UniqueName: \"kubernetes.io/projected/5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb-kube-api-access-88wr6\") pod \"nmstate-operator-796d4cfff4-4446x\" (UID: \"5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4446x" Mar 18 06:59:46 crc kubenswrapper[4917]: I0318 06:59:46.150009 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wr6\" (UniqueName: \"kubernetes.io/projected/5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb-kube-api-access-88wr6\") pod \"nmstate-operator-796d4cfff4-4446x\" (UID: \"5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4446x" Mar 18 06:59:46 crc kubenswrapper[4917]: I0318 06:59:46.231177 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4446x" Mar 18 06:59:46 crc kubenswrapper[4917]: I0318 06:59:46.727113 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4446x"] Mar 18 06:59:46 crc kubenswrapper[4917]: W0318 06:59:46.736360 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0173d1_0d7d_4b19_b01f_b5f6d8a543eb.slice/crio-8f7225e188f05954667fce11a71a22dbed8ac2509fc682ac0be2b4ef2d1bf4c8 WatchSource:0}: Error finding container 8f7225e188f05954667fce11a71a22dbed8ac2509fc682ac0be2b4ef2d1bf4c8: Status 404 returned error can't find the container with id 8f7225e188f05954667fce11a71a22dbed8ac2509fc682ac0be2b4ef2d1bf4c8 Mar 18 06:59:47 crc kubenswrapper[4917]: I0318 06:59:47.707839 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4446x" event={"ID":"5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb","Type":"ContainerStarted","Data":"8f7225e188f05954667fce11a71a22dbed8ac2509fc682ac0be2b4ef2d1bf4c8"} Mar 18 06:59:49 crc kubenswrapper[4917]: I0318 06:59:49.728059 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4446x" event={"ID":"5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb","Type":"ContainerStarted","Data":"ee5f17fd1a40d0e66eeba340e39efebe06c66177e9a43f0a26547e5617c8156c"} Mar 18 06:59:49 crc kubenswrapper[4917]: I0318 06:59:49.746720 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:49 crc kubenswrapper[4917]: I0318 06:59:49.746809 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:49 crc kubenswrapper[4917]: I0318 06:59:49.757037 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-operator-796d4cfff4-4446x" podStartSLOduration=2.307756826 podStartE2EDuration="4.757019409s" podCreationTimestamp="2026-03-18 06:59:45 +0000 UTC" firstStartedPulling="2026-03-18 06:59:46.738836607 +0000 UTC m=+771.679991321" lastFinishedPulling="2026-03-18 06:59:49.18809919 +0000 UTC m=+774.129253904" observedRunningTime="2026-03-18 06:59:49.753204623 +0000 UTC m=+774.694359357" watchObservedRunningTime="2026-03-18 06:59:49.757019409 +0000 UTC m=+774.698174133" Mar 18 06:59:50 crc kubenswrapper[4917]: I0318 06:59:50.815593 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9nhxf" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="registry-server" probeResult="failure" output=< Mar 18 06:59:50 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 06:59:50 crc kubenswrapper[4917]: > Mar 18 06:59:54 crc kubenswrapper[4917]: I0318 06:59:54.820900 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fm4f4" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.006080 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-ldchq"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.007215 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.011929 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.013203 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.016326 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-xs8tk" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.016716 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.027624 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-c9pr9"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.029067 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.042960 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-ldchq"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.070759 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.138130 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.139340 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.147934 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.148261 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.148532 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.148695 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-lzf2t" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.162608 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5vk5\" (UniqueName: \"kubernetes.io/projected/cf995f1c-4cea-4dd8-9283-161f3e9b3a4d-kube-api-access-d5vk5\") pod \"nmstate-webhook-5f558f5558-ldchq\" (UID: \"cf995f1c-4cea-4dd8-9283-161f3e9b3a4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.162682 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-ovs-socket\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.162878 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-nmstate-lock\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " 
pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.162933 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-dbus-socket\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.162988 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cf995f1c-4cea-4dd8-9283-161f3e9b3a4d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-ldchq\" (UID: \"cf995f1c-4cea-4dd8-9283-161f3e9b3a4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.163081 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55grp\" (UniqueName: \"kubernetes.io/projected/d92256d3-2d78-4985-944e-273ad62ce8f2-kube-api-access-55grp\") pod \"nmstate-metrics-9b8c8685d-qqdsm\" (UID: \"d92256d3-2d78-4985-944e-273ad62ce8f2\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.163150 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffs82\" (UniqueName: \"kubernetes.io/projected/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-kube-api-access-ffs82\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272084 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-nmstate-lock\") pod \"nmstate-handler-c9pr9\" (UID: 
\"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272165 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25n4\" (UniqueName: \"kubernetes.io/projected/c796ff9e-9db2-437c-92f6-fecb8ff9df67-kube-api-access-j25n4\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272223 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-dbus-socket\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272227 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-nmstate-lock\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272258 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cf995f1c-4cea-4dd8-9283-161f3e9b3a4d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-ldchq\" (UID: \"cf995f1c-4cea-4dd8-9283-161f3e9b3a4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272317 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55grp\" (UniqueName: \"kubernetes.io/projected/d92256d3-2d78-4985-944e-273ad62ce8f2-kube-api-access-55grp\") pod \"nmstate-metrics-9b8c8685d-qqdsm\" 
(UID: \"d92256d3-2d78-4985-944e-273ad62ce8f2\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272345 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffs82\" (UniqueName: \"kubernetes.io/projected/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-kube-api-access-ffs82\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272383 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5vk5\" (UniqueName: \"kubernetes.io/projected/cf995f1c-4cea-4dd8-9283-161f3e9b3a4d-kube-api-access-d5vk5\") pod \"nmstate-webhook-5f558f5558-ldchq\" (UID: \"cf995f1c-4cea-4dd8-9283-161f3e9b3a4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272404 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c796ff9e-9db2-437c-92f6-fecb8ff9df67-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272458 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-ovs-socket\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272485 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/c796ff9e-9db2-437c-92f6-fecb8ff9df67-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272508 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-dbus-socket\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.272630 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-ovs-socket\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.287764 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/cf995f1c-4cea-4dd8-9283-161f3e9b3a4d-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-ldchq\" (UID: \"cf995f1c-4cea-4dd8-9283-161f3e9b3a4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.288062 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5vk5\" (UniqueName: \"kubernetes.io/projected/cf995f1c-4cea-4dd8-9283-161f3e9b3a4d-kube-api-access-d5vk5\") pod \"nmstate-webhook-5f558f5558-ldchq\" (UID: \"cf995f1c-4cea-4dd8-9283-161f3e9b3a4d\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.291214 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55grp\" (UniqueName: 
\"kubernetes.io/projected/d92256d3-2d78-4985-944e-273ad62ce8f2-kube-api-access-55grp\") pod \"nmstate-metrics-9b8c8685d-qqdsm\" (UID: \"d92256d3-2d78-4985-944e-273ad62ce8f2\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.309152 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffs82\" (UniqueName: \"kubernetes.io/projected/89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97-kube-api-access-ffs82\") pod \"nmstate-handler-c9pr9\" (UID: \"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97\") " pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.326308 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cb88c8f89-88jxw"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.326963 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.338468 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.351407 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cb88c8f89-88jxw"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.374193 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c796ff9e-9db2-437c-92f6-fecb8ff9df67-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.374259 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c796ff9e-9db2-437c-92f6-fecb8ff9df67-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.374291 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j25n4\" (UniqueName: \"kubernetes.io/projected/c796ff9e-9db2-437c-92f6-fecb8ff9df67-kube-api-access-j25n4\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.379437 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c796ff9e-9db2-437c-92f6-fecb8ff9df67-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.380059 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c796ff9e-9db2-437c-92f6-fecb8ff9df67-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.384141 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.388070 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25n4\" (UniqueName: \"kubernetes.io/projected/c796ff9e-9db2-437c-92f6-fecb8ff9df67-kube-api-access-j25n4\") pod \"nmstate-console-plugin-86f58fcf4-sn24n\" (UID: \"c796ff9e-9db2-437c-92f6-fecb8ff9df67\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.389722 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.462273 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.475799 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-oauth-config\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.475884 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-service-ca\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.475918 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-serving-cert\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.475934 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-oauth-serving-cert\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.475953 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-trusted-ca-bundle\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.475981 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-config\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.475996 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqnxw\" (UniqueName: \"kubernetes.io/projected/1e83cb6c-a259-4b64-bc6d-726b0b495217-kube-api-access-lqnxw\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.577864 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-oauth-config\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.582200 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-service-ca\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.582331 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-serving-cert\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.582352 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-oauth-serving-cert\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.583034 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-service-ca\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.583091 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-trusted-ca-bundle\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.583401 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-oauth-serving-cert\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.583830 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-trusted-ca-bundle\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.583865 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-config\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.583884 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqnxw\" (UniqueName: \"kubernetes.io/projected/1e83cb6c-a259-4b64-bc6d-726b0b495217-kube-api-access-lqnxw\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.584041 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-config\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.585815 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-serving-cert\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.585866 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1e83cb6c-a259-4b64-bc6d-726b0b495217-console-oauth-config\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.597907 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqnxw\" (UniqueName: \"kubernetes.io/projected/1e83cb6c-a259-4b64-bc6d-726b0b495217-kube-api-access-lqnxw\") pod \"console-7cb88c8f89-88jxw\" (UID: \"1e83cb6c-a259-4b64-bc6d-726b0b495217\") " pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.771941 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.784618 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-ldchq"] Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.786610 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-c9pr9" event={"ID":"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97","Type":"ContainerStarted","Data":"12aef7299e85df8de6bcc9757a3cf97d6ba1f4c226d4d757b29ff76b73cfd0bc"} Mar 18 06:59:56 crc kubenswrapper[4917]: W0318 06:59:56.795779 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf995f1c_4cea_4dd8_9283_161f3e9b3a4d.slice/crio-e8242af624d465ab9e74905478e500138128bc828c21ec3256f87c3846047fb5 WatchSource:0}: Error finding container e8242af624d465ab9e74905478e500138128bc828c21ec3256f87c3846047fb5: Status 404 returned error can't find the container with id e8242af624d465ab9e74905478e500138128bc828c21ec3256f87c3846047fb5 Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.841782 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm"] Mar 18 06:59:56 crc kubenswrapper[4917]: W0318 06:59:56.847099 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd92256d3_2d78_4985_944e_273ad62ce8f2.slice/crio-43e5dc8ae30c8ed7465dec7d277cea6c9ac372f301404b70eea508facfacfb99 WatchSource:0}: Error finding container 43e5dc8ae30c8ed7465dec7d277cea6c9ac372f301404b70eea508facfacfb99: Status 404 returned error can't find the container with id 43e5dc8ae30c8ed7465dec7d277cea6c9ac372f301404b70eea508facfacfb99 Mar 18 06:59:56 crc kubenswrapper[4917]: I0318 06:59:56.895537 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n"] Mar 18 06:59:56 crc kubenswrapper[4917]: W0318 06:59:56.911212 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc796ff9e_9db2_437c_92f6_fecb8ff9df67.slice/crio-c31a108ff8405b2a406b22df004c05642072deb39092c0138b72ba141f6f65c8 WatchSource:0}: Error finding container c31a108ff8405b2a406b22df004c05642072deb39092c0138b72ba141f6f65c8: Status 404 returned error can't find the container with id c31a108ff8405b2a406b22df004c05642072deb39092c0138b72ba141f6f65c8 Mar 18 06:59:57 crc kubenswrapper[4917]: I0318 06:59:57.029201 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cb88c8f89-88jxw"] Mar 18 06:59:57 crc kubenswrapper[4917]: W0318 06:59:57.032843 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e83cb6c_a259_4b64_bc6d_726b0b495217.slice/crio-50df828eb79625f6d1674a180b99babd08edcbfd2b0373492c3bbfc10ec9ec03 WatchSource:0}: Error finding container 50df828eb79625f6d1674a180b99babd08edcbfd2b0373492c3bbfc10ec9ec03: Status 404 returned error can't find the container with id 
50df828eb79625f6d1674a180b99babd08edcbfd2b0373492c3bbfc10ec9ec03 Mar 18 06:59:57 crc kubenswrapper[4917]: I0318 06:59:57.795286 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" event={"ID":"c796ff9e-9db2-437c-92f6-fecb8ff9df67","Type":"ContainerStarted","Data":"c31a108ff8405b2a406b22df004c05642072deb39092c0138b72ba141f6f65c8"} Mar 18 06:59:57 crc kubenswrapper[4917]: I0318 06:59:57.796903 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" event={"ID":"d92256d3-2d78-4985-944e-273ad62ce8f2","Type":"ContainerStarted","Data":"43e5dc8ae30c8ed7465dec7d277cea6c9ac372f301404b70eea508facfacfb99"} Mar 18 06:59:57 crc kubenswrapper[4917]: I0318 06:59:57.799243 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cb88c8f89-88jxw" event={"ID":"1e83cb6c-a259-4b64-bc6d-726b0b495217","Type":"ContainerStarted","Data":"05ea6292914cd4f833f8f6bfefa12883f8091bccb3d3529edb7cad99291fd142"} Mar 18 06:59:57 crc kubenswrapper[4917]: I0318 06:59:57.799294 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cb88c8f89-88jxw" event={"ID":"1e83cb6c-a259-4b64-bc6d-726b0b495217","Type":"ContainerStarted","Data":"50df828eb79625f6d1674a180b99babd08edcbfd2b0373492c3bbfc10ec9ec03"} Mar 18 06:59:57 crc kubenswrapper[4917]: I0318 06:59:57.800991 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" event={"ID":"cf995f1c-4cea-4dd8-9283-161f3e9b3a4d","Type":"ContainerStarted","Data":"e8242af624d465ab9e74905478e500138128bc828c21ec3256f87c3846047fb5"} Mar 18 06:59:57 crc kubenswrapper[4917]: I0318 06:59:57.826068 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cb88c8f89-88jxw" podStartSLOduration=1.826039537 podStartE2EDuration="1.826039537s" podCreationTimestamp="2026-03-18 06:59:56 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 06:59:57.820272211 +0000 UTC m=+782.761426935" watchObservedRunningTime="2026-03-18 06:59:57.826039537 +0000 UTC m=+782.767194291" Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.786484 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.817302 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" event={"ID":"c796ff9e-9db2-437c-92f6-fecb8ff9df67","Type":"ContainerStarted","Data":"eb01777fbcb021c8ceac7bf6a3057b17735ca82440cf767bf60142453ddedbf9"} Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.848641 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" event={"ID":"d92256d3-2d78-4985-944e-273ad62ce8f2","Type":"ContainerStarted","Data":"188f5dcfd17a84b30f4ce2146dfe89ab6e7c83cc5336ef11e1bfb895e2408b8d"} Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.854799 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-sn24n" podStartSLOduration=1.342951854 podStartE2EDuration="3.854777433s" podCreationTimestamp="2026-03-18 06:59:56 +0000 UTC" firstStartedPulling="2026-03-18 06:59:56.918555352 +0000 UTC m=+781.859710066" lastFinishedPulling="2026-03-18 06:59:59.430380901 +0000 UTC m=+784.371535645" observedRunningTime="2026-03-18 06:59:59.853240564 +0000 UTC m=+784.794395288" watchObservedRunningTime="2026-03-18 06:59:59.854777433 +0000 UTC m=+784.795932147" Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.867219 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-c9pr9" 
event={"ID":"89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97","Type":"ContainerStarted","Data":"1f866dd1d931fca7fa2f866f1b980c55629b3173b763c43cef187f9438ae0876"} Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.867638 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.869433 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" event={"ID":"cf995f1c-4cea-4dd8-9283-161f3e9b3a4d","Type":"ContainerStarted","Data":"0463ce5b2ed34033f0474e59c3ea0e5b365af3ecadd03507f5aac5bef052d31a"} Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.869564 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.909071 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.926335 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" podStartSLOduration=2.2720576550000002 podStartE2EDuration="4.926317788s" podCreationTimestamp="2026-03-18 06:59:55 +0000 UTC" firstStartedPulling="2026-03-18 06:59:56.801416566 +0000 UTC m=+781.742571280" lastFinishedPulling="2026-03-18 06:59:59.455676649 +0000 UTC m=+784.396831413" observedRunningTime="2026-03-18 06:59:59.926055432 +0000 UTC m=+784.867210156" watchObservedRunningTime="2026-03-18 06:59:59.926317788 +0000 UTC m=+784.867472512" Mar 18 06:59:59 crc kubenswrapper[4917]: I0318 06:59:59.929356 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-c9pr9" podStartSLOduration=1.920484891 podStartE2EDuration="4.929343195s" podCreationTimestamp="2026-03-18 06:59:55 +0000 UTC" 
firstStartedPulling="2026-03-18 06:59:56.42443368 +0000 UTC m=+781.365588394" lastFinishedPulling="2026-03-18 06:59:59.433291944 +0000 UTC m=+784.374446698" observedRunningTime="2026-03-18 06:59:59.89509296 +0000 UTC m=+784.836247684" watchObservedRunningTime="2026-03-18 06:59:59.929343195 +0000 UTC m=+784.870497919" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.036076 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9nhxf"] Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.130983 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563620-rvlqw"] Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.132186 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563620-rvlqw" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.133792 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfgpz\" (UniqueName: \"kubernetes.io/projected/1438dd56-616a-4ae5-98eb-451e7e0349db-kube-api-access-lfgpz\") pod \"auto-csr-approver-29563620-rvlqw\" (UID: \"1438dd56-616a-4ae5-98eb-451e7e0349db\") " pod="openshift-infra/auto-csr-approver-29563620-rvlqw" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.135772 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.135916 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl"] Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.136553 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.141988 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.142416 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.142509 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563620-rvlqw"] Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.142475 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.142723 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.165908 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl"] Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.234948 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfgpz\" (UniqueName: \"kubernetes.io/projected/1438dd56-616a-4ae5-98eb-451e7e0349db-kube-api-access-lfgpz\") pod \"auto-csr-approver-29563620-rvlqw\" (UID: \"1438dd56-616a-4ae5-98eb-451e7e0349db\") " pod="openshift-infra/auto-csr-approver-29563620-rvlqw" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.253391 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfgpz\" (UniqueName: \"kubernetes.io/projected/1438dd56-616a-4ae5-98eb-451e7e0349db-kube-api-access-lfgpz\") pod \"auto-csr-approver-29563620-rvlqw\" (UID: \"1438dd56-616a-4ae5-98eb-451e7e0349db\") 
" pod="openshift-infra/auto-csr-approver-29563620-rvlqw" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.336230 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11225532-26d4-47aa-80bc-077720b847bd-config-volume\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.336278 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11225532-26d4-47aa-80bc-077720b847bd-secret-volume\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.336303 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qvhg\" (UniqueName: \"kubernetes.io/projected/11225532-26d4-47aa-80bc-077720b847bd-kube-api-access-5qvhg\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.438175 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11225532-26d4-47aa-80bc-077720b847bd-config-volume\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.438222 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/11225532-26d4-47aa-80bc-077720b847bd-secret-volume\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.438249 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qvhg\" (UniqueName: \"kubernetes.io/projected/11225532-26d4-47aa-80bc-077720b847bd-kube-api-access-5qvhg\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.441129 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11225532-26d4-47aa-80bc-077720b847bd-config-volume\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.445276 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11225532-26d4-47aa-80bc-077720b847bd-secret-volume\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.460696 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563620-rvlqw" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.476960 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qvhg\" (UniqueName: \"kubernetes.io/projected/11225532-26d4-47aa-80bc-077720b847bd-kube-api-access-5qvhg\") pod \"collect-profiles-29563620-pnxkl\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.707024 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563620-rvlqw"] Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.774839 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.878022 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9nhxf" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="registry-server" containerID="cri-o://89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795" gracePeriod=2 Mar 18 07:00:00 crc kubenswrapper[4917]: I0318 07:00:00.878204 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563620-rvlqw" event={"ID":"1438dd56-616a-4ae5-98eb-451e7e0349db","Type":"ContainerStarted","Data":"7fa0a16046226d588890d7353dfc2954fbe2e292cee12a9a5c1ee5d7c204c8d6"} Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.200520 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl"] Mar 18 07:00:01 crc kubenswrapper[4917]: W0318 07:00:01.208960 4917 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11225532_26d4_47aa_80bc_077720b847bd.slice/crio-8348914f69e4c82bf5c49e0c6ebe7089f5b1e5543f7549fe129bfcffca2f97d8 WatchSource:0}: Error finding container 8348914f69e4c82bf5c49e0c6ebe7089f5b1e5543f7549fe129bfcffca2f97d8: Status 404 returned error can't find the container with id 8348914f69e4c82bf5c49e0c6ebe7089f5b1e5543f7549fe129bfcffca2f97d8 Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.226373 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.348864 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6ltq\" (UniqueName: \"kubernetes.io/projected/b6dc2e1b-ede2-408d-b990-2787b951fcc6-kube-api-access-p6ltq\") pod \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.348945 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-catalog-content\") pod \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.348981 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-utilities\") pod \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\" (UID: \"b6dc2e1b-ede2-408d-b990-2787b951fcc6\") " Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.349893 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-utilities" (OuterVolumeSpecName: "utilities") pod "b6dc2e1b-ede2-408d-b990-2787b951fcc6" (UID: 
"b6dc2e1b-ede2-408d-b990-2787b951fcc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.355565 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6dc2e1b-ede2-408d-b990-2787b951fcc6-kube-api-access-p6ltq" (OuterVolumeSpecName: "kube-api-access-p6ltq") pod "b6dc2e1b-ede2-408d-b990-2787b951fcc6" (UID: "b6dc2e1b-ede2-408d-b990-2787b951fcc6"). InnerVolumeSpecName "kube-api-access-p6ltq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.450967 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6ltq\" (UniqueName: \"kubernetes.io/projected/b6dc2e1b-ede2-408d-b990-2787b951fcc6-kube-api-access-p6ltq\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.451309 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.474756 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6dc2e1b-ede2-408d-b990-2787b951fcc6" (UID: "b6dc2e1b-ede2-408d-b990-2787b951fcc6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.552661 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6dc2e1b-ede2-408d-b990-2787b951fcc6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.885842 4917 generic.go:334] "Generic (PLEG): container finished" podID="11225532-26d4-47aa-80bc-077720b847bd" containerID="431fdcead3cff13ff87c87a4261522916c371ae85df4343ab3dfc8b6632e3303" exitCode=0 Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.886011 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" event={"ID":"11225532-26d4-47aa-80bc-077720b847bd","Type":"ContainerDied","Data":"431fdcead3cff13ff87c87a4261522916c371ae85df4343ab3dfc8b6632e3303"} Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.887228 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" event={"ID":"11225532-26d4-47aa-80bc-077720b847bd","Type":"ContainerStarted","Data":"8348914f69e4c82bf5c49e0c6ebe7089f5b1e5543f7549fe129bfcffca2f97d8"} Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.891813 4917 generic.go:334] "Generic (PLEG): container finished" podID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerID="89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795" exitCode=0 Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.891880 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9nhxf" event={"ID":"b6dc2e1b-ede2-408d-b990-2787b951fcc6","Type":"ContainerDied","Data":"89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795"} Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.891919 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9nhxf" event={"ID":"b6dc2e1b-ede2-408d-b990-2787b951fcc6","Type":"ContainerDied","Data":"60cf2f821da466525d8f2ff0e60446d5fa5c93fbccf7dac92a81fcf74a661d08"} Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.891949 4917 scope.go:117] "RemoveContainer" containerID="89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.892171 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9nhxf" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.923657 4917 scope.go:117] "RemoveContainer" containerID="ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.927055 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9nhxf"] Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.933973 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9nhxf"] Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.949277 4917 scope.go:117] "RemoveContainer" containerID="2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.978476 4917 scope.go:117] "RemoveContainer" containerID="89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795" Mar 18 07:00:01 crc kubenswrapper[4917]: E0318 07:00:01.978905 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795\": container with ID starting with 89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795 not found: ID does not exist" containerID="89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.978953 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795"} err="failed to get container status \"89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795\": rpc error: code = NotFound desc = could not find container \"89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795\": container with ID starting with 89d3a85071433149c532469219ff5f2631f131c8d1086780484459064e96e795 not found: ID does not exist" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.978988 4917 scope.go:117] "RemoveContainer" containerID="ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6" Mar 18 07:00:01 crc kubenswrapper[4917]: E0318 07:00:01.979368 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6\": container with ID starting with ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6 not found: ID does not exist" containerID="ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.979548 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6"} err="failed to get container status \"ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6\": rpc error: code = NotFound desc = could not find container \"ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6\": container with ID starting with ea81eb220043911970c11c93278a4953c7f3d37e5c62c442139c76fc733f30b6 not found: ID does not exist" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.979753 4917 scope.go:117] "RemoveContainer" containerID="2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e" Mar 18 07:00:01 crc kubenswrapper[4917]: E0318 
07:00:01.980183 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e\": container with ID starting with 2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e not found: ID does not exist" containerID="2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e" Mar 18 07:00:01 crc kubenswrapper[4917]: I0318 07:00:01.980211 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e"} err="failed to get container status \"2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e\": rpc error: code = NotFound desc = could not find container \"2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e\": container with ID starting with 2a8095a903c40634928b0de4f418e91655fcda1cb3f1ae04e530651d34e0155e not found: ID does not exist" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.337029 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.478627 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11225532-26d4-47aa-80bc-077720b847bd-config-volume\") pod \"11225532-26d4-47aa-80bc-077720b847bd\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.478872 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qvhg\" (UniqueName: \"kubernetes.io/projected/11225532-26d4-47aa-80bc-077720b847bd-kube-api-access-5qvhg\") pod \"11225532-26d4-47aa-80bc-077720b847bd\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.478933 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11225532-26d4-47aa-80bc-077720b847bd-secret-volume\") pod \"11225532-26d4-47aa-80bc-077720b847bd\" (UID: \"11225532-26d4-47aa-80bc-077720b847bd\") " Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.479485 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11225532-26d4-47aa-80bc-077720b847bd-config-volume" (OuterVolumeSpecName: "config-volume") pod "11225532-26d4-47aa-80bc-077720b847bd" (UID: "11225532-26d4-47aa-80bc-077720b847bd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.483682 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11225532-26d4-47aa-80bc-077720b847bd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11225532-26d4-47aa-80bc-077720b847bd" (UID: "11225532-26d4-47aa-80bc-077720b847bd"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.487471 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11225532-26d4-47aa-80bc-077720b847bd-kube-api-access-5qvhg" (OuterVolumeSpecName: "kube-api-access-5qvhg") pod "11225532-26d4-47aa-80bc-077720b847bd" (UID: "11225532-26d4-47aa-80bc-077720b847bd"). InnerVolumeSpecName "kube-api-access-5qvhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.581103 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qvhg\" (UniqueName: \"kubernetes.io/projected/11225532-26d4-47aa-80bc-077720b847bd-kube-api-access-5qvhg\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.581171 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11225532-26d4-47aa-80bc-077720b847bd-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.581184 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11225532-26d4-47aa-80bc-077720b847bd-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.785259 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" path="/var/lib/kubelet/pods/b6dc2e1b-ede2-408d-b990-2787b951fcc6/volumes" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.908201 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" event={"ID":"11225532-26d4-47aa-80bc-077720b847bd","Type":"ContainerDied","Data":"8348914f69e4c82bf5c49e0c6ebe7089f5b1e5543f7549fe129bfcffca2f97d8"} Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.908273 4917 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8348914f69e4c82bf5c49e0c6ebe7089f5b1e5543f7549fe129bfcffca2f97d8" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.908391 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl" Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.911289 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" event={"ID":"d92256d3-2d78-4985-944e-273ad62ce8f2","Type":"ContainerStarted","Data":"73db9a0303c1e9db7031c5b3def71ba7392359d1b4cb5f1e407066fafa67116c"} Mar 18 07:00:03 crc kubenswrapper[4917]: I0318 07:00:03.948226 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-qqdsm" podStartSLOduration=2.434796912 podStartE2EDuration="8.948186801s" podCreationTimestamp="2026-03-18 06:59:55 +0000 UTC" firstStartedPulling="2026-03-18 06:59:56.852945446 +0000 UTC m=+781.794100190" lastFinishedPulling="2026-03-18 07:00:03.366335365 +0000 UTC m=+788.307490079" observedRunningTime="2026-03-18 07:00:03.930492305 +0000 UTC m=+788.871647059" watchObservedRunningTime="2026-03-18 07:00:03.948186801 +0000 UTC m=+788.889341555" Mar 18 07:00:06 crc kubenswrapper[4917]: I0318 07:00:06.420515 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-c9pr9" Mar 18 07:00:06 crc kubenswrapper[4917]: I0318 07:00:06.772929 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 07:00:06 crc kubenswrapper[4917]: I0318 07:00:06.773510 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 07:00:06 crc kubenswrapper[4917]: I0318 07:00:06.780712 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 07:00:06 crc kubenswrapper[4917]: I0318 07:00:06.935944 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cb88c8f89-88jxw" Mar 18 07:00:06 crc kubenswrapper[4917]: I0318 07:00:06.993155 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sgj9g"] Mar 18 07:00:16 crc kubenswrapper[4917]: I0318 07:00:16.347479 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-ldchq" Mar 18 07:00:17 crc kubenswrapper[4917]: I0318 07:00:17.011481 4917 generic.go:334] "Generic (PLEG): container finished" podID="1438dd56-616a-4ae5-98eb-451e7e0349db" containerID="bfa87acff2b656f0233fae7d871deb2516ab159c286a9396e0be62ecd6f350ca" exitCode=0 Mar 18 07:00:17 crc kubenswrapper[4917]: I0318 07:00:17.011548 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563620-rvlqw" event={"ID":"1438dd56-616a-4ae5-98eb-451e7e0349db","Type":"ContainerDied","Data":"bfa87acff2b656f0233fae7d871deb2516ab159c286a9396e0be62ecd6f350ca"} Mar 18 07:00:18 crc kubenswrapper[4917]: I0318 07:00:18.324576 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563620-rvlqw" Mar 18 07:00:18 crc kubenswrapper[4917]: I0318 07:00:18.414753 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfgpz\" (UniqueName: \"kubernetes.io/projected/1438dd56-616a-4ae5-98eb-451e7e0349db-kube-api-access-lfgpz\") pod \"1438dd56-616a-4ae5-98eb-451e7e0349db\" (UID: \"1438dd56-616a-4ae5-98eb-451e7e0349db\") " Mar 18 07:00:18 crc kubenswrapper[4917]: I0318 07:00:18.422772 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1438dd56-616a-4ae5-98eb-451e7e0349db-kube-api-access-lfgpz" (OuterVolumeSpecName: "kube-api-access-lfgpz") pod "1438dd56-616a-4ae5-98eb-451e7e0349db" (UID: "1438dd56-616a-4ae5-98eb-451e7e0349db"). InnerVolumeSpecName "kube-api-access-lfgpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:00:18 crc kubenswrapper[4917]: I0318 07:00:18.516014 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfgpz\" (UniqueName: \"kubernetes.io/projected/1438dd56-616a-4ae5-98eb-451e7e0349db-kube-api-access-lfgpz\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:19 crc kubenswrapper[4917]: I0318 07:00:19.034035 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563620-rvlqw" event={"ID":"1438dd56-616a-4ae5-98eb-451e7e0349db","Type":"ContainerDied","Data":"7fa0a16046226d588890d7353dfc2954fbe2e292cee12a9a5c1ee5d7c204c8d6"} Mar 18 07:00:19 crc kubenswrapper[4917]: I0318 07:00:19.034113 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563620-rvlqw" Mar 18 07:00:19 crc kubenswrapper[4917]: I0318 07:00:19.034131 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fa0a16046226d588890d7353dfc2954fbe2e292cee12a9a5c1ee5d7c204c8d6" Mar 18 07:00:19 crc kubenswrapper[4917]: I0318 07:00:19.384175 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563614-sqr96"] Mar 18 07:00:19 crc kubenswrapper[4917]: I0318 07:00:19.391551 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563614-sqr96"] Mar 18 07:00:19 crc kubenswrapper[4917]: I0318 07:00:19.783549 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb" path="/var/lib/kubelet/pods/28f902fe-2ef7-4f3a-9d6b-4d4fc9ae8ceb/volumes" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.032294 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-sgj9g" podUID="7c21e973-7d87-496c-81ba-1425ba599774" containerName="console" containerID="cri-o://fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3" gracePeriod=15 Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.477420 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sgj9g_7c21e973-7d87-496c-81ba-1425ba599774/console/0.log" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.477717 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.611648 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-oauth-config\") pod \"7c21e973-7d87-496c-81ba-1425ba599774\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.611728 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-trusted-ca-bundle\") pod \"7c21e973-7d87-496c-81ba-1425ba599774\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.611766 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-serving-cert\") pod \"7c21e973-7d87-496c-81ba-1425ba599774\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.611890 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-console-config\") pod \"7c21e973-7d87-496c-81ba-1425ba599774\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.611961 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-service-ca\") pod \"7c21e973-7d87-496c-81ba-1425ba599774\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.612014 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-oauth-serving-cert\") pod \"7c21e973-7d87-496c-81ba-1425ba599774\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.612054 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scqk7\" (UniqueName: \"kubernetes.io/projected/7c21e973-7d87-496c-81ba-1425ba599774-kube-api-access-scqk7\") pod \"7c21e973-7d87-496c-81ba-1425ba599774\" (UID: \"7c21e973-7d87-496c-81ba-1425ba599774\") " Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.613467 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-console-config" (OuterVolumeSpecName: "console-config") pod "7c21e973-7d87-496c-81ba-1425ba599774" (UID: "7c21e973-7d87-496c-81ba-1425ba599774"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.613548 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "7c21e973-7d87-496c-81ba-1425ba599774" (UID: "7c21e973-7d87-496c-81ba-1425ba599774"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.613542 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-service-ca" (OuterVolumeSpecName: "service-ca") pod "7c21e973-7d87-496c-81ba-1425ba599774" (UID: "7c21e973-7d87-496c-81ba-1425ba599774"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.613774 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "7c21e973-7d87-496c-81ba-1425ba599774" (UID: "7c21e973-7d87-496c-81ba-1425ba599774"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.618666 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c21e973-7d87-496c-81ba-1425ba599774-kube-api-access-scqk7" (OuterVolumeSpecName: "kube-api-access-scqk7") pod "7c21e973-7d87-496c-81ba-1425ba599774" (UID: "7c21e973-7d87-496c-81ba-1425ba599774"). InnerVolumeSpecName "kube-api-access-scqk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.619814 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "7c21e973-7d87-496c-81ba-1425ba599774" (UID: "7c21e973-7d87-496c-81ba-1425ba599774"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.624179 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "7c21e973-7d87-496c-81ba-1425ba599774" (UID: "7c21e973-7d87-496c-81ba-1425ba599774"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.713931 4917 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-console-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.713976 4917 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-service-ca\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.713995 4917 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.714016 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scqk7\" (UniqueName: \"kubernetes.io/projected/7c21e973-7d87-496c-81ba-1425ba599774-kube-api-access-scqk7\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.714036 4917 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.714052 4917 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c21e973-7d87-496c-81ba-1425ba599774-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:32 crc kubenswrapper[4917]: I0318 07:00:32.714068 4917 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7c21e973-7d87-496c-81ba-1425ba599774-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:33 crc 
kubenswrapper[4917]: I0318 07:00:33.147049 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sgj9g_7c21e973-7d87-496c-81ba-1425ba599774/console/0.log" Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.147129 4917 generic.go:334] "Generic (PLEG): container finished" podID="7c21e973-7d87-496c-81ba-1425ba599774" containerID="fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3" exitCode=2 Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.147180 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sgj9g" event={"ID":"7c21e973-7d87-496c-81ba-1425ba599774","Type":"ContainerDied","Data":"fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3"} Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.147225 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sgj9g" event={"ID":"7c21e973-7d87-496c-81ba-1425ba599774","Type":"ContainerDied","Data":"c34f8ae044f61a88bce171ce491824262324307db188e40ce46ad8a5ecb43ff8"} Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.147236 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sgj9g" Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.147264 4917 scope.go:117] "RemoveContainer" containerID="fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3" Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.182000 4917 scope.go:117] "RemoveContainer" containerID="fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3" Mar 18 07:00:33 crc kubenswrapper[4917]: E0318 07:00:33.183078 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3\": container with ID starting with fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3 not found: ID does not exist" containerID="fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3" Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.183147 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3"} err="failed to get container status \"fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3\": rpc error: code = NotFound desc = could not find container \"fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3\": container with ID starting with fd5b477575b88c24c474d7ed65f4e31c1616e7f301ac5c76cfcd23a31d8edaa3 not found: ID does not exist" Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.201178 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sgj9g"] Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.209063 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-sgj9g"] Mar 18 07:00:33 crc kubenswrapper[4917]: I0318 07:00:33.778705 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c21e973-7d87-496c-81ba-1425ba599774" 
path="/var/lib/kubelet/pods/7c21e973-7d87-496c-81ba-1425ba599774/volumes" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.319919 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2"] Mar 18 07:00:35 crc kubenswrapper[4917]: E0318 07:00:35.320737 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1438dd56-616a-4ae5-98eb-451e7e0349db" containerName="oc" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.320766 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1438dd56-616a-4ae5-98eb-451e7e0349db" containerName="oc" Mar 18 07:00:35 crc kubenswrapper[4917]: E0318 07:00:35.320798 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11225532-26d4-47aa-80bc-077720b847bd" containerName="collect-profiles" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.320814 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="11225532-26d4-47aa-80bc-077720b847bd" containerName="collect-profiles" Mar 18 07:00:35 crc kubenswrapper[4917]: E0318 07:00:35.320848 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="registry-server" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.320865 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="registry-server" Mar 18 07:00:35 crc kubenswrapper[4917]: E0318 07:00:35.320888 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c21e973-7d87-496c-81ba-1425ba599774" containerName="console" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.320905 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c21e973-7d87-496c-81ba-1425ba599774" containerName="console" Mar 18 07:00:35 crc kubenswrapper[4917]: E0318 07:00:35.320929 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="extract-content" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.320944 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="extract-content" Mar 18 07:00:35 crc kubenswrapper[4917]: E0318 07:00:35.320964 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="extract-utilities" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.320980 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="extract-utilities" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.324266 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6dc2e1b-ede2-408d-b990-2787b951fcc6" containerName="registry-server" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.324402 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1438dd56-616a-4ae5-98eb-451e7e0349db" containerName="oc" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.324446 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="11225532-26d4-47aa-80bc-077720b847bd" containerName="collect-profiles" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.324491 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c21e973-7d87-496c-81ba-1425ba599774" containerName="console" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.328994 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.337036 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.338475 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2"] Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.476053 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74pkh\" (UniqueName: \"kubernetes.io/projected/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-kube-api-access-74pkh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.476139 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.476195 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: 
I0318 07:00:35.578049 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74pkh\" (UniqueName: \"kubernetes.io/projected/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-kube-api-access-74pkh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.578171 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.578226 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.579096 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.579113 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.617959 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74pkh\" (UniqueName: \"kubernetes.io/projected/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-kube-api-access-74pkh\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.687729 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:35 crc kubenswrapper[4917]: I0318 07:00:35.956824 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2"] Mar 18 07:00:35 crc kubenswrapper[4917]: W0318 07:00:35.968893 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1e77201_ed7f_4151_bf4c_b81a3fbfda8b.slice/crio-f787c940a53cf892f565950bebc40dd1f6b6100e0de17f3ce2b7b95df1a3138a WatchSource:0}: Error finding container f787c940a53cf892f565950bebc40dd1f6b6100e0de17f3ce2b7b95df1a3138a: Status 404 returned error can't find the container with id f787c940a53cf892f565950bebc40dd1f6b6100e0de17f3ce2b7b95df1a3138a Mar 18 07:00:36 crc kubenswrapper[4917]: I0318 07:00:36.172291 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" 
event={"ID":"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b","Type":"ContainerStarted","Data":"3a07f26f929280c1b932f7775542e23163dd38cdb77386884424a403fc8ae763"} Mar 18 07:00:36 crc kubenswrapper[4917]: I0318 07:00:36.172342 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" event={"ID":"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b","Type":"ContainerStarted","Data":"f787c940a53cf892f565950bebc40dd1f6b6100e0de17f3ce2b7b95df1a3138a"} Mar 18 07:00:37 crc kubenswrapper[4917]: I0318 07:00:37.183661 4917 generic.go:334] "Generic (PLEG): container finished" podID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerID="3a07f26f929280c1b932f7775542e23163dd38cdb77386884424a403fc8ae763" exitCode=0 Mar 18 07:00:37 crc kubenswrapper[4917]: I0318 07:00:37.184111 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" event={"ID":"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b","Type":"ContainerDied","Data":"3a07f26f929280c1b932f7775542e23163dd38cdb77386884424a403fc8ae763"} Mar 18 07:00:39 crc kubenswrapper[4917]: I0318 07:00:39.208963 4917 generic.go:334] "Generic (PLEG): container finished" podID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerID="24ff7a0d47eaa40164e36776fa37ab86672ef6d4ac91638810d5b23505948e65" exitCode=0 Mar 18 07:00:39 crc kubenswrapper[4917]: I0318 07:00:39.209067 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" event={"ID":"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b","Type":"ContainerDied","Data":"24ff7a0d47eaa40164e36776fa37ab86672ef6d4ac91638810d5b23505948e65"} Mar 18 07:00:40 crc kubenswrapper[4917]: I0318 07:00:40.220434 4917 generic.go:334] "Generic (PLEG): container finished" podID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerID="c1b9b58c11c04e3a8cfb2710116c202376e799cb5cc4cf33444d5be1acce7682" exitCode=0 
Mar 18 07:00:40 crc kubenswrapper[4917]: I0318 07:00:40.220482 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" event={"ID":"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b","Type":"ContainerDied","Data":"c1b9b58c11c04e3a8cfb2710116c202376e799cb5cc4cf33444d5be1acce7682"} Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.494407 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.590408 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74pkh\" (UniqueName: \"kubernetes.io/projected/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-kube-api-access-74pkh\") pod \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.590564 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-util\") pod \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.590647 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-bundle\") pod \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\" (UID: \"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b\") " Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.591781 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-bundle" (OuterVolumeSpecName: "bundle") pod "d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" (UID: "d1e77201-ed7f-4151-bf4c-b81a3fbfda8b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.603874 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-kube-api-access-74pkh" (OuterVolumeSpecName: "kube-api-access-74pkh") pod "d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" (UID: "d1e77201-ed7f-4151-bf4c-b81a3fbfda8b"). InnerVolumeSpecName "kube-api-access-74pkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.692039 4917 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.692072 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74pkh\" (UniqueName: \"kubernetes.io/projected/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-kube-api-access-74pkh\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.730506 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-util" (OuterVolumeSpecName: "util") pod "d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" (UID: "d1e77201-ed7f-4151-bf4c-b81a3fbfda8b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:00:41 crc kubenswrapper[4917]: I0318 07:00:41.793804 4917 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d1e77201-ed7f-4151-bf4c-b81a3fbfda8b-util\") on node \"crc\" DevicePath \"\"" Mar 18 07:00:42 crc kubenswrapper[4917]: I0318 07:00:42.242465 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" event={"ID":"d1e77201-ed7f-4151-bf4c-b81a3fbfda8b","Type":"ContainerDied","Data":"f787c940a53cf892f565950bebc40dd1f6b6100e0de17f3ce2b7b95df1a3138a"} Mar 18 07:00:42 crc kubenswrapper[4917]: I0318 07:00:42.243066 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f787c940a53cf892f565950bebc40dd1f6b6100e0de17f3ce2b7b95df1a3138a" Mar 18 07:00:42 crc kubenswrapper[4917]: I0318 07:00:42.242528 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.346752 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz"] Mar 18 07:00:54 crc kubenswrapper[4917]: E0318 07:00:54.347511 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerName="util" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.347524 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerName="util" Mar 18 07:00:54 crc kubenswrapper[4917]: E0318 07:00:54.347535 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerName="extract" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.347542 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerName="extract" Mar 18 07:00:54 crc kubenswrapper[4917]: E0318 07:00:54.347560 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerName="pull" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.347567 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerName="pull" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.347863 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e77201-ed7f-4151-bf4c-b81a3fbfda8b" containerName="extract" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.348240 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.356654 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.356852 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-2svjk" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.357014 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.359093 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.364115 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.365656 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz"] Mar 18 07:00:54 crc kubenswrapper[4917]: 
I0318 07:00:54.465341 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmwp\" (UniqueName: \"kubernetes.io/projected/ae747d59-ac7b-464b-9cb5-859d07e265a8-kube-api-access-7rmwp\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.465387 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae747d59-ac7b-464b-9cb5-859d07e265a8-apiservice-cert\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.465424 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae747d59-ac7b-464b-9cb5-859d07e265a8-webhook-cert\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.566352 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae747d59-ac7b-464b-9cb5-859d07e265a8-webhook-cert\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.566675 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmwp\" (UniqueName: 
\"kubernetes.io/projected/ae747d59-ac7b-464b-9cb5-859d07e265a8-kube-api-access-7rmwp\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.566785 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae747d59-ac7b-464b-9cb5-859d07e265a8-apiservice-cert\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.573718 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae747d59-ac7b-464b-9cb5-859d07e265a8-apiservice-cert\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.585027 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae747d59-ac7b-464b-9cb5-859d07e265a8-webhook-cert\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.607923 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rmwp\" (UniqueName: \"kubernetes.io/projected/ae747d59-ac7b-464b-9cb5-859d07e265a8-kube-api-access-7rmwp\") pod \"metallb-operator-controller-manager-74bdc75999-hb7kz\" (UID: \"ae747d59-ac7b-464b-9cb5-859d07e265a8\") " 
pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.623338 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b"] Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.624147 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.630241 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.630570 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.630571 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nj5mn" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.641726 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b"] Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.667804 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.769192 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5699c2f-0727-4d0e-b131-6966e1bc8126-apiservice-cert\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.769633 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zqhg\" (UniqueName: \"kubernetes.io/projected/b5699c2f-0727-4d0e-b131-6966e1bc8126-kube-api-access-5zqhg\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.769664 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5699c2f-0727-4d0e-b131-6966e1bc8126-webhook-cert\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.870860 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5699c2f-0727-4d0e-b131-6966e1bc8126-apiservice-cert\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.870933 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5zqhg\" (UniqueName: \"kubernetes.io/projected/b5699c2f-0727-4d0e-b131-6966e1bc8126-kube-api-access-5zqhg\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.870951 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5699c2f-0727-4d0e-b131-6966e1bc8126-webhook-cert\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.875447 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5699c2f-0727-4d0e-b131-6966e1bc8126-apiservice-cert\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.877009 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5699c2f-0727-4d0e-b131-6966e1bc8126-webhook-cert\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.885898 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zqhg\" (UniqueName: \"kubernetes.io/projected/b5699c2f-0727-4d0e-b131-6966e1bc8126-kube-api-access-5zqhg\") pod \"metallb-operator-webhook-server-84fb45cdd8-gjn5b\" (UID: \"b5699c2f-0727-4d0e-b131-6966e1bc8126\") " 
pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.905486 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz"] Mar 18 07:00:54 crc kubenswrapper[4917]: W0318 07:00:54.917129 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae747d59_ac7b_464b_9cb5_859d07e265a8.slice/crio-229dd3587c6dcc7341b979cfcc470ea3ea853a78038b4a3dd1826c7c7cd01ac6 WatchSource:0}: Error finding container 229dd3587c6dcc7341b979cfcc470ea3ea853a78038b4a3dd1826c7c7cd01ac6: Status 404 returned error can't find the container with id 229dd3587c6dcc7341b979cfcc470ea3ea853a78038b4a3dd1826c7c7cd01ac6 Mar 18 07:00:54 crc kubenswrapper[4917]: I0318 07:00:54.937633 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:00:55 crc kubenswrapper[4917]: I0318 07:00:55.143932 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b"] Mar 18 07:00:55 crc kubenswrapper[4917]: W0318 07:00:55.154977 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5699c2f_0727_4d0e_b131_6966e1bc8126.slice/crio-7c990f3fc83d89a3a594a5786a98bae53314c94562749d59eb740734e114e253 WatchSource:0}: Error finding container 7c990f3fc83d89a3a594a5786a98bae53314c94562749d59eb740734e114e253: Status 404 returned error can't find the container with id 7c990f3fc83d89a3a594a5786a98bae53314c94562749d59eb740734e114e253 Mar 18 07:00:55 crc kubenswrapper[4917]: I0318 07:00:55.322272 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" 
event={"ID":"b5699c2f-0727-4d0e-b131-6966e1bc8126","Type":"ContainerStarted","Data":"7c990f3fc83d89a3a594a5786a98bae53314c94562749d59eb740734e114e253"} Mar 18 07:00:55 crc kubenswrapper[4917]: I0318 07:00:55.323833 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" event={"ID":"ae747d59-ac7b-464b-9cb5-859d07e265a8","Type":"ContainerStarted","Data":"229dd3587c6dcc7341b979cfcc470ea3ea853a78038b4a3dd1826c7c7cd01ac6"} Mar 18 07:00:56 crc kubenswrapper[4917]: I0318 07:00:56.388703 4917 scope.go:117] "RemoveContainer" containerID="45a19ad97dc7499fc8bfe60f402a74e4cb1aa8c2dce01795fd7e1a5398e0e3ca" Mar 18 07:00:56 crc kubenswrapper[4917]: I0318 07:00:56.926479 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6qfpg"] Mar 18 07:00:56 crc kubenswrapper[4917]: I0318 07:00:56.927564 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:56 crc kubenswrapper[4917]: I0318 07:00:56.928776 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qfpg"] Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.113299 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-catalog-content\") pod \"community-operators-6qfpg\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.113352 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-utilities\") pod \"community-operators-6qfpg\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " 
pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.113433 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crm4z\" (UniqueName: \"kubernetes.io/projected/aa2baf33-ecec-4730-8911-d4957ee463a8-kube-api-access-crm4z\") pod \"community-operators-6qfpg\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.218018 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-catalog-content\") pod \"community-operators-6qfpg\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.218085 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-utilities\") pod \"community-operators-6qfpg\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.218206 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crm4z\" (UniqueName: \"kubernetes.io/projected/aa2baf33-ecec-4730-8911-d4957ee463a8-kube-api-access-crm4z\") pod \"community-operators-6qfpg\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.218570 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-catalog-content\") pod \"community-operators-6qfpg\" (UID: 
\"aa2baf33-ecec-4730-8911-d4957ee463a8\") " pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.218861 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-utilities\") pod \"community-operators-6qfpg\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.249187 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crm4z\" (UniqueName: \"kubernetes.io/projected/aa2baf33-ecec-4730-8911-d4957ee463a8-kube-api-access-crm4z\") pod \"community-operators-6qfpg\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:57 crc kubenswrapper[4917]: I0318 07:00:57.543439 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:00:58 crc kubenswrapper[4917]: I0318 07:00:58.641772 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6qfpg"] Mar 18 07:00:59 crc kubenswrapper[4917]: I0318 07:00:59.356242 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" event={"ID":"ae747d59-ac7b-464b-9cb5-859d07e265a8","Type":"ContainerStarted","Data":"7b46a0618a6bee976914b86000bb77a6ce4cf165c4d21f16f2b7750f358c05c3"} Mar 18 07:00:59 crc kubenswrapper[4917]: I0318 07:00:59.356573 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:00:59 crc kubenswrapper[4917]: I0318 07:00:59.358843 4917 generic.go:334] "Generic (PLEG): container finished" podID="aa2baf33-ecec-4730-8911-d4957ee463a8" 
containerID="2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504" exitCode=0 Mar 18 07:00:59 crc kubenswrapper[4917]: I0318 07:00:59.358885 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qfpg" event={"ID":"aa2baf33-ecec-4730-8911-d4957ee463a8","Type":"ContainerDied","Data":"2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504"} Mar 18 07:00:59 crc kubenswrapper[4917]: I0318 07:00:59.358909 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qfpg" event={"ID":"aa2baf33-ecec-4730-8911-d4957ee463a8","Type":"ContainerStarted","Data":"d39e5bc06bde22fde8b087bb21e51756871764066bada93c150ac33b8403328f"} Mar 18 07:00:59 crc kubenswrapper[4917]: I0318 07:00:59.400337 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" podStartSLOduration=1.896070784 podStartE2EDuration="5.400319542s" podCreationTimestamp="2026-03-18 07:00:54 +0000 UTC" firstStartedPulling="2026-03-18 07:00:54.919089175 +0000 UTC m=+839.860243889" lastFinishedPulling="2026-03-18 07:00:58.423337933 +0000 UTC m=+843.364492647" observedRunningTime="2026-03-18 07:00:59.382246686 +0000 UTC m=+844.323401420" watchObservedRunningTime="2026-03-18 07:00:59.400319542 +0000 UTC m=+844.341474256" Mar 18 07:01:00 crc kubenswrapper[4917]: I0318 07:01:00.365526 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" event={"ID":"b5699c2f-0727-4d0e-b131-6966e1bc8126","Type":"ContainerStarted","Data":"05ba547d7000b02182e08cbb8e80b5e85bad3d405d27ba185fd2113e9c2494ec"} Mar 18 07:01:00 crc kubenswrapper[4917]: I0318 07:01:00.365899 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:01:00 crc kubenswrapper[4917]: I0318 07:01:00.391707 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" podStartSLOduration=1.427881548 podStartE2EDuration="6.391686315s" podCreationTimestamp="2026-03-18 07:00:54 +0000 UTC" firstStartedPulling="2026-03-18 07:00:55.160316333 +0000 UTC m=+840.101471047" lastFinishedPulling="2026-03-18 07:01:00.1241211 +0000 UTC m=+845.065275814" observedRunningTime="2026-03-18 07:01:00.384984145 +0000 UTC m=+845.326138899" watchObservedRunningTime="2026-03-18 07:01:00.391686315 +0000 UTC m=+845.332841039" Mar 18 07:01:01 crc kubenswrapper[4917]: I0318 07:01:01.373570 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qfpg" event={"ID":"aa2baf33-ecec-4730-8911-d4957ee463a8","Type":"ContainerStarted","Data":"bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c"} Mar 18 07:01:02 crc kubenswrapper[4917]: I0318 07:01:02.380901 4917 generic.go:334] "Generic (PLEG): container finished" podID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerID="bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c" exitCode=0 Mar 18 07:01:02 crc kubenswrapper[4917]: I0318 07:01:02.380943 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qfpg" event={"ID":"aa2baf33-ecec-4730-8911-d4957ee463a8","Type":"ContainerDied","Data":"bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c"} Mar 18 07:01:03 crc kubenswrapper[4917]: I0318 07:01:03.390548 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qfpg" event={"ID":"aa2baf33-ecec-4730-8911-d4957ee463a8","Type":"ContainerStarted","Data":"2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00"} Mar 18 07:01:03 crc kubenswrapper[4917]: I0318 07:01:03.412330 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6qfpg" 
podStartSLOduration=4.513372816 podStartE2EDuration="7.412304235s" podCreationTimestamp="2026-03-18 07:00:56 +0000 UTC" firstStartedPulling="2026-03-18 07:01:00.06388279 +0000 UTC m=+845.005037504" lastFinishedPulling="2026-03-18 07:01:02.962814199 +0000 UTC m=+847.903968923" observedRunningTime="2026-03-18 07:01:03.41065823 +0000 UTC m=+848.351812974" watchObservedRunningTime="2026-03-18 07:01:03.412304235 +0000 UTC m=+848.353458979" Mar 18 07:01:07 crc kubenswrapper[4917]: I0318 07:01:07.544540 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:01:07 crc kubenswrapper[4917]: I0318 07:01:07.544887 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:01:07 crc kubenswrapper[4917]: I0318 07:01:07.596545 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:01:08 crc kubenswrapper[4917]: I0318 07:01:08.470225 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:01:09 crc kubenswrapper[4917]: I0318 07:01:09.896384 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6qfpg"] Mar 18 07:01:10 crc kubenswrapper[4917]: I0318 07:01:10.441167 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6qfpg" podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerName="registry-server" containerID="cri-o://2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00" gracePeriod=2 Mar 18 07:01:10 crc kubenswrapper[4917]: I0318 07:01:10.870962 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:01:10 crc kubenswrapper[4917]: I0318 07:01:10.920833 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-utilities\") pod \"aa2baf33-ecec-4730-8911-d4957ee463a8\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " Mar 18 07:01:10 crc kubenswrapper[4917]: I0318 07:01:10.920888 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crm4z\" (UniqueName: \"kubernetes.io/projected/aa2baf33-ecec-4730-8911-d4957ee463a8-kube-api-access-crm4z\") pod \"aa2baf33-ecec-4730-8911-d4957ee463a8\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " Mar 18 07:01:10 crc kubenswrapper[4917]: I0318 07:01:10.920915 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-catalog-content\") pod \"aa2baf33-ecec-4730-8911-d4957ee463a8\" (UID: \"aa2baf33-ecec-4730-8911-d4957ee463a8\") " Mar 18 07:01:10 crc kubenswrapper[4917]: I0318 07:01:10.923753 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-utilities" (OuterVolumeSpecName: "utilities") pod "aa2baf33-ecec-4730-8911-d4957ee463a8" (UID: "aa2baf33-ecec-4730-8911-d4957ee463a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:01:10 crc kubenswrapper[4917]: I0318 07:01:10.933002 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa2baf33-ecec-4730-8911-d4957ee463a8-kube-api-access-crm4z" (OuterVolumeSpecName: "kube-api-access-crm4z") pod "aa2baf33-ecec-4730-8911-d4957ee463a8" (UID: "aa2baf33-ecec-4730-8911-d4957ee463a8"). InnerVolumeSpecName "kube-api-access-crm4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:01:10 crc kubenswrapper[4917]: I0318 07:01:10.988465 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa2baf33-ecec-4730-8911-d4957ee463a8" (UID: "aa2baf33-ecec-4730-8911-d4957ee463a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.022914 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.022951 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crm4z\" (UniqueName: \"kubernetes.io/projected/aa2baf33-ecec-4730-8911-d4957ee463a8-kube-api-access-crm4z\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.022964 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa2baf33-ecec-4730-8911-d4957ee463a8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.453159 4917 generic.go:334] "Generic (PLEG): container finished" podID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerID="2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00" exitCode=0 Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.453228 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6qfpg" event={"ID":"aa2baf33-ecec-4730-8911-d4957ee463a8","Type":"ContainerDied","Data":"2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00"} Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.453288 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-6qfpg" event={"ID":"aa2baf33-ecec-4730-8911-d4957ee463a8","Type":"ContainerDied","Data":"d39e5bc06bde22fde8b087bb21e51756871764066bada93c150ac33b8403328f"} Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.453323 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6qfpg" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.453330 4917 scope.go:117] "RemoveContainer" containerID="2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.493138 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6qfpg"] Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.494736 4917 scope.go:117] "RemoveContainer" containerID="bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.500548 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6qfpg"] Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.548739 4917 scope.go:117] "RemoveContainer" containerID="2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.596700 4917 scope.go:117] "RemoveContainer" containerID="2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00" Mar 18 07:01:11 crc kubenswrapper[4917]: E0318 07:01:11.597202 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00\": container with ID starting with 2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00 not found: ID does not exist" containerID="2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 
07:01:11.597254 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00"} err="failed to get container status \"2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00\": rpc error: code = NotFound desc = could not find container \"2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00\": container with ID starting with 2030702ea751e9d944abd5dec0e8a06292b6f6d23b5198b289c21dcae5e7ec00 not found: ID does not exist" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.597286 4917 scope.go:117] "RemoveContainer" containerID="bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c" Mar 18 07:01:11 crc kubenswrapper[4917]: E0318 07:01:11.597569 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c\": container with ID starting with bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c not found: ID does not exist" containerID="bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.597619 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c"} err="failed to get container status \"bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c\": rpc error: code = NotFound desc = could not find container \"bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c\": container with ID starting with bd577cb4090857a255d8baa05cefd594c51cff56545c88d17ffe98ff06bcb44c not found: ID does not exist" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.597649 4917 scope.go:117] "RemoveContainer" containerID="2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504" Mar 18 07:01:11 crc 
kubenswrapper[4917]: E0318 07:01:11.597919 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504\": container with ID starting with 2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504 not found: ID does not exist" containerID="2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.597943 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504"} err="failed to get container status \"2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504\": rpc error: code = NotFound desc = could not find container \"2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504\": container with ID starting with 2ca75070d2dfa79d0e03ae05871e6c2c3465d0aaa55cce556a3bb909ed43c504 not found: ID does not exist" Mar 18 07:01:11 crc kubenswrapper[4917]: I0318 07:01:11.779654 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" path="/var/lib/kubelet/pods/aa2baf33-ecec-4730-8911-d4957ee463a8/volumes" Mar 18 07:01:14 crc kubenswrapper[4917]: I0318 07:01:14.943852 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-84fb45cdd8-gjn5b" Mar 18 07:01:32 crc kubenswrapper[4917]: I0318 07:01:32.928702 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:01:32 crc kubenswrapper[4917]: I0318 07:01:32.929382 4917 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:01:34 crc kubenswrapper[4917]: I0318 07:01:34.672450 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-74bdc75999-hb7kz" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.486743 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-24wx8"] Mar 18 07:01:35 crc kubenswrapper[4917]: E0318 07:01:35.487220 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerName="registry-server" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.487232 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerName="registry-server" Mar 18 07:01:35 crc kubenswrapper[4917]: E0318 07:01:35.487244 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerName="extract-content" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.487249 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerName="extract-content" Mar 18 07:01:35 crc kubenswrapper[4917]: E0318 07:01:35.487265 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerName="extract-utilities" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.487271 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerName="extract-utilities" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.487366 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="aa2baf33-ecec-4730-8911-d4957ee463a8" containerName="registry-server" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.489069 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.490797 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.491384 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.496118 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-69cmf" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.512040 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k"] Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.512988 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.517202 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.526522 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k"] Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.580098 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-j9g22"] Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.580995 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.585475 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.586007 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.586206 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.589897 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-59rc2" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.590303 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-zscqs"] Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.591166 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.593992 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.595619 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zscqs"] Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.621808 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-conf\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.621876 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0f2128-2563-4331-9c56-c7ea7a46b0d5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gc44k\" (UID: \"ba0f2128-2563-4331-9c56-c7ea7a46b0d5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.621934 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-startup\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.621958 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-reloader\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 
07:01:35.621975 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-metrics\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.622028 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddb8d\" (UniqueName: \"kubernetes.io/projected/4f428b53-24d1-4ec3-8d38-271e43a0e71d-kube-api-access-ddb8d\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.622051 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-sockets\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.622096 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h84md\" (UniqueName: \"kubernetes.io/projected/ba0f2128-2563-4331-9c56-c7ea7a46b0d5-kube-api-access-h84md\") pod \"frr-k8s-webhook-server-bcc4b6f68-gc44k\" (UID: \"ba0f2128-2563-4331-9c56-c7ea7a46b0d5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.622113 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f428b53-24d1-4ec3-8d38-271e43a0e71d-metrics-certs\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.723110 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9500ccc2-23e4-4cad-97db-6a0fac025f5e-cert\") pod \"controller-7bb4cc7c98-zscqs\" (UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.723603 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwtf\" (UniqueName: \"kubernetes.io/projected/bc1ceb3d-1108-434e-892d-e08b164c8937-kube-api-access-hgwtf\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.723741 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddb8d\" (UniqueName: \"kubernetes.io/projected/4f428b53-24d1-4ec3-8d38-271e43a0e71d-kube-api-access-ddb8d\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.723828 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-sockets\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.723902 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h84md\" (UniqueName: \"kubernetes.io/projected/ba0f2128-2563-4331-9c56-c7ea7a46b0d5-kube-api-access-h84md\") pod \"frr-k8s-webhook-server-bcc4b6f68-gc44k\" (UID: \"ba0f2128-2563-4331-9c56-c7ea7a46b0d5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.723968 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4f428b53-24d1-4ec3-8d38-271e43a0e71d-metrics-certs\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724043 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-metrics-certs\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724121 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9500ccc2-23e4-4cad-97db-6a0fac025f5e-metrics-certs\") pod \"controller-7bb4cc7c98-zscqs\" (UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724191 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-conf\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724272 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724347 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ba0f2128-2563-4331-9c56-c7ea7a46b0d5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gc44k\" (UID: \"ba0f2128-2563-4331-9c56-c7ea7a46b0d5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724419 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bc1ceb3d-1108-434e-892d-e08b164c8937-metallb-excludel2\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724494 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-startup\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724574 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-reloader\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724686 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-metrics\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.724754 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znf8n\" (UniqueName: \"kubernetes.io/projected/9500ccc2-23e4-4cad-97db-6a0fac025f5e-kube-api-access-znf8n\") pod \"controller-7bb4cc7c98-zscqs\" 
(UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.726100 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-sockets\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.727196 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-conf\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.727320 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-metrics\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.727412 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4f428b53-24d1-4ec3-8d38-271e43a0e71d-reloader\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.727793 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4f428b53-24d1-4ec3-8d38-271e43a0e71d-frr-startup\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.732079 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4f428b53-24d1-4ec3-8d38-271e43a0e71d-metrics-certs\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.737893 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0f2128-2563-4331-9c56-c7ea7a46b0d5-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gc44k\" (UID: \"ba0f2128-2563-4331-9c56-c7ea7a46b0d5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.741357 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h84md\" (UniqueName: \"kubernetes.io/projected/ba0f2128-2563-4331-9c56-c7ea7a46b0d5-kube-api-access-h84md\") pod \"frr-k8s-webhook-server-bcc4b6f68-gc44k\" (UID: \"ba0f2128-2563-4331-9c56-c7ea7a46b0d5\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.744112 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddb8d\" (UniqueName: \"kubernetes.io/projected/4f428b53-24d1-4ec3-8d38-271e43a0e71d-kube-api-access-ddb8d\") pod \"frr-k8s-24wx8\" (UID: \"4f428b53-24d1-4ec3-8d38-271e43a0e71d\") " pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.808232 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.825691 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-metrics-certs\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.826014 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9500ccc2-23e4-4cad-97db-6a0fac025f5e-metrics-certs\") pod \"controller-7bb4cc7c98-zscqs\" (UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.826103 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.826175 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bc1ceb3d-1108-434e-892d-e08b164c8937-metallb-excludel2\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.826288 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znf8n\" (UniqueName: \"kubernetes.io/projected/9500ccc2-23e4-4cad-97db-6a0fac025f5e-kube-api-access-znf8n\") pod \"controller-7bb4cc7c98-zscqs\" (UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.826337 
4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9500ccc2-23e4-4cad-97db-6a0fac025f5e-cert\") pod \"controller-7bb4cc7c98-zscqs\" (UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.826370 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwtf\" (UniqueName: \"kubernetes.io/projected/bc1ceb3d-1108-434e-892d-e08b164c8937-kube-api-access-hgwtf\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.831082 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-metrics-certs\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.831548 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/bc1ceb3d-1108-434e-892d-e08b164c8937-metallb-excludel2\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: E0318 07:01:35.831630 4917 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 07:01:35 crc kubenswrapper[4917]: E0318 07:01:35.831668 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist podName:bc1ceb3d-1108-434e-892d-e08b164c8937 nodeName:}" failed. No retries permitted until 2026-03-18 07:01:36.331654629 +0000 UTC m=+881.272809343 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist") pod "speaker-j9g22" (UID: "bc1ceb3d-1108-434e-892d-e08b164c8937") : secret "metallb-memberlist" not found Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.832273 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.836114 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9500ccc2-23e4-4cad-97db-6a0fac025f5e-metrics-certs\") pod \"controller-7bb4cc7c98-zscqs\" (UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.838229 4917 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.846243 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9500ccc2-23e4-4cad-97db-6a0fac025f5e-cert\") pod \"controller-7bb4cc7c98-zscqs\" (UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.848102 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwtf\" (UniqueName: \"kubernetes.io/projected/bc1ceb3d-1108-434e-892d-e08b164c8937-kube-api-access-hgwtf\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.873709 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znf8n\" (UniqueName: \"kubernetes.io/projected/9500ccc2-23e4-4cad-97db-6a0fac025f5e-kube-api-access-znf8n\") pod 
\"controller-7bb4cc7c98-zscqs\" (UID: \"9500ccc2-23e4-4cad-97db-6a0fac025f5e\") " pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:35 crc kubenswrapper[4917]: I0318 07:01:35.914110 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:36 crc kubenswrapper[4917]: I0318 07:01:36.106811 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k"] Mar 18 07:01:36 crc kubenswrapper[4917]: W0318 07:01:36.113638 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba0f2128_2563_4331_9c56_c7ea7a46b0d5.slice/crio-186c2e9af5ae519acc53eb6af31d3d49279654804c2bfc0d824c2676221144d2 WatchSource:0}: Error finding container 186c2e9af5ae519acc53eb6af31d3d49279654804c2bfc0d824c2676221144d2: Status 404 returned error can't find the container with id 186c2e9af5ae519acc53eb6af31d3d49279654804c2bfc0d824c2676221144d2 Mar 18 07:01:36 crc kubenswrapper[4917]: I0318 07:01:36.332745 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:36 crc kubenswrapper[4917]: E0318 07:01:36.332962 4917 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 18 07:01:36 crc kubenswrapper[4917]: E0318 07:01:36.333094 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist podName:bc1ceb3d-1108-434e-892d-e08b164c8937 nodeName:}" failed. No retries permitted until 2026-03-18 07:01:37.333056057 +0000 UTC m=+882.274210811 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist") pod "speaker-j9g22" (UID: "bc1ceb3d-1108-434e-892d-e08b164c8937") : secret "metallb-memberlist" not found Mar 18 07:01:36 crc kubenswrapper[4917]: I0318 07:01:36.377175 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-zscqs"] Mar 18 07:01:36 crc kubenswrapper[4917]: W0318 07:01:36.382003 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9500ccc2_23e4_4cad_97db_6a0fac025f5e.slice/crio-7f014ea198ce7eca9e0962945008fbc75e872895193d787ad0dae5a07419e8fb WatchSource:0}: Error finding container 7f014ea198ce7eca9e0962945008fbc75e872895193d787ad0dae5a07419e8fb: Status 404 returned error can't find the container with id 7f014ea198ce7eca9e0962945008fbc75e872895193d787ad0dae5a07419e8fb Mar 18 07:01:36 crc kubenswrapper[4917]: I0318 07:01:36.638435 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" event={"ID":"ba0f2128-2563-4331-9c56-c7ea7a46b0d5","Type":"ContainerStarted","Data":"186c2e9af5ae519acc53eb6af31d3d49279654804c2bfc0d824c2676221144d2"} Mar 18 07:01:36 crc kubenswrapper[4917]: I0318 07:01:36.640001 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zscqs" event={"ID":"9500ccc2-23e4-4cad-97db-6a0fac025f5e","Type":"ContainerStarted","Data":"353141a4403741acd2768dcfaee76731c7705957b06e6602c29049729d3716e8"} Mar 18 07:01:36 crc kubenswrapper[4917]: I0318 07:01:36.640047 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zscqs" event={"ID":"9500ccc2-23e4-4cad-97db-6a0fac025f5e","Type":"ContainerStarted","Data":"7f014ea198ce7eca9e0962945008fbc75e872895193d787ad0dae5a07419e8fb"} Mar 18 07:01:36 crc kubenswrapper[4917]: I0318 07:01:36.641077 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerStarted","Data":"5079ab37f7f03f896c64ac0d5d15498a00e91ef500c87bd3841fffa7491c239b"} Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.347902 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.357392 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/bc1ceb3d-1108-434e-892d-e08b164c8937-memberlist\") pod \"speaker-j9g22\" (UID: \"bc1ceb3d-1108-434e-892d-e08b164c8937\") " pod="metallb-system/speaker-j9g22" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.405662 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-j9g22" Mar 18 07:01:37 crc kubenswrapper[4917]: W0318 07:01:37.428330 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc1ceb3d_1108_434e_892d_e08b164c8937.slice/crio-1d755f8915e77ecb7a369f16388b0f0fb882a876df46e0520294a9ef2e89a852 WatchSource:0}: Error finding container 1d755f8915e77ecb7a369f16388b0f0fb882a876df46e0520294a9ef2e89a852: Status 404 returned error can't find the container with id 1d755f8915e77ecb7a369f16388b0f0fb882a876df46e0520294a9ef2e89a852 Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.516170 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zzrp4"] Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.517183 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.528567 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzrp4"] Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.654165 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57dtq\" (UniqueName: \"kubernetes.io/projected/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-kube-api-access-57dtq\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.654250 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-utilities\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.654275 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-catalog-content\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.675435 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-zscqs" event={"ID":"9500ccc2-23e4-4cad-97db-6a0fac025f5e","Type":"ContainerStarted","Data":"4b606ab32b3297c25921cab3ba880aa410a1c1ae9ac2a58acc281a2bc21a38db"} Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.676298 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 
07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.700760 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j9g22" event={"ID":"bc1ceb3d-1108-434e-892d-e08b164c8937","Type":"ContainerStarted","Data":"1d755f8915e77ecb7a369f16388b0f0fb882a876df46e0520294a9ef2e89a852"} Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.708483 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-zscqs" podStartSLOduration=2.708465797 podStartE2EDuration="2.708465797s" podCreationTimestamp="2026-03-18 07:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:01:37.706903345 +0000 UTC m=+882.648058059" watchObservedRunningTime="2026-03-18 07:01:37.708465797 +0000 UTC m=+882.649620511" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.754867 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-utilities\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.755086 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-catalog-content\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.755147 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57dtq\" (UniqueName: \"kubernetes.io/projected/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-kube-api-access-57dtq\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") 
" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.755346 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-utilities\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.755528 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-catalog-content\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.777559 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57dtq\" (UniqueName: \"kubernetes.io/projected/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-kube-api-access-57dtq\") pod \"redhat-marketplace-zzrp4\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:37 crc kubenswrapper[4917]: I0318 07:01:37.830898 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:38 crc kubenswrapper[4917]: I0318 07:01:38.138555 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzrp4"] Mar 18 07:01:38 crc kubenswrapper[4917]: I0318 07:01:38.709910 4917 generic.go:334] "Generic (PLEG): container finished" podID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerID="0e1b1f453cbbe2cbae3fe868afdf7284acc3d3081f37a06d050a8f64335e8aee" exitCode=0 Mar 18 07:01:38 crc kubenswrapper[4917]: I0318 07:01:38.710454 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzrp4" event={"ID":"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928","Type":"ContainerDied","Data":"0e1b1f453cbbe2cbae3fe868afdf7284acc3d3081f37a06d050a8f64335e8aee"} Mar 18 07:01:38 crc kubenswrapper[4917]: I0318 07:01:38.710499 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzrp4" event={"ID":"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928","Type":"ContainerStarted","Data":"3539c31ffda9fe48a6959bf9471a70a9494ef8a2bcb126809fab762925bc1445"} Mar 18 07:01:38 crc kubenswrapper[4917]: I0318 07:01:38.735859 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j9g22" event={"ID":"bc1ceb3d-1108-434e-892d-e08b164c8937","Type":"ContainerStarted","Data":"e24dd31fe6b2d4c2a6563c59eab6b14a423e9e409ed001eb9ba3d2892fd1e6de"} Mar 18 07:01:38 crc kubenswrapper[4917]: I0318 07:01:38.735901 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-j9g22" event={"ID":"bc1ceb3d-1108-434e-892d-e08b164c8937","Type":"ContainerStarted","Data":"7659d96e91e45793b2a16e527225514cfb6b994dd0c92f0b7e3d2812022c81d4"} Mar 18 07:01:38 crc kubenswrapper[4917]: I0318 07:01:38.736555 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-j9g22" Mar 18 07:01:38 crc kubenswrapper[4917]: I0318 07:01:38.757914 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-j9g22" podStartSLOduration=3.757896309 podStartE2EDuration="3.757896309s" podCreationTimestamp="2026-03-18 07:01:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:01:38.757484739 +0000 UTC m=+883.698639463" watchObservedRunningTime="2026-03-18 07:01:38.757896309 +0000 UTC m=+883.699051023" Mar 18 07:01:39 crc kubenswrapper[4917]: I0318 07:01:39.742487 4917 generic.go:334] "Generic (PLEG): container finished" podID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerID="57501946401dc2cd82e83fdfe55200af99021693b25604e4d9ebda977327b8cf" exitCode=0 Mar 18 07:01:39 crc kubenswrapper[4917]: I0318 07:01:39.743488 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzrp4" event={"ID":"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928","Type":"ContainerDied","Data":"57501946401dc2cd82e83fdfe55200af99021693b25604e4d9ebda977327b8cf"} Mar 18 07:01:40 crc kubenswrapper[4917]: I0318 07:01:40.753925 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzrp4" event={"ID":"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928","Type":"ContainerStarted","Data":"b4d1e0b490700030c6936e0741d495e245f022a108a30bfe1163a8f3573d520a"} Mar 18 07:01:40 crc kubenswrapper[4917]: I0318 07:01:40.769434 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zzrp4" podStartSLOduration=2.347330151 podStartE2EDuration="3.76941699s" podCreationTimestamp="2026-03-18 07:01:37 +0000 UTC" firstStartedPulling="2026-03-18 07:01:38.715673181 +0000 UTC m=+883.656827895" lastFinishedPulling="2026-03-18 07:01:40.13776002 +0000 UTC m=+885.078914734" observedRunningTime="2026-03-18 07:01:40.76901215 +0000 UTC m=+885.710166854" watchObservedRunningTime="2026-03-18 07:01:40.76941699 +0000 UTC 
m=+885.710571704" Mar 18 07:01:43 crc kubenswrapper[4917]: I0318 07:01:43.776843 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f428b53-24d1-4ec3-8d38-271e43a0e71d" containerID="5c628893a8c3366e15141a39970b4ed0ddb4b02b5755880fdd91e85d9e5f2df0" exitCode=0 Mar 18 07:01:43 crc kubenswrapper[4917]: I0318 07:01:43.785891 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:43 crc kubenswrapper[4917]: I0318 07:01:43.785950 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerDied","Data":"5c628893a8c3366e15141a39970b4ed0ddb4b02b5755880fdd91e85d9e5f2df0"} Mar 18 07:01:43 crc kubenswrapper[4917]: I0318 07:01:43.785993 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" event={"ID":"ba0f2128-2563-4331-9c56-c7ea7a46b0d5","Type":"ContainerStarted","Data":"ddda0491dc1591cca2be8625381ad95876940a95431e8dc86e89db5e81fd1740"} Mar 18 07:01:43 crc kubenswrapper[4917]: I0318 07:01:43.865388 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" podStartSLOduration=1.661289054 podStartE2EDuration="8.865365166s" podCreationTimestamp="2026-03-18 07:01:35 +0000 UTC" firstStartedPulling="2026-03-18 07:01:36.116073435 +0000 UTC m=+881.057228149" lastFinishedPulling="2026-03-18 07:01:43.320149537 +0000 UTC m=+888.261304261" observedRunningTime="2026-03-18 07:01:43.85410646 +0000 UTC m=+888.795261284" watchObservedRunningTime="2026-03-18 07:01:43.865365166 +0000 UTC m=+888.806519920" Mar 18 07:01:44 crc kubenswrapper[4917]: I0318 07:01:44.792509 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f428b53-24d1-4ec3-8d38-271e43a0e71d" containerID="2ae06e07fb74557909d48496b717f1f12337f0231a246e4ffaf0d81f8009b298" exitCode=0 Mar 18 
07:01:44 crc kubenswrapper[4917]: I0318 07:01:44.792850 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerDied","Data":"2ae06e07fb74557909d48496b717f1f12337f0231a246e4ffaf0d81f8009b298"} Mar 18 07:01:45 crc kubenswrapper[4917]: I0318 07:01:45.801481 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f428b53-24d1-4ec3-8d38-271e43a0e71d" containerID="b8bf06478a69448522a499ae156b9bd87d4c375a680b84b068631c150b96d256" exitCode=0 Mar 18 07:01:45 crc kubenswrapper[4917]: I0318 07:01:45.801551 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerDied","Data":"b8bf06478a69448522a499ae156b9bd87d4c375a680b84b068631c150b96d256"} Mar 18 07:01:46 crc kubenswrapper[4917]: I0318 07:01:46.815688 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerStarted","Data":"98bfa4e0805fff2de17bc6a74c2f5298e579f4ea24fd5689b8c1cef147497291"} Mar 18 07:01:46 crc kubenswrapper[4917]: I0318 07:01:46.816045 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerStarted","Data":"2d1f23d714cea293ebd92701343503260a6c1d43063615e84a900b0b7115c806"} Mar 18 07:01:46 crc kubenswrapper[4917]: I0318 07:01:46.816066 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerStarted","Data":"91f53049ad401aef543c3ce030c71f44c0b22662f46eefb8d0f6506768d487e7"} Mar 18 07:01:46 crc kubenswrapper[4917]: I0318 07:01:46.816084 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" 
event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerStarted","Data":"9cdaf9cfb2bdb3af966ba21c855d4b71f8a37b88e4a0072a815150ed7a273543"} Mar 18 07:01:47 crc kubenswrapper[4917]: I0318 07:01:47.409901 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-j9g22" Mar 18 07:01:47 crc kubenswrapper[4917]: I0318 07:01:47.824470 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerStarted","Data":"39acbfbfa6b4b40c5f1fc08a7e6fecc1abc3aa2a5e2cbf479c5deba08ccaa5fd"} Mar 18 07:01:47 crc kubenswrapper[4917]: I0318 07:01:47.824751 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:47 crc kubenswrapper[4917]: I0318 07:01:47.824764 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-24wx8" event={"ID":"4f428b53-24d1-4ec3-8d38-271e43a0e71d","Type":"ContainerStarted","Data":"9d643c659baa2becd8bc5068f1ec14891129f68cec1134043b6b17b2b92288aa"} Mar 18 07:01:47 crc kubenswrapper[4917]: I0318 07:01:47.831306 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:47 crc kubenswrapper[4917]: I0318 07:01:47.831371 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:47 crc kubenswrapper[4917]: I0318 07:01:47.853105 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-24wx8" podStartSLOduration=5.492936341 podStartE2EDuration="12.853085287s" podCreationTimestamp="2026-03-18 07:01:35 +0000 UTC" firstStartedPulling="2026-03-18 07:01:35.966771695 +0000 UTC m=+880.907926429" lastFinishedPulling="2026-03-18 07:01:43.326920651 +0000 UTC m=+888.268075375" observedRunningTime="2026-03-18 07:01:47.849734696 +0000 UTC 
m=+892.790889430" watchObservedRunningTime="2026-03-18 07:01:47.853085287 +0000 UTC m=+892.794240001" Mar 18 07:01:47 crc kubenswrapper[4917]: I0318 07:01:47.898974 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:48 crc kubenswrapper[4917]: I0318 07:01:48.911209 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.165642 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp"] Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.167045 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.170599 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.179949 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp"] Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.271339 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.271779 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.271807 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxskb\" (UniqueName: \"kubernetes.io/projected/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-kube-api-access-fxskb\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.372821 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.372917 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.372945 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxskb\" (UniqueName: \"kubernetes.io/projected/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-kube-api-access-fxskb\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.373616 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.373834 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.397902 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxskb\" (UniqueName: \"kubernetes.io/projected/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-kube-api-access-fxskb\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.487682 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:49 crc kubenswrapper[4917]: I0318 07:01:49.934096 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp"] Mar 18 07:01:49 crc kubenswrapper[4917]: W0318 07:01:49.944245 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f91b9a4_bcdc_4f7f_a5e9_b6a3706f0a0a.slice/crio-78f4b29d9587f74d6a101871bb6dcf76ceb4bfd995955bd42f12e97e97d51e2f WatchSource:0}: Error finding container 78f4b29d9587f74d6a101871bb6dcf76ceb4bfd995955bd42f12e97e97d51e2f: Status 404 returned error can't find the container with id 78f4b29d9587f74d6a101871bb6dcf76ceb4bfd995955bd42f12e97e97d51e2f Mar 18 07:01:50 crc kubenswrapper[4917]: I0318 07:01:50.809401 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:50 crc kubenswrapper[4917]: I0318 07:01:50.848775 4917 generic.go:334] "Generic (PLEG): container finished" podID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerID="108b1b4bafd9e636859c0882e06135933dc93e3ee07d52e78d93be351616de9a" exitCode=0 Mar 18 07:01:50 crc kubenswrapper[4917]: I0318 07:01:50.848867 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" event={"ID":"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a","Type":"ContainerDied","Data":"108b1b4bafd9e636859c0882e06135933dc93e3ee07d52e78d93be351616de9a"} Mar 18 07:01:50 crc kubenswrapper[4917]: I0318 07:01:50.848950 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" 
event={"ID":"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a","Type":"ContainerStarted","Data":"78f4b29d9587f74d6a101871bb6dcf76ceb4bfd995955bd42f12e97e97d51e2f"} Mar 18 07:01:50 crc kubenswrapper[4917]: I0318 07:01:50.873919 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:52 crc kubenswrapper[4917]: I0318 07:01:52.095103 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzrp4"] Mar 18 07:01:52 crc kubenswrapper[4917]: I0318 07:01:52.095343 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zzrp4" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerName="registry-server" containerID="cri-o://b4d1e0b490700030c6936e0741d495e245f022a108a30bfe1163a8f3573d520a" gracePeriod=2 Mar 18 07:01:52 crc kubenswrapper[4917]: I0318 07:01:52.867843 4917 generic.go:334] "Generic (PLEG): container finished" podID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerID="b4d1e0b490700030c6936e0741d495e245f022a108a30bfe1163a8f3573d520a" exitCode=0 Mar 18 07:01:52 crc kubenswrapper[4917]: I0318 07:01:52.867919 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzrp4" event={"ID":"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928","Type":"ContainerDied","Data":"b4d1e0b490700030c6936e0741d495e245f022a108a30bfe1163a8f3573d520a"} Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.088340 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.250718 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-catalog-content\") pod \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.250814 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57dtq\" (UniqueName: \"kubernetes.io/projected/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-kube-api-access-57dtq\") pod \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.250876 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-utilities\") pod \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\" (UID: \"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928\") " Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.252469 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-utilities" (OuterVolumeSpecName: "utilities") pod "1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" (UID: "1e5e9ab1-f5ac-4076-a1c8-f95a86c84928"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.257546 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-kube-api-access-57dtq" (OuterVolumeSpecName: "kube-api-access-57dtq") pod "1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" (UID: "1e5e9ab1-f5ac-4076-a1c8-f95a86c84928"). InnerVolumeSpecName "kube-api-access-57dtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.278662 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" (UID: "1e5e9ab1-f5ac-4076-a1c8-f95a86c84928"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.352372 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57dtq\" (UniqueName: \"kubernetes.io/projected/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-kube-api-access-57dtq\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.352404 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.352414 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.887166 4917 generic.go:334] "Generic (PLEG): container finished" podID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerID="66be5b5530db210cf11e38063fc619e6c60c048998dfadb64b42f9c2d99d60d5" exitCode=0 Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.887222 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" event={"ID":"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a","Type":"ContainerDied","Data":"66be5b5530db210cf11e38063fc619e6c60c048998dfadb64b42f9c2d99d60d5"} Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.891634 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zzrp4" event={"ID":"1e5e9ab1-f5ac-4076-a1c8-f95a86c84928","Type":"ContainerDied","Data":"3539c31ffda9fe48a6959bf9471a70a9494ef8a2bcb126809fab762925bc1445"} Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.891717 4917 scope.go:117] "RemoveContainer" containerID="b4d1e0b490700030c6936e0741d495e245f022a108a30bfe1163a8f3573d520a" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.891954 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zzrp4" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.924775 4917 scope.go:117] "RemoveContainer" containerID="57501946401dc2cd82e83fdfe55200af99021693b25604e4d9ebda977327b8cf" Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.977791 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzrp4"] Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.986651 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zzrp4"] Mar 18 07:01:54 crc kubenswrapper[4917]: I0318 07:01:54.989470 4917 scope.go:117] "RemoveContainer" containerID="0e1b1f453cbbe2cbae3fe868afdf7284acc3d3081f37a06d050a8f64335e8aee" Mar 18 07:01:55 crc kubenswrapper[4917]: I0318 07:01:55.792351 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" path="/var/lib/kubelet/pods/1e5e9ab1-f5ac-4076-a1c8-f95a86c84928/volumes" Mar 18 07:01:55 crc kubenswrapper[4917]: I0318 07:01:55.811948 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-24wx8" Mar 18 07:01:55 crc kubenswrapper[4917]: I0318 07:01:55.843391 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gc44k" Mar 18 07:01:55 crc kubenswrapper[4917]: I0318 
07:01:55.910465 4917 generic.go:334] "Generic (PLEG): container finished" podID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerID="85414301083eed91418267f6f4e4a36bdbbf3f3ace43658597a3803c10581db4" exitCode=0 Mar 18 07:01:55 crc kubenswrapper[4917]: I0318 07:01:55.910554 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" event={"ID":"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a","Type":"ContainerDied","Data":"85414301083eed91418267f6f4e4a36bdbbf3f3ace43658597a3803c10581db4"} Mar 18 07:01:55 crc kubenswrapper[4917]: I0318 07:01:55.920232 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-zscqs" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.251784 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.399818 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxskb\" (UniqueName: \"kubernetes.io/projected/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-kube-api-access-fxskb\") pod \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.400035 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-bundle\") pod \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\" (UID: \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.400072 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-util\") pod \"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\" (UID: 
\"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a\") " Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.401210 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-bundle" (OuterVolumeSpecName: "bundle") pod "8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" (UID: "8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.406859 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-kube-api-access-fxskb" (OuterVolumeSpecName: "kube-api-access-fxskb") pod "8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" (UID: "8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a"). InnerVolumeSpecName "kube-api-access-fxskb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.415478 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-util" (OuterVolumeSpecName: "util") pod "8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" (UID: "8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.501829 4917 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.501863 4917 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-util\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.501872 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxskb\" (UniqueName: \"kubernetes.io/projected/8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a-kube-api-access-fxskb\") on node \"crc\" DevicePath \"\"" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.932150 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" event={"ID":"8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a","Type":"ContainerDied","Data":"78f4b29d9587f74d6a101871bb6dcf76ceb4bfd995955bd42f12e97e97d51e2f"} Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.932211 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78f4b29d9587f74d6a101871bb6dcf76ceb4bfd995955bd42f12e97e97d51e2f" Mar 18 07:01:57 crc kubenswrapper[4917]: I0318 07:01:57.932302 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.182929 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563622-qxl74"] Mar 18 07:02:00 crc kubenswrapper[4917]: E0318 07:02:00.184705 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerName="extract-content" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.184798 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerName="extract-content" Mar 18 07:02:00 crc kubenswrapper[4917]: E0318 07:02:00.184876 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerName="extract-utilities" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.184950 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerName="extract-utilities" Mar 18 07:02:00 crc kubenswrapper[4917]: E0318 07:02:00.185030 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerName="registry-server" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.185189 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerName="registry-server" Mar 18 07:02:00 crc kubenswrapper[4917]: E0318 07:02:00.185274 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerName="util" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.185358 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerName="util" Mar 18 07:02:00 crc kubenswrapper[4917]: E0318 07:02:00.185462 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerName="extract" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.185535 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerName="extract" Mar 18 07:02:00 crc kubenswrapper[4917]: E0318 07:02:00.185634 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerName="pull" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.185708 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerName="pull" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.185900 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e5e9ab1-f5ac-4076-a1c8-f95a86c84928" containerName="registry-server" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.186001 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a" containerName="extract" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.186523 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563622-qxl74" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.191383 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.191672 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.194879 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.219821 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563622-qxl74"] Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.348331 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9swf\" (UniqueName: \"kubernetes.io/projected/1bdcf478-11b5-4797-8f30-ba7446642b04-kube-api-access-k9swf\") pod \"auto-csr-approver-29563622-qxl74\" (UID: \"1bdcf478-11b5-4797-8f30-ba7446642b04\") " pod="openshift-infra/auto-csr-approver-29563622-qxl74" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.449462 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9swf\" (UniqueName: \"kubernetes.io/projected/1bdcf478-11b5-4797-8f30-ba7446642b04-kube-api-access-k9swf\") pod \"auto-csr-approver-29563622-qxl74\" (UID: \"1bdcf478-11b5-4797-8f30-ba7446642b04\") " pod="openshift-infra/auto-csr-approver-29563622-qxl74" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.469706 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9swf\" (UniqueName: \"kubernetes.io/projected/1bdcf478-11b5-4797-8f30-ba7446642b04-kube-api-access-k9swf\") pod \"auto-csr-approver-29563622-qxl74\" (UID: \"1bdcf478-11b5-4797-8f30-ba7446642b04\") " 
pod="openshift-infra/auto-csr-approver-29563622-qxl74" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.499927 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563622-qxl74" Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.782738 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563622-qxl74"] Mar 18 07:02:00 crc kubenswrapper[4917]: W0318 07:02:00.786970 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bdcf478_11b5_4797_8f30_ba7446642b04.slice/crio-1cf5b9cca6d0f70f5ba26e06a00abb69b1f1a73fba32fe859202f7d8d9d5455d WatchSource:0}: Error finding container 1cf5b9cca6d0f70f5ba26e06a00abb69b1f1a73fba32fe859202f7d8d9d5455d: Status 404 returned error can't find the container with id 1cf5b9cca6d0f70f5ba26e06a00abb69b1f1a73fba32fe859202f7d8d9d5455d Mar 18 07:02:00 crc kubenswrapper[4917]: I0318 07:02:00.949508 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563622-qxl74" event={"ID":"1bdcf478-11b5-4797-8f30-ba7446642b04","Type":"ContainerStarted","Data":"1cf5b9cca6d0f70f5ba26e06a00abb69b1f1a73fba32fe859202f7d8d9d5455d"} Mar 18 07:02:01 crc kubenswrapper[4917]: I0318 07:02:01.955697 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563622-qxl74" event={"ID":"1bdcf478-11b5-4797-8f30-ba7446642b04","Type":"ContainerStarted","Data":"4250ccdf8ecf8d41da280ff13cad4625a476403d650a3122ccab601098fea96e"} Mar 18 07:02:01 crc kubenswrapper[4917]: I0318 07:02:01.972249 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563622-qxl74" podStartSLOduration=1.064207425 podStartE2EDuration="1.972233647s" podCreationTimestamp="2026-03-18 07:02:00 +0000 UTC" firstStartedPulling="2026-03-18 07:02:00.789631214 +0000 UTC 
m=+905.730785928" lastFinishedPulling="2026-03-18 07:02:01.697657436 +0000 UTC m=+906.638812150" observedRunningTime="2026-03-18 07:02:01.96906077 +0000 UTC m=+906.910215484" watchObservedRunningTime="2026-03-18 07:02:01.972233647 +0000 UTC m=+906.913388361" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.592641 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk"] Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.593500 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.596057 4917 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-pzsf5" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.596453 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.596922 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.668272 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk"] Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.680411 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-466v7\" (UniqueName: \"kubernetes.io/projected/1dba3b1e-b109-4881-aaf7-4787b344c84a-kube-api-access-466v7\") pod \"cert-manager-operator-controller-manager-66c8bdd694-sjqrk\" (UID: \"1dba3b1e-b109-4881-aaf7-4787b344c84a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" Mar 18 07:02:02 crc kubenswrapper[4917]: 
I0318 07:02:02.680538 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1dba3b1e-b109-4881-aaf7-4787b344c84a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-sjqrk\" (UID: \"1dba3b1e-b109-4881-aaf7-4787b344c84a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.781327 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-466v7\" (UniqueName: \"kubernetes.io/projected/1dba3b1e-b109-4881-aaf7-4787b344c84a-kube-api-access-466v7\") pod \"cert-manager-operator-controller-manager-66c8bdd694-sjqrk\" (UID: \"1dba3b1e-b109-4881-aaf7-4787b344c84a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.781430 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1dba3b1e-b109-4881-aaf7-4787b344c84a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-sjqrk\" (UID: \"1dba3b1e-b109-4881-aaf7-4787b344c84a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.781921 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1dba3b1e-b109-4881-aaf7-4787b344c84a-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-sjqrk\" (UID: \"1dba3b1e-b109-4881-aaf7-4787b344c84a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.816544 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-466v7\" (UniqueName: 
\"kubernetes.io/projected/1dba3b1e-b109-4881-aaf7-4787b344c84a-kube-api-access-466v7\") pod \"cert-manager-operator-controller-manager-66c8bdd694-sjqrk\" (UID: \"1dba3b1e-b109-4881-aaf7-4787b344c84a\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.911075 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.929449 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.929495 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.961862 4917 generic.go:334] "Generic (PLEG): container finished" podID="1bdcf478-11b5-4797-8f30-ba7446642b04" containerID="4250ccdf8ecf8d41da280ff13cad4625a476403d650a3122ccab601098fea96e" exitCode=0 Mar 18 07:02:02 crc kubenswrapper[4917]: I0318 07:02:02.961901 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563622-qxl74" event={"ID":"1bdcf478-11b5-4797-8f30-ba7446642b04","Type":"ContainerDied","Data":"4250ccdf8ecf8d41da280ff13cad4625a476403d650a3122ccab601098fea96e"} Mar 18 07:02:03 crc kubenswrapper[4917]: I0318 07:02:03.132351 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk"] Mar 18 07:02:03 crc kubenswrapper[4917]: W0318 07:02:03.143035 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dba3b1e_b109_4881_aaf7_4787b344c84a.slice/crio-f9e9347b7f21f736bd1250fc433e44aea29f61f2f620384b43a1e737b93ecea4 WatchSource:0}: Error finding container f9e9347b7f21f736bd1250fc433e44aea29f61f2f620384b43a1e737b93ecea4: Status 404 returned error can't find the container with id f9e9347b7f21f736bd1250fc433e44aea29f61f2f620384b43a1e737b93ecea4 Mar 18 07:02:03 crc kubenswrapper[4917]: I0318 07:02:03.970933 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" event={"ID":"1dba3b1e-b109-4881-aaf7-4787b344c84a","Type":"ContainerStarted","Data":"f9e9347b7f21f736bd1250fc433e44aea29f61f2f620384b43a1e737b93ecea4"} Mar 18 07:02:04 crc kubenswrapper[4917]: I0318 07:02:04.237107 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563622-qxl74" Mar 18 07:02:04 crc kubenswrapper[4917]: I0318 07:02:04.301501 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9swf\" (UniqueName: \"kubernetes.io/projected/1bdcf478-11b5-4797-8f30-ba7446642b04-kube-api-access-k9swf\") pod \"1bdcf478-11b5-4797-8f30-ba7446642b04\" (UID: \"1bdcf478-11b5-4797-8f30-ba7446642b04\") " Mar 18 07:02:04 crc kubenswrapper[4917]: I0318 07:02:04.315090 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bdcf478-11b5-4797-8f30-ba7446642b04-kube-api-access-k9swf" (OuterVolumeSpecName: "kube-api-access-k9swf") pod "1bdcf478-11b5-4797-8f30-ba7446642b04" (UID: "1bdcf478-11b5-4797-8f30-ba7446642b04"). InnerVolumeSpecName "kube-api-access-k9swf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:02:04 crc kubenswrapper[4917]: I0318 07:02:04.403034 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k9swf\" (UniqueName: \"kubernetes.io/projected/1bdcf478-11b5-4797-8f30-ba7446642b04-kube-api-access-k9swf\") on node \"crc\" DevicePath \"\"" Mar 18 07:02:04 crc kubenswrapper[4917]: I0318 07:02:04.979822 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563622-qxl74" event={"ID":"1bdcf478-11b5-4797-8f30-ba7446642b04","Type":"ContainerDied","Data":"1cf5b9cca6d0f70f5ba26e06a00abb69b1f1a73fba32fe859202f7d8d9d5455d"} Mar 18 07:02:04 crc kubenswrapper[4917]: I0318 07:02:04.979855 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563622-qxl74" Mar 18 07:02:04 crc kubenswrapper[4917]: I0318 07:02:04.979863 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cf5b9cca6d0f70f5ba26e06a00abb69b1f1a73fba32fe859202f7d8d9d5455d" Mar 18 07:02:05 crc kubenswrapper[4917]: I0318 07:02:05.020994 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563616-nx5gm"] Mar 18 07:02:05 crc kubenswrapper[4917]: I0318 07:02:05.024617 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563616-nx5gm"] Mar 18 07:02:05 crc kubenswrapper[4917]: I0318 07:02:05.784258 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dde11ad-4301-47f1-a433-9a1d37b7f482" path="/var/lib/kubelet/pods/0dde11ad-4301-47f1-a433-9a1d37b7f482/volumes" Mar 18 07:02:06 crc kubenswrapper[4917]: I0318 07:02:06.992999 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" 
event={"ID":"1dba3b1e-b109-4881-aaf7-4787b344c84a","Type":"ContainerStarted","Data":"42436cf1fd4502f51ef028b6503727a74e0991b887880ca7d9b6deb7efa83a7d"} Mar 18 07:02:07 crc kubenswrapper[4917]: I0318 07:02:07.008337 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-sjqrk" podStartSLOduration=1.634585212 podStartE2EDuration="5.008321195s" podCreationTimestamp="2026-03-18 07:02:02 +0000 UTC" firstStartedPulling="2026-03-18 07:02:03.145714207 +0000 UTC m=+908.086868921" lastFinishedPulling="2026-03-18 07:02:06.51945019 +0000 UTC m=+911.460604904" observedRunningTime="2026-03-18 07:02:07.007390553 +0000 UTC m=+911.948545257" watchObservedRunningTime="2026-03-18 07:02:07.008321195 +0000 UTC m=+911.949475919" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.721190 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-h8dkf"] Mar 18 07:02:10 crc kubenswrapper[4917]: E0318 07:02:10.722156 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdcf478-11b5-4797-8f30-ba7446642b04" containerName="oc" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.722177 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdcf478-11b5-4797-8f30-ba7446642b04" containerName="oc" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.722480 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdcf478-11b5-4797-8f30-ba7446642b04" containerName="oc" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.723156 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.730150 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.730163 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.730397 4917 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-z759c" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.750354 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-h8dkf"] Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.900311 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/487f44b8-8709-4c21-a458-5bf381a76858-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-h8dkf\" (UID: \"487f44b8-8709-4c21-a458-5bf381a76858\") " pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:10 crc kubenswrapper[4917]: I0318 07:02:10.900353 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2p9d\" (UniqueName: \"kubernetes.io/projected/487f44b8-8709-4c21-a458-5bf381a76858-kube-api-access-g2p9d\") pod \"cert-manager-webhook-6888856db4-h8dkf\" (UID: \"487f44b8-8709-4c21-a458-5bf381a76858\") " pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:11 crc kubenswrapper[4917]: I0318 07:02:11.002260 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/487f44b8-8709-4c21-a458-5bf381a76858-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-h8dkf\" (UID: 
\"487f44b8-8709-4c21-a458-5bf381a76858\") " pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:11 crc kubenswrapper[4917]: I0318 07:02:11.002314 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2p9d\" (UniqueName: \"kubernetes.io/projected/487f44b8-8709-4c21-a458-5bf381a76858-kube-api-access-g2p9d\") pod \"cert-manager-webhook-6888856db4-h8dkf\" (UID: \"487f44b8-8709-4c21-a458-5bf381a76858\") " pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:11 crc kubenswrapper[4917]: I0318 07:02:11.031320 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/487f44b8-8709-4c21-a458-5bf381a76858-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-h8dkf\" (UID: \"487f44b8-8709-4c21-a458-5bf381a76858\") " pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:11 crc kubenswrapper[4917]: I0318 07:02:11.031538 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2p9d\" (UniqueName: \"kubernetes.io/projected/487f44b8-8709-4c21-a458-5bf381a76858-kube-api-access-g2p9d\") pod \"cert-manager-webhook-6888856db4-h8dkf\" (UID: \"487f44b8-8709-4c21-a458-5bf381a76858\") " pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:11 crc kubenswrapper[4917]: I0318 07:02:11.103043 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:11 crc kubenswrapper[4917]: I0318 07:02:11.545407 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-h8dkf"] Mar 18 07:02:12 crc kubenswrapper[4917]: I0318 07:02:12.020009 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" event={"ID":"487f44b8-8709-4c21-a458-5bf381a76858","Type":"ContainerStarted","Data":"92a38b7891e92b689407476074d34c993516ad3f8484ee75c67f5aa3f3d50eee"} Mar 18 07:02:13 crc kubenswrapper[4917]: I0318 07:02:13.854945 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-wkfj5"] Mar 18 07:02:13 crc kubenswrapper[4917]: I0318 07:02:13.855847 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" Mar 18 07:02:13 crc kubenswrapper[4917]: I0318 07:02:13.857987 4917 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-g5qn9" Mar 18 07:02:13 crc kubenswrapper[4917]: I0318 07:02:13.864240 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-wkfj5"] Mar 18 07:02:14 crc kubenswrapper[4917]: I0318 07:02:14.042105 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz87g\" (UniqueName: \"kubernetes.io/projected/1690baaf-577c-4ce7-afe8-480821bb1419-kube-api-access-fz87g\") pod \"cert-manager-cainjector-5545bd876-wkfj5\" (UID: \"1690baaf-577c-4ce7-afe8-480821bb1419\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" Mar 18 07:02:14 crc kubenswrapper[4917]: I0318 07:02:14.042464 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/1690baaf-577c-4ce7-afe8-480821bb1419-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-wkfj5\" (UID: \"1690baaf-577c-4ce7-afe8-480821bb1419\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" Mar 18 07:02:14 crc kubenswrapper[4917]: I0318 07:02:14.143415 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1690baaf-577c-4ce7-afe8-480821bb1419-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-wkfj5\" (UID: \"1690baaf-577c-4ce7-afe8-480821bb1419\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" Mar 18 07:02:14 crc kubenswrapper[4917]: I0318 07:02:14.143535 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz87g\" (UniqueName: \"kubernetes.io/projected/1690baaf-577c-4ce7-afe8-480821bb1419-kube-api-access-fz87g\") pod \"cert-manager-cainjector-5545bd876-wkfj5\" (UID: \"1690baaf-577c-4ce7-afe8-480821bb1419\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" Mar 18 07:02:14 crc kubenswrapper[4917]: I0318 07:02:14.161056 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1690baaf-577c-4ce7-afe8-480821bb1419-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-wkfj5\" (UID: \"1690baaf-577c-4ce7-afe8-480821bb1419\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" Mar 18 07:02:14 crc kubenswrapper[4917]: I0318 07:02:14.161284 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz87g\" (UniqueName: \"kubernetes.io/projected/1690baaf-577c-4ce7-afe8-480821bb1419-kube-api-access-fz87g\") pod \"cert-manager-cainjector-5545bd876-wkfj5\" (UID: \"1690baaf-577c-4ce7-afe8-480821bb1419\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" Mar 18 07:02:14 crc kubenswrapper[4917]: I0318 07:02:14.181743 4917 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" Mar 18 07:02:14 crc kubenswrapper[4917]: I0318 07:02:14.590302 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-wkfj5"] Mar 18 07:02:17 crc kubenswrapper[4917]: I0318 07:02:17.055708 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" event={"ID":"1690baaf-577c-4ce7-afe8-480821bb1419","Type":"ContainerStarted","Data":"40e31d6e50de62f5e6505e05f3e4edc51e6cab7cef7967e70ac3e3a8f6693649"} Mar 18 07:02:17 crc kubenswrapper[4917]: I0318 07:02:17.058353 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" event={"ID":"487f44b8-8709-4c21-a458-5bf381a76858","Type":"ContainerStarted","Data":"58c3e8dc7861c64ed4f3bcdce99644739341cfbc4b8cd04b3d12739bd3794fe8"} Mar 18 07:02:17 crc kubenswrapper[4917]: I0318 07:02:17.058852 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:17 crc kubenswrapper[4917]: I0318 07:02:17.085541 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" podStartSLOduration=2.435340081 podStartE2EDuration="7.085516786s" podCreationTimestamp="2026-03-18 07:02:10 +0000 UTC" firstStartedPulling="2026-03-18 07:02:11.554514164 +0000 UTC m=+916.495668878" lastFinishedPulling="2026-03-18 07:02:16.204690859 +0000 UTC m=+921.145845583" observedRunningTime="2026-03-18 07:02:17.081213585 +0000 UTC m=+922.022368349" watchObservedRunningTime="2026-03-18 07:02:17.085516786 +0000 UTC m=+922.026671540" Mar 18 07:02:18 crc kubenswrapper[4917]: I0318 07:02:18.066341 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" 
event={"ID":"1690baaf-577c-4ce7-afe8-480821bb1419","Type":"ContainerStarted","Data":"7ccefd5632df6426e92e7f08e0dd6245fc512318b83a5521fb0ff64b2256ba2b"} Mar 18 07:02:18 crc kubenswrapper[4917]: I0318 07:02:18.094018 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-wkfj5" podStartSLOduration=3.367004879 podStartE2EDuration="5.093997033s" podCreationTimestamp="2026-03-18 07:02:13 +0000 UTC" firstStartedPulling="2026-03-18 07:02:16.138429832 +0000 UTC m=+921.079584546" lastFinishedPulling="2026-03-18 07:02:17.865421946 +0000 UTC m=+922.806576700" observedRunningTime="2026-03-18 07:02:18.091183838 +0000 UTC m=+923.032338572" watchObservedRunningTime="2026-03-18 07:02:18.093997033 +0000 UTC m=+923.035151777" Mar 18 07:02:21 crc kubenswrapper[4917]: I0318 07:02:21.118095 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-h8dkf" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.632084 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-hf4fm"] Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.633629 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-hf4fm" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.636731 4917 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-k5pl7" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.644031 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-hf4fm"] Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.674063 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24c2a3d3-a81f-4920-bef3-7def59acacec-bound-sa-token\") pod \"cert-manager-545d4d4674-hf4fm\" (UID: \"24c2a3d3-a81f-4920-bef3-7def59acacec\") " pod="cert-manager/cert-manager-545d4d4674-hf4fm" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.674180 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpmbg\" (UniqueName: \"kubernetes.io/projected/24c2a3d3-a81f-4920-bef3-7def59acacec-kube-api-access-zpmbg\") pod \"cert-manager-545d4d4674-hf4fm\" (UID: \"24c2a3d3-a81f-4920-bef3-7def59acacec\") " pod="cert-manager/cert-manager-545d4d4674-hf4fm" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.775617 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpmbg\" (UniqueName: \"kubernetes.io/projected/24c2a3d3-a81f-4920-bef3-7def59acacec-kube-api-access-zpmbg\") pod \"cert-manager-545d4d4674-hf4fm\" (UID: \"24c2a3d3-a81f-4920-bef3-7def59acacec\") " pod="cert-manager/cert-manager-545d4d4674-hf4fm" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.775738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24c2a3d3-a81f-4920-bef3-7def59acacec-bound-sa-token\") pod \"cert-manager-545d4d4674-hf4fm\" (UID: 
\"24c2a3d3-a81f-4920-bef3-7def59acacec\") " pod="cert-manager/cert-manager-545d4d4674-hf4fm" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.809472 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24c2a3d3-a81f-4920-bef3-7def59acacec-bound-sa-token\") pod \"cert-manager-545d4d4674-hf4fm\" (UID: \"24c2a3d3-a81f-4920-bef3-7def59acacec\") " pod="cert-manager/cert-manager-545d4d4674-hf4fm" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.810087 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpmbg\" (UniqueName: \"kubernetes.io/projected/24c2a3d3-a81f-4920-bef3-7def59acacec-kube-api-access-zpmbg\") pod \"cert-manager-545d4d4674-hf4fm\" (UID: \"24c2a3d3-a81f-4920-bef3-7def59acacec\") " pod="cert-manager/cert-manager-545d4d4674-hf4fm" Mar 18 07:02:29 crc kubenswrapper[4917]: I0318 07:02:29.951893 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-hf4fm" Mar 18 07:02:30 crc kubenswrapper[4917]: I0318 07:02:30.463194 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-hf4fm"] Mar 18 07:02:31 crc kubenswrapper[4917]: I0318 07:02:31.202893 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-hf4fm" event={"ID":"24c2a3d3-a81f-4920-bef3-7def59acacec","Type":"ContainerStarted","Data":"4d239dae8b76bfdbfb717b52dfcd36a8db1c8a86d98789b4f8c5667c972f622b"} Mar 18 07:02:31 crc kubenswrapper[4917]: I0318 07:02:31.203312 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-hf4fm" event={"ID":"24c2a3d3-a81f-4920-bef3-7def59acacec","Type":"ContainerStarted","Data":"292080a4407403ac59d42187f3e5a583301cc5dd7883688a5638ed9f7b27ae4f"} Mar 18 07:02:31 crc kubenswrapper[4917]: I0318 07:02:31.232182 4917 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-545d4d4674-hf4fm" podStartSLOduration=2.232164656 podStartE2EDuration="2.232164656s" podCreationTimestamp="2026-03-18 07:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:02:31.226613296 +0000 UTC m=+936.167768020" watchObservedRunningTime="2026-03-18 07:02:31.232164656 +0000 UTC m=+936.173319380" Mar 18 07:02:32 crc kubenswrapper[4917]: I0318 07:02:32.929063 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:02:32 crc kubenswrapper[4917]: I0318 07:02:32.929136 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:02:32 crc kubenswrapper[4917]: I0318 07:02:32.929455 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:02:32 crc kubenswrapper[4917]: I0318 07:02:32.930098 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1fa455e15b1a756345723f1e179413cfc4b43062d137c44fce060567289753f"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:02:32 crc kubenswrapper[4917]: I0318 07:02:32.930180 4917 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://d1fa455e15b1a756345723f1e179413cfc4b43062d137c44fce060567289753f" gracePeriod=600 Mar 18 07:02:33 crc kubenswrapper[4917]: I0318 07:02:33.226352 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="d1fa455e15b1a756345723f1e179413cfc4b43062d137c44fce060567289753f" exitCode=0 Mar 18 07:02:33 crc kubenswrapper[4917]: I0318 07:02:33.226842 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"d1fa455e15b1a756345723f1e179413cfc4b43062d137c44fce060567289753f"} Mar 18 07:02:33 crc kubenswrapper[4917]: I0318 07:02:33.227212 4917 scope.go:117] "RemoveContainer" containerID="db29f517217de042b6ac5da88be9d3c468c408a7daa895e65fa34100384451a5" Mar 18 07:02:34 crc kubenswrapper[4917]: I0318 07:02:34.242120 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"d0854ed049fe9b7cfdc3675b0efe75d5c58d0e2de88a3782f41bc4ebb18b9f74"} Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.068332 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-rz4fx"] Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.069845 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rz4fx" Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.072561 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-w9c5l" Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.073330 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.073436 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.081956 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rz4fx"] Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.280131 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5ct8\" (UniqueName: \"kubernetes.io/projected/d435652d-acf5-4a71-8bbb-81296e5c91a5-kube-api-access-r5ct8\") pod \"openstack-operator-index-rz4fx\" (UID: \"d435652d-acf5-4a71-8bbb-81296e5c91a5\") " pod="openstack-operators/openstack-operator-index-rz4fx" Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.381214 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5ct8\" (UniqueName: \"kubernetes.io/projected/d435652d-acf5-4a71-8bbb-81296e5c91a5-kube-api-access-r5ct8\") pod \"openstack-operator-index-rz4fx\" (UID: \"d435652d-acf5-4a71-8bbb-81296e5c91a5\") " pod="openstack-operators/openstack-operator-index-rz4fx" Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.403990 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5ct8\" (UniqueName: \"kubernetes.io/projected/d435652d-acf5-4a71-8bbb-81296e5c91a5-kube-api-access-r5ct8\") pod \"openstack-operator-index-rz4fx\" (UID: 
\"d435652d-acf5-4a71-8bbb-81296e5c91a5\") " pod="openstack-operators/openstack-operator-index-rz4fx" Mar 18 07:02:35 crc kubenswrapper[4917]: I0318 07:02:35.694868 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rz4fx" Mar 18 07:02:36 crc kubenswrapper[4917]: I0318 07:02:36.148340 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-rz4fx"] Mar 18 07:02:36 crc kubenswrapper[4917]: I0318 07:02:36.254395 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rz4fx" event={"ID":"d435652d-acf5-4a71-8bbb-81296e5c91a5","Type":"ContainerStarted","Data":"924454b6bac2d02c4a5f239b506771d560ed0897af69f51c5010c2a09f94eeb8"} Mar 18 07:02:38 crc kubenswrapper[4917]: I0318 07:02:38.274411 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rz4fx" event={"ID":"d435652d-acf5-4a71-8bbb-81296e5c91a5","Type":"ContainerStarted","Data":"99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4"} Mar 18 07:02:38 crc kubenswrapper[4917]: I0318 07:02:38.301698 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-rz4fx" podStartSLOduration=2.257474272 podStartE2EDuration="3.301666982s" podCreationTimestamp="2026-03-18 07:02:35 +0000 UTC" firstStartedPulling="2026-03-18 07:02:36.157278903 +0000 UTC m=+941.098433627" lastFinishedPulling="2026-03-18 07:02:37.201471613 +0000 UTC m=+942.142626337" observedRunningTime="2026-03-18 07:02:38.297372502 +0000 UTC m=+943.238527286" watchObservedRunningTime="2026-03-18 07:02:38.301666982 +0000 UTC m=+943.242821776" Mar 18 07:02:38 crc kubenswrapper[4917]: I0318 07:02:38.434288 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rz4fx"] Mar 18 07:02:39 crc kubenswrapper[4917]: I0318 07:02:39.041678 
4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mtzqz"] Mar 18 07:02:39 crc kubenswrapper[4917]: I0318 07:02:39.043271 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:39 crc kubenswrapper[4917]: I0318 07:02:39.057350 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mtzqz"] Mar 18 07:02:39 crc kubenswrapper[4917]: I0318 07:02:39.238322 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt4st\" (UniqueName: \"kubernetes.io/projected/ad827dce-95e9-4fc0-bcc8-14788aba8766-kube-api-access-xt4st\") pod \"openstack-operator-index-mtzqz\" (UID: \"ad827dce-95e9-4fc0-bcc8-14788aba8766\") " pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:39 crc kubenswrapper[4917]: I0318 07:02:39.339578 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt4st\" (UniqueName: \"kubernetes.io/projected/ad827dce-95e9-4fc0-bcc8-14788aba8766-kube-api-access-xt4st\") pod \"openstack-operator-index-mtzqz\" (UID: \"ad827dce-95e9-4fc0-bcc8-14788aba8766\") " pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:39 crc kubenswrapper[4917]: I0318 07:02:39.372531 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt4st\" (UniqueName: \"kubernetes.io/projected/ad827dce-95e9-4fc0-bcc8-14788aba8766-kube-api-access-xt4st\") pod \"openstack-operator-index-mtzqz\" (UID: \"ad827dce-95e9-4fc0-bcc8-14788aba8766\") " pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:39 crc kubenswrapper[4917]: I0318 07:02:39.379139 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:39 crc kubenswrapper[4917]: I0318 07:02:39.872838 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mtzqz"] Mar 18 07:02:39 crc kubenswrapper[4917]: W0318 07:02:39.878395 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad827dce_95e9_4fc0_bcc8_14788aba8766.slice/crio-febb6c29d943dbc119d94a3703af7687bf2dc8b83f154b3372c5ce3093d2b8fe WatchSource:0}: Error finding container febb6c29d943dbc119d94a3703af7687bf2dc8b83f154b3372c5ce3093d2b8fe: Status 404 returned error can't find the container with id febb6c29d943dbc119d94a3703af7687bf2dc8b83f154b3372c5ce3093d2b8fe Mar 18 07:02:40 crc kubenswrapper[4917]: I0318 07:02:40.291411 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtzqz" event={"ID":"ad827dce-95e9-4fc0-bcc8-14788aba8766","Type":"ContainerStarted","Data":"febb6c29d943dbc119d94a3703af7687bf2dc8b83f154b3372c5ce3093d2b8fe"} Mar 18 07:02:40 crc kubenswrapper[4917]: I0318 07:02:40.291558 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-rz4fx" podUID="d435652d-acf5-4a71-8bbb-81296e5c91a5" containerName="registry-server" containerID="cri-o://99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4" gracePeriod=2 Mar 18 07:02:40 crc kubenswrapper[4917]: I0318 07:02:40.759159 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-rz4fx" Mar 18 07:02:40 crc kubenswrapper[4917]: I0318 07:02:40.867335 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5ct8\" (UniqueName: \"kubernetes.io/projected/d435652d-acf5-4a71-8bbb-81296e5c91a5-kube-api-access-r5ct8\") pod \"d435652d-acf5-4a71-8bbb-81296e5c91a5\" (UID: \"d435652d-acf5-4a71-8bbb-81296e5c91a5\") " Mar 18 07:02:40 crc kubenswrapper[4917]: I0318 07:02:40.876374 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d435652d-acf5-4a71-8bbb-81296e5c91a5-kube-api-access-r5ct8" (OuterVolumeSpecName: "kube-api-access-r5ct8") pod "d435652d-acf5-4a71-8bbb-81296e5c91a5" (UID: "d435652d-acf5-4a71-8bbb-81296e5c91a5"). InnerVolumeSpecName "kube-api-access-r5ct8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:02:40 crc kubenswrapper[4917]: I0318 07:02:40.970853 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5ct8\" (UniqueName: \"kubernetes.io/projected/d435652d-acf5-4a71-8bbb-81296e5c91a5-kube-api-access-r5ct8\") on node \"crc\" DevicePath \"\"" Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.301081 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mtzqz" event={"ID":"ad827dce-95e9-4fc0-bcc8-14788aba8766","Type":"ContainerStarted","Data":"504f7b08f55780fbfb2f211370861d2c9fc0ba9e56ca6f23d9e2302d81f5b1f9"} Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.304495 4917 generic.go:334] "Generic (PLEG): container finished" podID="d435652d-acf5-4a71-8bbb-81296e5c91a5" containerID="99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4" exitCode=0 Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.304541 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rz4fx" 
event={"ID":"d435652d-acf5-4a71-8bbb-81296e5c91a5","Type":"ContainerDied","Data":"99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4"} Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.304573 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-rz4fx" event={"ID":"d435652d-acf5-4a71-8bbb-81296e5c91a5","Type":"ContainerDied","Data":"924454b6bac2d02c4a5f239b506771d560ed0897af69f51c5010c2a09f94eeb8"} Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.304624 4917 scope.go:117] "RemoveContainer" containerID="99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4" Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.304664 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-rz4fx" Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.327309 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mtzqz" podStartSLOduration=1.916549957 podStartE2EDuration="2.327265537s" podCreationTimestamp="2026-03-18 07:02:39 +0000 UTC" firstStartedPulling="2026-03-18 07:02:39.885242567 +0000 UTC m=+944.826397321" lastFinishedPulling="2026-03-18 07:02:40.295958147 +0000 UTC m=+945.237112901" observedRunningTime="2026-03-18 07:02:41.322833623 +0000 UTC m=+946.263988337" watchObservedRunningTime="2026-03-18 07:02:41.327265537 +0000 UTC m=+946.268420251" Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.333125 4917 scope.go:117] "RemoveContainer" containerID="99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4" Mar 18 07:02:41 crc kubenswrapper[4917]: E0318 07:02:41.333683 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4\": container with ID starting with 
99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4 not found: ID does not exist" containerID="99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4" Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.333722 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4"} err="failed to get container status \"99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4\": rpc error: code = NotFound desc = could not find container \"99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4\": container with ID starting with 99fe8ec4e084df44d2f29e6240c82728fed8b39e0a2798b8239b39810af8f0a4 not found: ID does not exist" Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.343930 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-rz4fx"] Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.348038 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-rz4fx"] Mar 18 07:02:41 crc kubenswrapper[4917]: I0318 07:02:41.785774 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d435652d-acf5-4a71-8bbb-81296e5c91a5" path="/var/lib/kubelet/pods/d435652d-acf5-4a71-8bbb-81296e5c91a5/volumes" Mar 18 07:02:49 crc kubenswrapper[4917]: I0318 07:02:49.380345 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:49 crc kubenswrapper[4917]: I0318 07:02:49.381983 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:49 crc kubenswrapper[4917]: I0318 07:02:49.430737 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:50 crc kubenswrapper[4917]: I0318 
07:02:50.433486 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mtzqz" Mar 18 07:02:54 crc kubenswrapper[4917]: I0318 07:02:54.970429 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv"] Mar 18 07:02:54 crc kubenswrapper[4917]: E0318 07:02:54.972130 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d435652d-acf5-4a71-8bbb-81296e5c91a5" containerName="registry-server" Mar 18 07:02:54 crc kubenswrapper[4917]: I0318 07:02:54.972172 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d435652d-acf5-4a71-8bbb-81296e5c91a5" containerName="registry-server" Mar 18 07:02:54 crc kubenswrapper[4917]: I0318 07:02:54.972450 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d435652d-acf5-4a71-8bbb-81296e5c91a5" containerName="registry-server" Mar 18 07:02:54 crc kubenswrapper[4917]: I0318 07:02:54.974098 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:54 crc kubenswrapper[4917]: I0318 07:02:54.995522 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-jnnll" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.007846 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv"] Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.100547 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-bundle\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.100795 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-util\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.100975 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj8s4\" (UniqueName: \"kubernetes.io/projected/305d3d2a-a9ee-443c-a91a-8c72ec108b60-kube-api-access-gj8s4\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 
07:02:55.203324 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-bundle\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.203416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-util\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.203482 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj8s4\" (UniqueName: \"kubernetes.io/projected/305d3d2a-a9ee-443c-a91a-8c72ec108b60-kube-api-access-gj8s4\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.203899 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-bundle\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.204044 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-util\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.230108 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj8s4\" (UniqueName: \"kubernetes.io/projected/305d3d2a-a9ee-443c-a91a-8c72ec108b60-kube-api-access-gj8s4\") pod \"dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.303116 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:02:55 crc kubenswrapper[4917]: I0318 07:02:55.587065 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv"] Mar 18 07:02:56 crc kubenswrapper[4917]: I0318 07:02:56.452311 4917 generic.go:334] "Generic (PLEG): container finished" podID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerID="0cc9de5bd9e4991efa60774975f99e3227fabd35c385abea7ba586d56d4cefab" exitCode=0 Mar 18 07:02:56 crc kubenswrapper[4917]: I0318 07:02:56.452417 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" event={"ID":"305d3d2a-a9ee-443c-a91a-8c72ec108b60","Type":"ContainerDied","Data":"0cc9de5bd9e4991efa60774975f99e3227fabd35c385abea7ba586d56d4cefab"} Mar 18 07:02:56 crc kubenswrapper[4917]: I0318 07:02:56.452917 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" event={"ID":"305d3d2a-a9ee-443c-a91a-8c72ec108b60","Type":"ContainerStarted","Data":"c0762893ba7d1960b78e001f69dd27ae0f4c2f12d55e02cdebdd193c7119fe1c"} Mar 18 07:02:56 crc kubenswrapper[4917]: I0318 07:02:56.524670 4917 scope.go:117] "RemoveContainer" containerID="5009656d9ccd2927642c0a8f3bd120b0a5772d75d6a890c507113c93b9aa92d0" Mar 18 07:02:58 crc kubenswrapper[4917]: I0318 07:02:58.475491 4917 generic.go:334] "Generic (PLEG): container finished" podID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerID="07f1e7980ff1b97c5de3b78a9c90c1ef97194fd8b74d975028d9fa689a3bba7e" exitCode=0 Mar 18 07:02:58 crc kubenswrapper[4917]: I0318 07:02:58.475627 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" event={"ID":"305d3d2a-a9ee-443c-a91a-8c72ec108b60","Type":"ContainerDied","Data":"07f1e7980ff1b97c5de3b78a9c90c1ef97194fd8b74d975028d9fa689a3bba7e"} Mar 18 07:02:59 crc kubenswrapper[4917]: I0318 07:02:59.488289 4917 generic.go:334] "Generic (PLEG): container finished" podID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerID="d7339e010baa20f46c2f98607170dded94e944595e37927cdec4eeb9786e8345" exitCode=0 Mar 18 07:02:59 crc kubenswrapper[4917]: I0318 07:02:59.488425 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" event={"ID":"305d3d2a-a9ee-443c-a91a-8c72ec108b60","Type":"ContainerDied","Data":"d7339e010baa20f46c2f98607170dded94e944595e37927cdec4eeb9786e8345"} Mar 18 07:03:00 crc kubenswrapper[4917]: I0318 07:03:00.861266 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:03:00 crc kubenswrapper[4917]: I0318 07:03:00.897567 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-bundle\") pod \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " Mar 18 07:03:00 crc kubenswrapper[4917]: I0318 07:03:00.897725 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj8s4\" (UniqueName: \"kubernetes.io/projected/305d3d2a-a9ee-443c-a91a-8c72ec108b60-kube-api-access-gj8s4\") pod \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " Mar 18 07:03:00 crc kubenswrapper[4917]: I0318 07:03:00.897825 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-util\") pod \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\" (UID: \"305d3d2a-a9ee-443c-a91a-8c72ec108b60\") " Mar 18 07:03:00 crc kubenswrapper[4917]: I0318 07:03:00.902224 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-bundle" (OuterVolumeSpecName: "bundle") pod "305d3d2a-a9ee-443c-a91a-8c72ec108b60" (UID: "305d3d2a-a9ee-443c-a91a-8c72ec108b60"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:03:00 crc kubenswrapper[4917]: I0318 07:03:00.909849 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305d3d2a-a9ee-443c-a91a-8c72ec108b60-kube-api-access-gj8s4" (OuterVolumeSpecName: "kube-api-access-gj8s4") pod "305d3d2a-a9ee-443c-a91a-8c72ec108b60" (UID: "305d3d2a-a9ee-443c-a91a-8c72ec108b60"). InnerVolumeSpecName "kube-api-access-gj8s4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:03:00 crc kubenswrapper[4917]: I0318 07:03:00.927858 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-util" (OuterVolumeSpecName: "util") pod "305d3d2a-a9ee-443c-a91a-8c72ec108b60" (UID: "305d3d2a-a9ee-443c-a91a-8c72ec108b60"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:03:01 crc kubenswrapper[4917]: I0318 07:03:01.000074 4917 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-util\") on node \"crc\" DevicePath \"\"" Mar 18 07:03:01 crc kubenswrapper[4917]: I0318 07:03:01.000107 4917 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/305d3d2a-a9ee-443c-a91a-8c72ec108b60-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:03:01 crc kubenswrapper[4917]: I0318 07:03:01.000121 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj8s4\" (UniqueName: \"kubernetes.io/projected/305d3d2a-a9ee-443c-a91a-8c72ec108b60-kube-api-access-gj8s4\") on node \"crc\" DevicePath \"\"" Mar 18 07:03:01 crc kubenswrapper[4917]: I0318 07:03:01.519765 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" event={"ID":"305d3d2a-a9ee-443c-a91a-8c72ec108b60","Type":"ContainerDied","Data":"c0762893ba7d1960b78e001f69dd27ae0f4c2f12d55e02cdebdd193c7119fe1c"} Mar 18 07:03:01 crc kubenswrapper[4917]: I0318 07:03:01.519856 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0762893ba7d1960b78e001f69dd27ae0f4c2f12d55e02cdebdd193c7119fe1c" Mar 18 07:03:01 crc kubenswrapper[4917]: I0318 07:03:01.519865 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.633328 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn"] Mar 18 07:03:07 crc kubenswrapper[4917]: E0318 07:03:07.634195 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerName="extract" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.634211 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerName="extract" Mar 18 07:03:07 crc kubenswrapper[4917]: E0318 07:03:07.634226 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerName="pull" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.634234 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerName="pull" Mar 18 07:03:07 crc kubenswrapper[4917]: E0318 07:03:07.634253 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerName="util" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.634263 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerName="util" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.634405 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="305d3d2a-a9ee-443c-a91a-8c72ec108b60" containerName="extract" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.634940 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.652958 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-n5j24" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.684620 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn"] Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.710453 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqxn8\" (UniqueName: \"kubernetes.io/projected/fc407819-4730-4ca0-9ccb-dcf8ba968fa9-kube-api-access-gqxn8\") pod \"openstack-operator-controller-init-57d974f4f8-dztqn\" (UID: \"fc407819-4730-4ca0-9ccb-dcf8ba968fa9\") " pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.811733 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqxn8\" (UniqueName: \"kubernetes.io/projected/fc407819-4730-4ca0-9ccb-dcf8ba968fa9-kube-api-access-gqxn8\") pod \"openstack-operator-controller-init-57d974f4f8-dztqn\" (UID: \"fc407819-4730-4ca0-9ccb-dcf8ba968fa9\") " pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.833302 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqxn8\" (UniqueName: \"kubernetes.io/projected/fc407819-4730-4ca0-9ccb-dcf8ba968fa9-kube-api-access-gqxn8\") pod \"openstack-operator-controller-init-57d974f4f8-dztqn\" (UID: \"fc407819-4730-4ca0-9ccb-dcf8ba968fa9\") " pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" Mar 18 07:03:07 crc kubenswrapper[4917]: I0318 07:03:07.953170 4917 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" Mar 18 07:03:08 crc kubenswrapper[4917]: I0318 07:03:08.465243 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn"] Mar 18 07:03:08 crc kubenswrapper[4917]: I0318 07:03:08.568598 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" event={"ID":"fc407819-4730-4ca0-9ccb-dcf8ba968fa9","Type":"ContainerStarted","Data":"1259ba64b59b04e94d925c35f38c9ba480d56b47e88209982f95be1e7f93de49"} Mar 18 07:03:13 crc kubenswrapper[4917]: I0318 07:03:13.613550 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" event={"ID":"fc407819-4730-4ca0-9ccb-dcf8ba968fa9","Type":"ContainerStarted","Data":"6f3911603a5fb32f7cc15f54acd166253aea122c25b5eb8dabe2d6579ead4852"} Mar 18 07:03:13 crc kubenswrapper[4917]: I0318 07:03:13.614374 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" Mar 18 07:03:13 crc kubenswrapper[4917]: I0318 07:03:13.659608 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" podStartSLOduration=2.225098982 podStartE2EDuration="6.659559052s" podCreationTimestamp="2026-03-18 07:03:07 +0000 UTC" firstStartedPulling="2026-03-18 07:03:08.479567145 +0000 UTC m=+973.420721879" lastFinishedPulling="2026-03-18 07:03:12.914027195 +0000 UTC m=+977.855181949" observedRunningTime="2026-03-18 07:03:13.656968872 +0000 UTC m=+978.598123626" watchObservedRunningTime="2026-03-18 07:03:13.659559052 +0000 UTC m=+978.600713806" Mar 18 07:03:27 crc kubenswrapper[4917]: I0318 07:03:27.957259 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-operator-controller-init-57d974f4f8-dztqn" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.046813 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.049292 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.050696 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.051636 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.058412 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.059318 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.061613 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-cqtwn" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.061962 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-85pv2" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.069781 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4z4cf" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.072139 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.104614 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.108550 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llm8m\" (UniqueName: \"kubernetes.io/projected/f0153e44-31ff-4d0f-8592-d6171f714f97-kube-api-access-llm8m\") pod \"designate-operator-controller-manager-588d4d986b-ptnzg\" (UID: \"f0153e44-31ff-4d0f-8592-d6171f714f97\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.108622 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htqn7\" (UniqueName: \"kubernetes.io/projected/e340972f-e354-42b8-98f0-7188fe46ae69-kube-api-access-htqn7\") pod \"barbican-operator-controller-manager-59bc569d95-9xcfc\" (UID: 
\"e340972f-e354-42b8-98f0-7188fe46ae69\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.108664 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhwzv\" (UniqueName: \"kubernetes.io/projected/bcaaf338-4e05-4578-ab2c-55a4a6909f0d-kube-api-access-bhwzv\") pod \"cinder-operator-controller-manager-8d58dc466-jh9lg\" (UID: \"bcaaf338-4e05-4578-ab2c-55a4a6909f0d\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.118121 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.124184 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.124989 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.127929 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-7qkph" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.139288 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.140085 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.141750 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8s5vf" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.153723 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.179695 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.209687 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.210671 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.212008 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llm8m\" (UniqueName: \"kubernetes.io/projected/f0153e44-31ff-4d0f-8592-d6171f714f97-kube-api-access-llm8m\") pod \"designate-operator-controller-manager-588d4d986b-ptnzg\" (UID: \"f0153e44-31ff-4d0f-8592-d6171f714f97\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.212036 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htqn7\" (UniqueName: \"kubernetes.io/projected/e340972f-e354-42b8-98f0-7188fe46ae69-kube-api-access-htqn7\") pod \"barbican-operator-controller-manager-59bc569d95-9xcfc\" (UID: \"e340972f-e354-42b8-98f0-7188fe46ae69\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.212074 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vc72\" (UniqueName: \"kubernetes.io/projected/617f6235-ba96-4558-8bc4-bef2488095d2-kube-api-access-5vc72\") pod \"glance-operator-controller-manager-79df6bcc97-5d6d5\" (UID: \"617f6235-ba96-4558-8bc4-bef2488095d2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.212093 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhwzv\" (UniqueName: \"kubernetes.io/projected/bcaaf338-4e05-4578-ab2c-55a4a6909f0d-kube-api-access-bhwzv\") pod \"cinder-operator-controller-manager-8d58dc466-jh9lg\" (UID: \"bcaaf338-4e05-4578-ab2c-55a4a6909f0d\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" Mar 18 07:03:48 crc 
kubenswrapper[4917]: I0318 07:03:48.212125 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b5fc\" (UniqueName: \"kubernetes.io/projected/46d6297c-23d8-40d4-8238-c4b5c4c8669c-kube-api-access-6b5fc\") pod \"heat-operator-controller-manager-67dd5f86f5-d2748\" (UID: \"46d6297c-23d8-40d4-8238-c4b5c4c8669c\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.212476 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.213208 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.214066 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.216666 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-mr248" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.216871 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-25xq6" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.227963 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.228679 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.233371 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-pcnqp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.237738 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.258020 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llm8m\" (UniqueName: \"kubernetes.io/projected/f0153e44-31ff-4d0f-8592-d6171f714f97-kube-api-access-llm8m\") pod \"designate-operator-controller-manager-588d4d986b-ptnzg\" (UID: \"f0153e44-31ff-4d0f-8592-d6171f714f97\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.259201 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htqn7\" (UniqueName: \"kubernetes.io/projected/e340972f-e354-42b8-98f0-7188fe46ae69-kube-api-access-htqn7\") pod \"barbican-operator-controller-manager-59bc569d95-9xcfc\" (UID: \"e340972f-e354-42b8-98f0-7188fe46ae69\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.270844 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.279173 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhwzv\" (UniqueName: \"kubernetes.io/projected/bcaaf338-4e05-4578-ab2c-55a4a6909f0d-kube-api-access-bhwzv\") pod \"cinder-operator-controller-manager-8d58dc466-jh9lg\" (UID: \"bcaaf338-4e05-4578-ab2c-55a4a6909f0d\") " 
pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.300749 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.301641 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.304454 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cw9wj" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.312784 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5xx\" (UniqueName: \"kubernetes.io/projected/52be5f50-56f7-4258-863d-998730dd87b7-kube-api-access-jp5xx\") pod \"ironic-operator-controller-manager-6f787dddc9-mbcrf\" (UID: \"52be5f50-56f7-4258-863d-998730dd87b7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.312847 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vc72\" (UniqueName: \"kubernetes.io/projected/617f6235-ba96-4558-8bc4-bef2488095d2-kube-api-access-5vc72\") pod \"glance-operator-controller-manager-79df6bcc97-5d6d5\" (UID: \"617f6235-ba96-4558-8bc4-bef2488095d2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.312893 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b5fc\" (UniqueName: \"kubernetes.io/projected/46d6297c-23d8-40d4-8238-c4b5c4c8669c-kube-api-access-6b5fc\") pod \"heat-operator-controller-manager-67dd5f86f5-d2748\" (UID: 
\"46d6297c-23d8-40d4-8238-c4b5c4c8669c\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.312926 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlxns\" (UniqueName: \"kubernetes.io/projected/83b25f75-498a-4c21-9fd4-222f866b7dec-kube-api-access-nlxns\") pod \"horizon-operator-controller-manager-8464cc45fb-ghz2b\" (UID: \"83b25f75-498a-4c21-9fd4-222f866b7dec\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.312948 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.312999 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29hf4\" (UniqueName: \"kubernetes.io/projected/74a539c5-379b-44f2-ac68-c91ba6a7bafa-kube-api-access-29hf4\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.314655 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.340153 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vc72\" (UniqueName: \"kubernetes.io/projected/617f6235-ba96-4558-8bc4-bef2488095d2-kube-api-access-5vc72\") pod 
\"glance-operator-controller-manager-79df6bcc97-5d6d5\" (UID: \"617f6235-ba96-4558-8bc4-bef2488095d2\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.347654 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b5fc\" (UniqueName: \"kubernetes.io/projected/46d6297c-23d8-40d4-8238-c4b5c4c8669c-kube-api-access-6b5fc\") pod \"heat-operator-controller-manager-67dd5f86f5-d2748\" (UID: \"46d6297c-23d8-40d4-8238-c4b5c4c8669c\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.350791 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.392957 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-h845j"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.396649 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.400437 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.402100 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.402774 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.414720 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lw2f4" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.414938 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6w4jf" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.415456 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlxns\" (UniqueName: \"kubernetes.io/projected/83b25f75-498a-4c21-9fd4-222f866b7dec-kube-api-access-nlxns\") pod \"horizon-operator-controller-manager-8464cc45fb-ghz2b\" (UID: \"83b25f75-498a-4c21-9fd4-222f866b7dec\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.415484 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.415520 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ksr2\" (UniqueName: \"kubernetes.io/projected/7b8cbed9-9d0f-48dd-a497-910e2e1036ad-kube-api-access-4ksr2\") pod \"keystone-operator-controller-manager-768b96df4c-dt54q\" (UID: \"7b8cbed9-9d0f-48dd-a497-910e2e1036ad\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.415550 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-29hf4\" (UniqueName: \"kubernetes.io/projected/74a539c5-379b-44f2-ac68-c91ba6a7bafa-kube-api-access-29hf4\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.415618 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5xx\" (UniqueName: \"kubernetes.io/projected/52be5f50-56f7-4258-863d-998730dd87b7-kube-api-access-jp5xx\") pod \"ironic-operator-controller-manager-6f787dddc9-mbcrf\" (UID: \"52be5f50-56f7-4258-863d-998730dd87b7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" Mar 18 07:03:48 crc kubenswrapper[4917]: E0318 07:03:48.416152 4917 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:48 crc kubenswrapper[4917]: E0318 07:03:48.416198 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert podName:74a539c5-379b-44f2-ac68-c91ba6a7bafa nodeName:}" failed. No retries permitted until 2026-03-18 07:03:48.916183943 +0000 UTC m=+1013.857338657 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert") pod "infra-operator-controller-manager-7b9c774f96-wjfbl" (UID: "74a539c5-379b-44f2-ac68-c91ba6a7bafa") : secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.418735 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.419793 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.420740 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.423196 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-zzt2d" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.429946 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.436188 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-blj7z"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.436994 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-h845j"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.437070 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.440119 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlxns\" (UniqueName: \"kubernetes.io/projected/83b25f75-498a-4c21-9fd4-222f866b7dec-kube-api-access-nlxns\") pod \"horizon-operator-controller-manager-8464cc45fb-ghz2b\" (UID: \"83b25f75-498a-4c21-9fd4-222f866b7dec\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.441490 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5xx\" (UniqueName: \"kubernetes.io/projected/52be5f50-56f7-4258-863d-998730dd87b7-kube-api-access-jp5xx\") pod \"ironic-operator-controller-manager-6f787dddc9-mbcrf\" (UID: \"52be5f50-56f7-4258-863d-998730dd87b7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.444139 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-fx692" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.446842 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.449771 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29hf4\" (UniqueName: \"kubernetes.io/projected/74a539c5-379b-44f2-ac68-c91ba6a7bafa-kube-api-access-29hf4\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.451873 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.459609 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.472873 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-blj7z"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.473333 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.483279 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.484220 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.487669 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.488505 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.490968 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-qkq99" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.491236 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-nh6vt" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.491754 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.492635 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.498543 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-56x8z" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.498709 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.511195 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.517096 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.517228 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mkww\" (UniqueName: \"kubernetes.io/projected/155909c6-db8f-43ca-88e1-4757d8583af3-kube-api-access-6mkww\") pod \"manila-operator-controller-manager-55f864c847-h845j\" (UID: \"155909c6-db8f-43ca-88e1-4757d8583af3\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.517307 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcmxn\" (UniqueName: \"kubernetes.io/projected/02fe4313-7ce9-497e-a01d-b69b6ed0faa5-kube-api-access-mcmxn\") pod \"neutron-operator-controller-manager-767865f676-blj7z\" (UID: \"02fe4313-7ce9-497e-a01d-b69b6ed0faa5\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.517358 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx46c\" (UniqueName: \"kubernetes.io/projected/e807d4e5-b371-4e8c-ad6b-d0a3dc68e495-kube-api-access-vx46c\") pod \"mariadb-operator-controller-manager-67ccfc9778-krlq9\" (UID: \"e807d4e5-b371-4e8c-ad6b-d0a3dc68e495\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.517383 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxmzp\" (UniqueName: \"kubernetes.io/projected/b29b952f-ff1c-4734-a43b-4b836d090108-kube-api-access-dxmzp\") pod \"nova-operator-controller-manager-5d488d59fb-f8nxt\" (UID: \"b29b952f-ff1c-4734-a43b-4b836d090108\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.517465 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ksr2\" (UniqueName: \"kubernetes.io/projected/7b8cbed9-9d0f-48dd-a497-910e2e1036ad-kube-api-access-4ksr2\") pod \"keystone-operator-controller-manager-768b96df4c-dt54q\" (UID: \"7b8cbed9-9d0f-48dd-a497-910e2e1036ad\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.520806 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.523782 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-w4kqk" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.523965 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.536331 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.539473 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ksr2\" (UniqueName: \"kubernetes.io/projected/7b8cbed9-9d0f-48dd-a497-910e2e1036ad-kube-api-access-4ksr2\") pod \"keystone-operator-controller-manager-768b96df4c-dt54q\" (UID: \"7b8cbed9-9d0f-48dd-a497-910e2e1036ad\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.548644 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.570047 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.571219 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.572711 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-jw2n9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.589700 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.603635 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.603883 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.604485 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.604638 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.612569 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-m9brn" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620207 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxmzp\" (UniqueName: \"kubernetes.io/projected/b29b952f-ff1c-4734-a43b-4b836d090108-kube-api-access-dxmzp\") pod \"nova-operator-controller-manager-5d488d59fb-f8nxt\" (UID: \"b29b952f-ff1c-4734-a43b-4b836d090108\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620264 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mwqm\" (UniqueName: \"kubernetes.io/projected/d6ab874a-dbdb-4a4d-bcf0-366d44849574-kube-api-access-6mwqm\") pod \"octavia-operator-controller-manager-5b9f45d989-72hk7\" (UID: \"d6ab874a-dbdb-4a4d-bcf0-366d44849574\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620299 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdhs\" (UniqueName: \"kubernetes.io/projected/58ee89c8-9799-45bc-8851-9f0309c9fe9c-kube-api-access-kvdhs\") pod \"placement-operator-controller-manager-5784578c99-hhtxp\" (UID: \"58ee89c8-9799-45bc-8851-9f0309c9fe9c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620394 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mkww\" (UniqueName: 
\"kubernetes.io/projected/155909c6-db8f-43ca-88e1-4757d8583af3-kube-api-access-6mkww\") pod \"manila-operator-controller-manager-55f864c847-h845j\" (UID: \"155909c6-db8f-43ca-88e1-4757d8583af3\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620420 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6wtw\" (UniqueName: \"kubernetes.io/projected/a51336ad-c2bd-4c62-80f1-45fc36883ead-kube-api-access-k6wtw\") pod \"ovn-operator-controller-manager-884679f54-n4sbn\" (UID: \"a51336ad-c2bd-4c62-80f1-45fc36883ead\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620462 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28cdj\" (UniqueName: \"kubernetes.io/projected/4e8c2c8e-89ce-487f-b6f9-21c3af395094-kube-api-access-28cdj\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620495 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620532 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4dj8\" (UniqueName: 
\"kubernetes.io/projected/cf290619-453a-4fe0-802d-8a079ad9268f-kube-api-access-z4dj8\") pod \"swift-operator-controller-manager-c674c5965-6gvpd\" (UID: \"cf290619-453a-4fe0-802d-8a079ad9268f\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620558 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcmxn\" (UniqueName: \"kubernetes.io/projected/02fe4313-7ce9-497e-a01d-b69b6ed0faa5-kube-api-access-mcmxn\") pod \"neutron-operator-controller-manager-767865f676-blj7z\" (UID: \"02fe4313-7ce9-497e-a01d-b69b6ed0faa5\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.620624 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx46c\" (UniqueName: \"kubernetes.io/projected/e807d4e5-b371-4e8c-ad6b-d0a3dc68e495-kube-api-access-vx46c\") pod \"mariadb-operator-controller-manager-67ccfc9778-krlq9\" (UID: \"e807d4e5-b371-4e8c-ad6b-d0a3dc68e495\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.638480 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.667728 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mkww\" (UniqueName: \"kubernetes.io/projected/155909c6-db8f-43ca-88e1-4757d8583af3-kube-api-access-6mkww\") pod \"manila-operator-controller-manager-55f864c847-h845j\" (UID: \"155909c6-db8f-43ca-88e1-4757d8583af3\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.674567 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.675645 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.677619 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.679405 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-csvnw" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.681191 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxmzp\" (UniqueName: \"kubernetes.io/projected/b29b952f-ff1c-4734-a43b-4b836d090108-kube-api-access-dxmzp\") pod \"nova-operator-controller-manager-5d488d59fb-f8nxt\" (UID: \"b29b952f-ff1c-4734-a43b-4b836d090108\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.692415 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx46c\" (UniqueName: \"kubernetes.io/projected/e807d4e5-b371-4e8c-ad6b-d0a3dc68e495-kube-api-access-vx46c\") pod \"mariadb-operator-controller-manager-67ccfc9778-krlq9\" (UID: \"e807d4e5-b371-4e8c-ad6b-d0a3dc68e495\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.696472 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcmxn\" (UniqueName: \"kubernetes.io/projected/02fe4313-7ce9-497e-a01d-b69b6ed0faa5-kube-api-access-mcmxn\") pod \"neutron-operator-controller-manager-767865f676-blj7z\" (UID: \"02fe4313-7ce9-497e-a01d-b69b6ed0faa5\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.705130 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.721799 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-kvdhs\" (UniqueName: \"kubernetes.io/projected/58ee89c8-9799-45bc-8851-9f0309c9fe9c-kube-api-access-kvdhs\") pod \"placement-operator-controller-manager-5784578c99-hhtxp\" (UID: \"58ee89c8-9799-45bc-8851-9f0309c9fe9c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.721891 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6wtw\" (UniqueName: \"kubernetes.io/projected/a51336ad-c2bd-4c62-80f1-45fc36883ead-kube-api-access-k6wtw\") pod \"ovn-operator-controller-manager-884679f54-n4sbn\" (UID: \"a51336ad-c2bd-4c62-80f1-45fc36883ead\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.721914 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzzr6\" (UniqueName: \"kubernetes.io/projected/31f7a213-0bfe-4a21-9f97-684d862587fe-kube-api-access-gzzr6\") pod \"telemetry-operator-controller-manager-d6b694c5-fkdn9\" (UID: \"31f7a213-0bfe-4a21-9f97-684d862587fe\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.721946 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28cdj\" (UniqueName: \"kubernetes.io/projected/4e8c2c8e-89ce-487f-b6f9-21c3af395094-kube-api-access-28cdj\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.721968 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod 
\"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.721988 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx9gt\" (UniqueName: \"kubernetes.io/projected/33ac06ad-c83e-4881-85f4-c13e31f72ac9-kube-api-access-dx9gt\") pod \"test-operator-controller-manager-5c5cb9c4d7-pbdwd\" (UID: \"33ac06ad-c83e-4881-85f4-c13e31f72ac9\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.722013 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4dj8\" (UniqueName: \"kubernetes.io/projected/cf290619-453a-4fe0-802d-8a079ad9268f-kube-api-access-z4dj8\") pod \"swift-operator-controller-manager-c674c5965-6gvpd\" (UID: \"cf290619-453a-4fe0-802d-8a079ad9268f\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.722040 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mwqm\" (UniqueName: \"kubernetes.io/projected/d6ab874a-dbdb-4a4d-bcf0-366d44849574-kube-api-access-6mwqm\") pod \"octavia-operator-controller-manager-5b9f45d989-72hk7\" (UID: \"d6ab874a-dbdb-4a4d-bcf0-366d44849574\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" Mar 18 07:03:48 crc kubenswrapper[4917]: E0318 07:03:48.722373 4917 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:48 crc kubenswrapper[4917]: E0318 07:03:48.722410 4917 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert podName:4e8c2c8e-89ce-487f-b6f9-21c3af395094 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:49.222397712 +0000 UTC m=+1014.163552426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert") pod "openstack-baremetal-operator-controller-manager-c68874588-kjm55" (UID: "4e8c2c8e-89ce-487f-b6f9-21c3af395094") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.743593 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdhs\" (UniqueName: \"kubernetes.io/projected/58ee89c8-9799-45bc-8851-9f0309c9fe9c-kube-api-access-kvdhs\") pod \"placement-operator-controller-manager-5784578c99-hhtxp\" (UID: \"58ee89c8-9799-45bc-8851-9f0309c9fe9c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.743813 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6wtw\" (UniqueName: \"kubernetes.io/projected/a51336ad-c2bd-4c62-80f1-45fc36883ead-kube-api-access-k6wtw\") pod \"ovn-operator-controller-manager-884679f54-n4sbn\" (UID: \"a51336ad-c2bd-4c62-80f1-45fc36883ead\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.746169 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28cdj\" (UniqueName: \"kubernetes.io/projected/4e8c2c8e-89ce-487f-b6f9-21c3af395094-kube-api-access-28cdj\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 
07:03:48.751751 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4dj8\" (UniqueName: \"kubernetes.io/projected/cf290619-453a-4fe0-802d-8a079ad9268f-kube-api-access-z4dj8\") pod \"swift-operator-controller-manager-c674c5965-6gvpd\" (UID: \"cf290619-453a-4fe0-802d-8a079ad9268f\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.764365 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.769008 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mwqm\" (UniqueName: \"kubernetes.io/projected/d6ab874a-dbdb-4a4d-bcf0-366d44849574-kube-api-access-6mwqm\") pod \"octavia-operator-controller-manager-5b9f45d989-72hk7\" (UID: \"d6ab874a-dbdb-4a4d-bcf0-366d44849574\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.771932 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.772775 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.775496 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-4w58z" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.779464 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.799859 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.800426 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.804096 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.806122 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.806564 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-56bz7" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.806824 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.814274 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.820763 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.832298 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.832629 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzzr6\" (UniqueName: \"kubernetes.io/projected/31f7a213-0bfe-4a21-9f97-684d862587fe-kube-api-access-gzzr6\") pod \"telemetry-operator-controller-manager-d6b694c5-fkdn9\" (UID: \"31f7a213-0bfe-4a21-9f97-684d862587fe\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.835732 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx9gt\" (UniqueName: \"kubernetes.io/projected/33ac06ad-c83e-4881-85f4-c13e31f72ac9-kube-api-access-dx9gt\") pod \"test-operator-controller-manager-5c5cb9c4d7-pbdwd\" (UID: \"33ac06ad-c83e-4881-85f4-c13e31f72ac9\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.835907 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qv5s\" (UniqueName: \"kubernetes.io/projected/d6e0ddcc-f5b6-4201-8261-2f2b975a6532-kube-api-access-2qv5s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-b8bsp\" (UID: \"d6e0ddcc-f5b6-4201-8261-2f2b975a6532\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.863533 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx9gt\" (UniqueName: \"kubernetes.io/projected/33ac06ad-c83e-4881-85f4-c13e31f72ac9-kube-api-access-dx9gt\") pod \"test-operator-controller-manager-5c5cb9c4d7-pbdwd\" (UID: \"33ac06ad-c83e-4881-85f4-c13e31f72ac9\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 
07:03:48.863541 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzzr6\" (UniqueName: \"kubernetes.io/projected/31f7a213-0bfe-4a21-9f97-684d862587fe-kube-api-access-gzzr6\") pod \"telemetry-operator-controller-manager-d6b694c5-fkdn9\" (UID: \"31f7a213-0bfe-4a21-9f97-684d862587fe\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.868045 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.874792 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.875071 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.876961 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.879318 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r"] Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.881937 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-9jxhw" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.888810 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.927320 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.947959 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.948124 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jch64\" (UniqueName: \"kubernetes.io/projected/93bce59e-ffb2-4962-9e6d-79d909c26899-kube-api-access-jch64\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.948195 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.948241 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.948278 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert\") pod 
\"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:48 crc kubenswrapper[4917]: E0318 07:03:48.948390 4917 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.948422 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qv5s\" (UniqueName: \"kubernetes.io/projected/d6e0ddcc-f5b6-4201-8261-2f2b975a6532-kube-api-access-2qv5s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-b8bsp\" (UID: \"d6e0ddcc-f5b6-4201-8261-2f2b975a6532\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" Mar 18 07:03:48 crc kubenswrapper[4917]: E0318 07:03:48.948475 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert podName:74a539c5-379b-44f2-ac68-c91ba6a7bafa nodeName:}" failed. No retries permitted until 2026-03-18 07:03:49.948461075 +0000 UTC m=+1014.889615789 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert") pod "infra-operator-controller-manager-7b9c774f96-wjfbl" (UID: "74a539c5-379b-44f2-ac68-c91ba6a7bafa") : secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.948532 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw266\" (UniqueName: \"kubernetes.io/projected/b1879bdf-322b-447c-964f-18eeda782b83-kube-api-access-tw266\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pmp5r\" (UID: \"b1879bdf-322b-447c-964f-18eeda782b83\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.961806 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.970493 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qv5s\" (UniqueName: \"kubernetes.io/projected/d6e0ddcc-f5b6-4201-8261-2f2b975a6532-kube-api-access-2qv5s\") pod \"watcher-operator-controller-manager-6c4d75f7f9-b8bsp\" (UID: \"d6e0ddcc-f5b6-4201-8261-2f2b975a6532\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" Mar 18 07:03:48 crc kubenswrapper[4917]: I0318 07:03:48.987719 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.052282 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw266\" (UniqueName: \"kubernetes.io/projected/b1879bdf-322b-447c-964f-18eeda782b83-kube-api-access-tw266\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pmp5r\" (UID: \"b1879bdf-322b-447c-964f-18eeda782b83\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.052326 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jch64\" (UniqueName: \"kubernetes.io/projected/93bce59e-ffb2-4962-9e6d-79d909c26899-kube-api-access-jch64\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.052368 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.052411 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.052514 4917 
secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.052555 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:49.552542547 +0000 UTC m=+1014.493697261 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "metrics-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.053206 4917 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.053234 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:49.553225014 +0000 UTC m=+1014.494379728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "webhook-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.083334 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw266\" (UniqueName: \"kubernetes.io/projected/b1879bdf-322b-447c-964f-18eeda782b83-kube-api-access-tw266\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pmp5r\" (UID: \"b1879bdf-322b-447c-964f-18eeda782b83\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.088559 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.091937 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jch64\" (UniqueName: \"kubernetes.io/projected/93bce59e-ffb2-4962-9e6d-79d909c26899-kube-api-access-jch64\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.255854 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.256301 4917 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.256519 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert podName:4e8c2c8e-89ce-487f-b6f9-21c3af395094 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:50.256503198 +0000 UTC m=+1015.197657912 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert") pod "openstack-baremetal-operator-controller-manager-c68874588-kjm55" (UID: "4e8c2c8e-89ce-487f-b6f9-21c3af395094") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.559466 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.562325 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.562399 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.562474 4917 secret.go:188] 
Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.562529 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:50.562514203 +0000 UTC m=+1015.503668907 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "webhook-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.562549 4917 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.562628 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:50.562607915 +0000 UTC m=+1015.503762719 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "metrics-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.573959 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.599117 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.603138 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.769323 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.808388 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg"] Mar 18 07:03:49 crc kubenswrapper[4917]: W0318 07:03:49.825725 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod155909c6_db8f_43ca_88e1_4757d8583af3.slice/crio-e04b82690fae4e4caf79367a0dd403369e01a70b570911df0f225c1317758cf9 WatchSource:0}: Error finding container e04b82690fae4e4caf79367a0dd403369e01a70b570911df0f225c1317758cf9: Status 404 returned error can't find the container with id e04b82690fae4e4caf79367a0dd403369e01a70b570911df0f225c1317758cf9 Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.830700 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-767865f676-blj7z"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.837624 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-h845j"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.845891 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.851659 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.856270 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q"] Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.885676 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" event={"ID":"83b25f75-498a-4c21-9fd4-222f866b7dec","Type":"ContainerStarted","Data":"946d519175f5d4331919c512c930932b9cadd96789d998f9c1c914e0d495490b"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.887320 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" event={"ID":"bcaaf338-4e05-4578-ab2c-55a4a6909f0d","Type":"ContainerStarted","Data":"a672531c743c19549ba5c9e7ab90e95bc9fec76ea51b222ba1a7788c3351ea60"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.887984 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" event={"ID":"7b8cbed9-9d0f-48dd-a497-910e2e1036ad","Type":"ContainerStarted","Data":"fe9f0cb9138893fc4b86ebe43c0919fd6925c8a01269611b6e0ce547378950f3"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.888751 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" event={"ID":"46d6297c-23d8-40d4-8238-c4b5c4c8669c","Type":"ContainerStarted","Data":"550d4de6951df3b644b0b50c5591b2997a9451ac1ed7e230640bc2c611296746"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.889619 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" event={"ID":"52be5f50-56f7-4258-863d-998730dd87b7","Type":"ContainerStarted","Data":"66e529288fa644091349a75698c97ec2d27463d0b2d1a7162b9a24d4af6d2bfd"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.890345 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" event={"ID":"02fe4313-7ce9-497e-a01d-b69b6ed0faa5","Type":"ContainerStarted","Data":"b01253fb824f84065d6c4d73a61dd4ce45f21f717adbb8c2a43069c77b189c61"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.891524 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" event={"ID":"e340972f-e354-42b8-98f0-7188fe46ae69","Type":"ContainerStarted","Data":"37ccbe2cbaee46a8405e76920e2b728c10581d03cfdfdce99bcf6a8950ad44cf"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.892768 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" event={"ID":"f0153e44-31ff-4d0f-8592-d6171f714f97","Type":"ContainerStarted","Data":"c67a6b8420d95475e2740920ef535b7ce422bfd0c691c3644c69208ffbf52709"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.893565 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" 
event={"ID":"617f6235-ba96-4558-8bc4-bef2488095d2","Type":"ContainerStarted","Data":"38b9e4eb539651b81508ca7df9d151ae870ed13f103f706463b1784c0d0419d1"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.894548 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" event={"ID":"b29b952f-ff1c-4734-a43b-4b836d090108","Type":"ContainerStarted","Data":"f770a48b24f50ea7964a11190df62e65697d4ee8f5ffa81b3ab6711ca02d3c52"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.895495 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" event={"ID":"155909c6-db8f-43ca-88e1-4757d8583af3","Type":"ContainerStarted","Data":"e04b82690fae4e4caf79367a0dd403369e01a70b570911df0f225c1317758cf9"} Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.968758 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.969114 4917 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: E0318 07:03:49.969187 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert podName:74a539c5-379b-44f2-ac68-c91ba6a7bafa nodeName:}" failed. No retries permitted until 2026-03-18 07:03:51.969171834 +0000 UTC m=+1016.910326548 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert") pod "infra-operator-controller-manager-7b9c774f96-wjfbl" (UID: "74a539c5-379b-44f2-ac68-c91ba6a7bafa") : secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:49 crc kubenswrapper[4917]: I0318 07:03:49.991482 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp"] Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.007339 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd"] Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.008865 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kvdhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-hhtxp_openstack-operators(58ee89c8-9799-45bc-8851-9f0309c9fe9c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.010032 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" podUID="58ee89c8-9799-45bc-8851-9f0309c9fe9c" Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.015032 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn"] Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.017956 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k6wtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-n4sbn_openstack-operators(a51336ad-c2bd-4c62-80f1-45fc36883ead): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.019728 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" podUID="a51336ad-c2bd-4c62-80f1-45fc36883ead" Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.047419 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9"] Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.048077 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vx46c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-krlq9_openstack-operators(e807d4e5-b371-4e8c-ad6b-d0a3dc68e495): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.049256 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" podUID="e807d4e5-b371-4e8c-ad6b-d0a3dc68e495" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.053102 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2qv5s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-b8bsp_openstack-operators(d6e0ddcc-f5b6-4201-8261-2f2b975a6532): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.055783 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" podUID="d6e0ddcc-f5b6-4201-8261-2f2b975a6532" Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.057576 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp"] Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.070740 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7"] Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.075648 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r"] Mar 18 07:03:50 crc kubenswrapper[4917]: W0318 07:03:50.077051 4917 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31f7a213_0bfe_4a21_9f97_684d862587fe.slice/crio-f6d4856e1c6283d3cc95ca7c839d4c15af02bb8ec473784dc560a3a87b60eeae WatchSource:0}: Error finding container f6d4856e1c6283d3cc95ca7c839d4c15af02bb8ec473784dc560a3a87b60eeae: Status 404 returned error can't find the container with id f6d4856e1c6283d3cc95ca7c839d4c15af02bb8ec473784dc560a3a87b60eeae Mar 18 07:03:50 crc kubenswrapper[4917]: W0318 07:03:50.077815 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1879bdf_322b_447c_964f_18eeda782b83.slice/crio-44ab8d926d6f429198c203ea5185cc5d1ad634f89cd3add25badc207c3030186 WatchSource:0}: Error finding container 44ab8d926d6f429198c203ea5185cc5d1ad634f89cd3add25badc207c3030186: Status 404 returned error can't find the container with id 44ab8d926d6f429198c203ea5185cc5d1ad634f89cd3add25badc207c3030186 Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.079260 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gzzr6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-fkdn9_openstack-operators(31f7a213-0bfe-4a21-9f97-684d862587fe): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.080123 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tw266,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pmp5r_openstack-operators(b1879bdf-322b-447c-964f-18eeda782b83): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.080616 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9"] Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.080695 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" podUID="31f7a213-0bfe-4a21-9f97-684d862587fe" Mar 18 07:03:50 crc kubenswrapper[4917]: W0318 07:03:50.081201 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6ab874a_dbdb_4a4d_bcf0_366d44849574.slice/crio-37735e61f35652f70545022cf38075e9b75ab437ba9d5abdd745cb4483e9fa8b WatchSource:0}: Error finding container 37735e61f35652f70545022cf38075e9b75ab437ba9d5abdd745cb4483e9fa8b: Status 404 returned error can't find the container with id 37735e61f35652f70545022cf38075e9b75ab437ba9d5abdd745cb4483e9fa8b Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.081242 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" podUID="b1879bdf-322b-447c-964f-18eeda782b83" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.088160 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6mwqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-72hk7_openstack-operators(d6ab874a-dbdb-4a4d-bcf0-366d44849574): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.089454 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" podUID="d6ab874a-dbdb-4a4d-bcf0-366d44849574" Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.222225 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd"] Mar 18 07:03:50 crc kubenswrapper[4917]: W0318 07:03:50.238785 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ac06ad_c83e_4881_85f4_c13e31f72ac9.slice/crio-beee70d492bc93a3b2e21ab3a3c46c2fd43d30367714a891e8afc8a04e69ad17 WatchSource:0}: Error finding container beee70d492bc93a3b2e21ab3a3c46c2fd43d30367714a891e8afc8a04e69ad17: Status 404 returned error can't find the container with id 
beee70d492bc93a3b2e21ab3a3c46c2fd43d30367714a891e8afc8a04e69ad17 Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.276375 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.277145 4917 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.277230 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert podName:4e8c2c8e-89ce-487f-b6f9-21c3af395094 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:52.277208327 +0000 UTC m=+1017.218363041 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert") pod "openstack-baremetal-operator-controller-manager-c68874588-kjm55" (UID: "4e8c2c8e-89ce-487f-b6f9-21c3af395094") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.580007 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.580350 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.580522 4917 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.580553 4917 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.580570 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:52.580556046 +0000 UTC m=+1017.521710760 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "metrics-server-cert" not found Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.580732 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:52.580692569 +0000 UTC m=+1017.521847283 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "webhook-server-cert" not found Mar 18 07:03:50 crc kubenswrapper[4917]: I0318 07:03:50.979877 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" event={"ID":"a51336ad-c2bd-4c62-80f1-45fc36883ead","Type":"ContainerStarted","Data":"37a8c75b5941700f60cb7c6c5e0a1810850c5f509ad314b6f7ccd4b08e5ed572"} Mar 18 07:03:50 crc kubenswrapper[4917]: E0318 07:03:50.992838 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" podUID="a51336ad-c2bd-4c62-80f1-45fc36883ead" Mar 18 07:03:51 crc kubenswrapper[4917]: I0318 07:03:51.009323 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" 
event={"ID":"33ac06ad-c83e-4881-85f4-c13e31f72ac9","Type":"ContainerStarted","Data":"beee70d492bc93a3b2e21ab3a3c46c2fd43d30367714a891e8afc8a04e69ad17"} Mar 18 07:03:51 crc kubenswrapper[4917]: I0318 07:03:51.012309 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" event={"ID":"58ee89c8-9799-45bc-8851-9f0309c9fe9c","Type":"ContainerStarted","Data":"0f7edce0ed29c4a2354703c7811ac70bb686641655c1360f561d9fa9171c0b23"} Mar 18 07:03:51 crc kubenswrapper[4917]: E0318 07:03:51.013929 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" podUID="58ee89c8-9799-45bc-8851-9f0309c9fe9c" Mar 18 07:03:51 crc kubenswrapper[4917]: I0318 07:03:51.016239 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" event={"ID":"cf290619-453a-4fe0-802d-8a079ad9268f","Type":"ContainerStarted","Data":"09655bc6c40d0bcdb944ca44d25621bd5fd7dde3255ca8bcf5126dc50915f1d0"} Mar 18 07:03:51 crc kubenswrapper[4917]: I0318 07:03:51.031425 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" event={"ID":"31f7a213-0bfe-4a21-9f97-684d862587fe","Type":"ContainerStarted","Data":"f6d4856e1c6283d3cc95ca7c839d4c15af02bb8ec473784dc560a3a87b60eeae"} Mar 18 07:03:51 crc kubenswrapper[4917]: E0318 07:03:51.033080 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" podUID="31f7a213-0bfe-4a21-9f97-684d862587fe" Mar 18 07:03:51 crc kubenswrapper[4917]: I0318 07:03:51.034005 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" event={"ID":"d6e0ddcc-f5b6-4201-8261-2f2b975a6532","Type":"ContainerStarted","Data":"58657fcc113587eb1d70fc33fa0536ada4ec5b8342afa3b1f646c3724d9bb52f"} Mar 18 07:03:51 crc kubenswrapper[4917]: E0318 07:03:51.035212 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" podUID="d6e0ddcc-f5b6-4201-8261-2f2b975a6532" Mar 18 07:03:51 crc kubenswrapper[4917]: I0318 07:03:51.063910 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" event={"ID":"e807d4e5-b371-4e8c-ad6b-d0a3dc68e495","Type":"ContainerStarted","Data":"2ec6576ebc2be64eca787e72e1108725bb77fe83f0c8bd7f6a5961495a6b00f6"} Mar 18 07:03:51 crc kubenswrapper[4917]: E0318 07:03:51.065307 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" podUID="e807d4e5-b371-4e8c-ad6b-d0a3dc68e495" Mar 18 07:03:51 crc kubenswrapper[4917]: I0318 07:03:51.068005 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" event={"ID":"b1879bdf-322b-447c-964f-18eeda782b83","Type":"ContainerStarted","Data":"44ab8d926d6f429198c203ea5185cc5d1ad634f89cd3add25badc207c3030186"} Mar 18 07:03:51 crc kubenswrapper[4917]: E0318 07:03:51.069842 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" podUID="b1879bdf-322b-447c-964f-18eeda782b83" Mar 18 07:03:51 crc kubenswrapper[4917]: I0318 07:03:51.070534 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" event={"ID":"d6ab874a-dbdb-4a4d-bcf0-366d44849574","Type":"ContainerStarted","Data":"37735e61f35652f70545022cf38075e9b75ab437ba9d5abdd745cb4483e9fa8b"} Mar 18 07:03:51 crc kubenswrapper[4917]: E0318 07:03:51.072507 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" podUID="d6ab874a-dbdb-4a4d-bcf0-366d44849574" Mar 18 07:03:52 crc kubenswrapper[4917]: I0318 07:03:52.013207 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:52 crc 
kubenswrapper[4917]: E0318 07:03:52.013369 4917 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.013429 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert podName:74a539c5-379b-44f2-ac68-c91ba6a7bafa nodeName:}" failed. No retries permitted until 2026-03-18 07:03:56.013411291 +0000 UTC m=+1020.954566005 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert") pod "infra-operator-controller-manager-7b9c774f96-wjfbl" (UID: "74a539c5-379b-44f2-ac68-c91ba6a7bafa") : secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.084672 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" podUID="d6e0ddcc-f5b6-4201-8261-2f2b975a6532" Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.084973 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" podUID="e807d4e5-b371-4e8c-ad6b-d0a3dc68e495" Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.085014 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" podUID="d6ab874a-dbdb-4a4d-bcf0-366d44849574" Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.085068 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" podUID="a51336ad-c2bd-4c62-80f1-45fc36883ead" Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.085068 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" podUID="b1879bdf-322b-447c-964f-18eeda782b83" Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.085122 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" podUID="31f7a213-0bfe-4a21-9f97-684d862587fe" Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.085174 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" podUID="58ee89c8-9799-45bc-8851-9f0309c9fe9c" Mar 18 07:03:52 crc kubenswrapper[4917]: I0318 07:03:52.322290 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.322448 4917 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.322491 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert podName:4e8c2c8e-89ce-487f-b6f9-21c3af395094 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:56.3224788 +0000 UTC m=+1021.263633514 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert") pod "openstack-baremetal-operator-controller-manager-c68874588-kjm55" (UID: "4e8c2c8e-89ce-487f-b6f9-21c3af395094") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:52 crc kubenswrapper[4917]: I0318 07:03:52.625727 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:52 crc kubenswrapper[4917]: I0318 07:03:52.626012 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.625914 4917 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.626121 4917 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.626147 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:56.626127916 +0000 UTC m=+1021.567282630 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "webhook-server-cert" not found Mar 18 07:03:52 crc kubenswrapper[4917]: E0318 07:03:52.626170 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:03:56.626156246 +0000 UTC m=+1021.567310960 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "metrics-server-cert" not found Mar 18 07:03:56 crc kubenswrapper[4917]: I0318 07:03:56.085714 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:03:56 crc kubenswrapper[4917]: E0318 07:03:56.086839 4917 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:56 crc kubenswrapper[4917]: E0318 07:03:56.087091 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert podName:74a539c5-379b-44f2-ac68-c91ba6a7bafa nodeName:}" failed. No retries permitted until 2026-03-18 07:04:04.087072282 +0000 UTC m=+1029.028226996 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert") pod "infra-operator-controller-manager-7b9c774f96-wjfbl" (UID: "74a539c5-379b-44f2-ac68-c91ba6a7bafa") : secret "infra-operator-webhook-server-cert" not found Mar 18 07:03:56 crc kubenswrapper[4917]: I0318 07:03:56.391250 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:03:56 crc kubenswrapper[4917]: E0318 07:03:56.391412 4917 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:56 crc kubenswrapper[4917]: E0318 07:03:56.391486 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert podName:4e8c2c8e-89ce-487f-b6f9-21c3af395094 nodeName:}" failed. No retries permitted until 2026-03-18 07:04:04.391470666 +0000 UTC m=+1029.332625380 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert") pod "openstack-baremetal-operator-controller-manager-c68874588-kjm55" (UID: "4e8c2c8e-89ce-487f-b6f9-21c3af395094") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:03:56 crc kubenswrapper[4917]: I0318 07:03:56.695745 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:56 crc kubenswrapper[4917]: I0318 07:03:56.695935 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:03:56 crc kubenswrapper[4917]: E0318 07:03:56.696014 4917 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 07:03:56 crc kubenswrapper[4917]: E0318 07:03:56.696180 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:04:04.696157327 +0000 UTC m=+1029.637312031 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "metrics-server-cert" not found Mar 18 07:03:56 crc kubenswrapper[4917]: E0318 07:03:56.696073 4917 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 07:03:56 crc kubenswrapper[4917]: E0318 07:03:56.696260 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:04:04.696242139 +0000 UTC m=+1029.637396863 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "webhook-server-cert" not found Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.127360 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563624-ws22n"] Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.128547 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563624-ws22n" Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.130717 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.131271 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.138971 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563624-ws22n"] Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.139029 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.248302 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdl7h\" (UniqueName: \"kubernetes.io/projected/ff1cc3bc-d171-4a57-9eba-cfc6da82fe52-kube-api-access-vdl7h\") pod \"auto-csr-approver-29563624-ws22n\" (UID: \"ff1cc3bc-d171-4a57-9eba-cfc6da82fe52\") " pod="openshift-infra/auto-csr-approver-29563624-ws22n" Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.349932 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdl7h\" (UniqueName: \"kubernetes.io/projected/ff1cc3bc-d171-4a57-9eba-cfc6da82fe52-kube-api-access-vdl7h\") pod \"auto-csr-approver-29563624-ws22n\" (UID: \"ff1cc3bc-d171-4a57-9eba-cfc6da82fe52\") " pod="openshift-infra/auto-csr-approver-29563624-ws22n" Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.367122 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdl7h\" (UniqueName: \"kubernetes.io/projected/ff1cc3bc-d171-4a57-9eba-cfc6da82fe52-kube-api-access-vdl7h\") pod \"auto-csr-approver-29563624-ws22n\" (UID: \"ff1cc3bc-d171-4a57-9eba-cfc6da82fe52\") " 
pod="openshift-infra/auto-csr-approver-29563624-ws22n" Mar 18 07:04:00 crc kubenswrapper[4917]: I0318 07:04:00.449783 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563624-ws22n" Mar 18 07:04:02 crc kubenswrapper[4917]: E0318 07:04:02.241465 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 18 07:04:02 crc kubenswrapper[4917]: E0318 07:04:02.242011 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6b5fc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-d2748_openstack-operators(46d6297c-23d8-40d4-8238-c4b5c4c8669c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:04:02 crc kubenswrapper[4917]: E0318 07:04:02.243630 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" podUID="46d6297c-23d8-40d4-8238-c4b5c4c8669c" Mar 18 07:04:02 crc kubenswrapper[4917]: E0318 07:04:02.869784 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8" Mar 18 07:04:02 crc kubenswrapper[4917]: E0318 07:04:02.869949 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jp5xx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-mbcrf_openstack-operators(52be5f50-56f7-4258-863d-998730dd87b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:04:02 crc kubenswrapper[4917]: E0318 07:04:02.871794 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" podUID="52be5f50-56f7-4258-863d-998730dd87b7" Mar 18 07:04:03 crc kubenswrapper[4917]: E0318 07:04:03.182906 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" podUID="46d6297c-23d8-40d4-8238-c4b5c4c8669c" Mar 18 07:04:03 crc kubenswrapper[4917]: E0318 07:04:03.183015 4917 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" podUID="52be5f50-56f7-4258-863d-998730dd87b7" Mar 18 07:04:03 crc kubenswrapper[4917]: E0318 07:04:03.557550 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 18 07:04:03 crc kubenswrapper[4917]: E0318 07:04:03.557745 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dx9gt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-pbdwd_openstack-operators(33ac06ad-c83e-4881-85f4-c13e31f72ac9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:04:03 crc kubenswrapper[4917]: E0318 07:04:03.559309 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" podUID="33ac06ad-c83e-4881-85f4-c13e31f72ac9" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.098844 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.098999 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5vc72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-5d6d5_openstack-operators(617f6235-ba96-4558-8bc4-bef2488095d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.100144 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" podUID="617f6235-ba96-4558-8bc4-bef2488095d2" Mar 18 07:04:04 crc kubenswrapper[4917]: I0318 07:04:04.106221 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:04:04 crc kubenswrapper[4917]: I0318 07:04:04.129000 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/74a539c5-379b-44f2-ac68-c91ba6a7bafa-cert\") pod \"infra-operator-controller-manager-7b9c774f96-wjfbl\" (UID: \"74a539c5-379b-44f2-ac68-c91ba6a7bafa\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:04:04 crc kubenswrapper[4917]: I0318 07:04:04.145443 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-25xq6" Mar 18 07:04:04 crc kubenswrapper[4917]: I0318 07:04:04.154280 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.185041 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" podUID="33ac06ad-c83e-4881-85f4-c13e31f72ac9" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.185041 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" podUID="617f6235-ba96-4558-8bc4-bef2488095d2" Mar 18 07:04:04 crc kubenswrapper[4917]: I0318 07:04:04.413485 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.413673 4917 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.413727 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert podName:4e8c2c8e-89ce-487f-b6f9-21c3af395094 nodeName:}" failed. No retries permitted until 2026-03-18 07:04:20.41370912 +0000 UTC m=+1045.354863834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert") pod "openstack-baremetal-operator-controller-manager-c68874588-kjm55" (UID: "4e8c2c8e-89ce-487f-b6f9-21c3af395094") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 18 07:04:04 crc kubenswrapper[4917]: I0318 07:04:04.721219 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:04 crc kubenswrapper[4917]: I0318 07:04:04.721274 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.721651 4917 secret.go:188] Couldn't get 
secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.721700 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:04:20.721686372 +0000 UTC m=+1045.662841076 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "metrics-server-cert" not found Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.722689 4917 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.722722 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs podName:93bce59e-ffb2-4962-9e6d-79d909c26899 nodeName:}" failed. No retries permitted until 2026-03-18 07:04:20.722714346 +0000 UTC m=+1045.663869060 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs") pod "openstack-operator-controller-manager-5b7b5fbd97-fbnbz" (UID: "93bce59e-ffb2-4962-9e6d-79d909c26899") : secret "webhook-server-cert" not found Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.816056 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.816234 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4ksr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-dt54q_openstack-operators(7b8cbed9-9d0f-48dd-a497-910e2e1036ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:04:04 crc kubenswrapper[4917]: E0318 07:04:04.847908 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" podUID="7b8cbed9-9d0f-48dd-a497-910e2e1036ad" Mar 18 07:04:05 crc kubenswrapper[4917]: E0318 07:04:05.189308 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" podUID="7b8cbed9-9d0f-48dd-a497-910e2e1036ad" Mar 18 07:04:05 crc kubenswrapper[4917]: E0318 07:04:05.513131 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 18 07:04:05 crc kubenswrapper[4917]: E0318 07:04:05.513265 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dxmzp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-f8nxt_openstack-operators(b29b952f-ff1c-4734-a43b-4b836d090108): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:04:05 crc kubenswrapper[4917]: E0318 07:04:05.514438 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" podUID="b29b952f-ff1c-4734-a43b-4b836d090108" Mar 18 07:04:06 crc kubenswrapper[4917]: E0318 07:04:06.196111 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" podUID="b29b952f-ff1c-4734-a43b-4b836d090108" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.012411 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563624-ws22n"] Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.127551 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl"] Mar 18 07:04:08 crc kubenswrapper[4917]: W0318 07:04:08.139890 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74a539c5_379b_44f2_ac68_c91ba6a7bafa.slice/crio-4a5e6ff5bc7509af215fbafc1d619ddda9fc8f1198b1ca2795daf2ec78e6d6c3 WatchSource:0}: Error finding container 4a5e6ff5bc7509af215fbafc1d619ddda9fc8f1198b1ca2795daf2ec78e6d6c3: Status 404 returned error can't find the container with id 4a5e6ff5bc7509af215fbafc1d619ddda9fc8f1198b1ca2795daf2ec78e6d6c3 Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.214405 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" event={"ID":"a51336ad-c2bd-4c62-80f1-45fc36883ead","Type":"ContainerStarted","Data":"a751999406caed0d2063875b5028316ffaef9ded49046c1d359e0354b48cd550"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.215255 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.222029 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563624-ws22n" 
event={"ID":"ff1cc3bc-d171-4a57-9eba-cfc6da82fe52","Type":"ContainerStarted","Data":"ee855327346f15a728524c1412bf29f7a05c70fc78b502b8594e36b56d1bbd0e"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.223379 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" event={"ID":"155909c6-db8f-43ca-88e1-4757d8583af3","Type":"ContainerStarted","Data":"27f247cc5d14dbdb047cf3578d35ecc68bb8a2e1d92b973d55555913139b6729"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.224048 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.225406 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" event={"ID":"e807d4e5-b371-4e8c-ad6b-d0a3dc68e495","Type":"ContainerStarted","Data":"cef89bf0814ec331dd9b49ef4660c12017cd1b3ad0614b9cbf7c347f03db2be0"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.225762 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.228575 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" event={"ID":"d6ab874a-dbdb-4a4d-bcf0-366d44849574","Type":"ContainerStarted","Data":"756f4941dba4dcb4a1640d6c31013b4871d068663d4e8d9cd46bb29668ed2fa9"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.228912 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.243304 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" event={"ID":"e340972f-e354-42b8-98f0-7188fe46ae69","Type":"ContainerStarted","Data":"5be3c663c44b1e9ca57c38f0f0af057f13853b1dd3c4716c190e6582e525673f"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.243551 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.250322 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" event={"ID":"cf290619-453a-4fe0-802d-8a079ad9268f","Type":"ContainerStarted","Data":"ed904a619704e84081ca548da4979e4c9f73a1b4a08f62e7271a1e8d3b1cdd58"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.250446 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.254122 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" event={"ID":"74a539c5-379b-44f2-ac68-c91ba6a7bafa","Type":"ContainerStarted","Data":"4a5e6ff5bc7509af215fbafc1d619ddda9fc8f1198b1ca2795daf2ec78e6d6c3"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.260482 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" event={"ID":"f0153e44-31ff-4d0f-8592-d6171f714f97","Type":"ContainerStarted","Data":"258b9ec1e2a3a665c8f8025d54dceae7d7473c50c4446c5b136e56b6cf75aef9"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.260789 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.263715 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" podStartSLOduration=2.516402673 podStartE2EDuration="20.263699925s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:50.017805647 +0000 UTC m=+1014.958960361" lastFinishedPulling="2026-03-18 07:04:07.765102899 +0000 UTC m=+1032.706257613" observedRunningTime="2026-03-18 07:04:08.241928112 +0000 UTC m=+1033.183082826" watchObservedRunningTime="2026-03-18 07:04:08.263699925 +0000 UTC m=+1033.204854639" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.263901 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" podStartSLOduration=4.579800719 podStartE2EDuration="20.26389727s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.828046823 +0000 UTC m=+1014.769201537" lastFinishedPulling="2026-03-18 07:04:05.512143374 +0000 UTC m=+1030.453298088" observedRunningTime="2026-03-18 07:04:08.262118206 +0000 UTC m=+1033.203272920" watchObservedRunningTime="2026-03-18 07:04:08.26389727 +0000 UTC m=+1033.205051994" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.265079 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" event={"ID":"02fe4313-7ce9-497e-a01d-b69b6ed0faa5","Type":"ContainerStarted","Data":"647d7ef377a5c78d6b583ad10d75f973cfc4e31c524a0d60efe0634cbdda1f99"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.265697 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.277080 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" event={"ID":"83b25f75-498a-4c21-9fd4-222f866b7dec","Type":"ContainerStarted","Data":"6b628a946d916dd2e270c819feeee5b857b3043120d8229aba3fb831428ddc6b"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.277167 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.281076 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" event={"ID":"bcaaf338-4e05-4578-ab2c-55a4a6909f0d","Type":"ContainerStarted","Data":"4b27f317fd6424282daf81ecb9669a4e35969fefcfb3a9c6921e3698d3aa7349"} Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.281304 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.296852 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" podStartSLOduration=2.690472361 podStartE2EDuration="20.296832328s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:50.088055869 +0000 UTC m=+1015.029210583" lastFinishedPulling="2026-03-18 07:04:07.694415836 +0000 UTC m=+1032.635570550" observedRunningTime="2026-03-18 07:04:08.289622751 +0000 UTC m=+1033.230777465" watchObservedRunningTime="2026-03-18 07:04:08.296832328 +0000 UTC m=+1033.237987042" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.314127 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" podStartSLOduration=2.667608349 podStartE2EDuration="20.314110291s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" 
firstStartedPulling="2026-03-18 07:03:50.047903354 +0000 UTC m=+1014.989058068" lastFinishedPulling="2026-03-18 07:04:07.694405296 +0000 UTC m=+1032.635560010" observedRunningTime="2026-03-18 07:04:08.31240644 +0000 UTC m=+1033.253561154" watchObservedRunningTime="2026-03-18 07:04:08.314110291 +0000 UTC m=+1033.255265005" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.353903 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" podStartSLOduration=3.4354347179999998 podStartE2EDuration="20.353884957s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:50.008704813 +0000 UTC m=+1014.949859547" lastFinishedPulling="2026-03-18 07:04:06.927155062 +0000 UTC m=+1031.868309786" observedRunningTime="2026-03-18 07:04:08.349865488 +0000 UTC m=+1033.291020202" watchObservedRunningTime="2026-03-18 07:04:08.353884957 +0000 UTC m=+1033.295039691" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.365772 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" podStartSLOduration=4.651245101 podStartE2EDuration="20.365754298s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.79734224 +0000 UTC m=+1014.738496954" lastFinishedPulling="2026-03-18 07:04:05.511851437 +0000 UTC m=+1030.453006151" observedRunningTime="2026-03-18 07:04:08.364991349 +0000 UTC m=+1033.306146063" watchObservedRunningTime="2026-03-18 07:04:08.365754298 +0000 UTC m=+1033.306909012" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.390743 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" podStartSLOduration=4.451665347 podStartE2EDuration="20.39072814s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" 
firstStartedPulling="2026-03-18 07:03:49.576490975 +0000 UTC m=+1014.517645689" lastFinishedPulling="2026-03-18 07:04:05.515553768 +0000 UTC m=+1030.456708482" observedRunningTime="2026-03-18 07:04:08.386969788 +0000 UTC m=+1033.328124502" watchObservedRunningTime="2026-03-18 07:04:08.39072814 +0000 UTC m=+1033.331882854" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.416569 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" podStartSLOduration=3.315290272 podStartE2EDuration="20.416553033s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.825685086 +0000 UTC m=+1014.766839800" lastFinishedPulling="2026-03-18 07:04:06.926947847 +0000 UTC m=+1031.868102561" observedRunningTime="2026-03-18 07:04:08.413722524 +0000 UTC m=+1033.354877268" watchObservedRunningTime="2026-03-18 07:04:08.416553033 +0000 UTC m=+1033.357707747" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.446877 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" podStartSLOduration=4.773251912 podStartE2EDuration="20.446858196s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.842841176 +0000 UTC m=+1014.783995900" lastFinishedPulling="2026-03-18 07:04:05.51644747 +0000 UTC m=+1030.457602184" observedRunningTime="2026-03-18 07:04:08.445833601 +0000 UTC m=+1033.386988335" watchObservedRunningTime="2026-03-18 07:04:08.446858196 +0000 UTC m=+1033.388012910" Mar 18 07:04:08 crc kubenswrapper[4917]: I0318 07:04:08.479268 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" podStartSLOduration=4.583515851 podStartE2EDuration="20.479252401s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" 
firstStartedPulling="2026-03-18 07:03:49.619773967 +0000 UTC m=+1014.560928681" lastFinishedPulling="2026-03-18 07:04:05.515510517 +0000 UTC m=+1030.456665231" observedRunningTime="2026-03-18 07:04:08.4759659 +0000 UTC m=+1033.417120614" watchObservedRunningTime="2026-03-18 07:04:08.479252401 +0000 UTC m=+1033.420407115" Mar 18 07:04:10 crc kubenswrapper[4917]: I0318 07:04:10.294124 4917 generic.go:334] "Generic (PLEG): container finished" podID="ff1cc3bc-d171-4a57-9eba-cfc6da82fe52" containerID="519a8823772650121fd4e0d821d4540e3c6794d038c69e58bbdf20d9315eb955" exitCode=0 Mar 18 07:04:10 crc kubenswrapper[4917]: I0318 07:04:10.295352 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563624-ws22n" event={"ID":"ff1cc3bc-d171-4a57-9eba-cfc6da82fe52","Type":"ContainerDied","Data":"519a8823772650121fd4e0d821d4540e3c6794d038c69e58bbdf20d9315eb955"} Mar 18 07:04:16 crc kubenswrapper[4917]: I0318 07:04:16.313924 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563624-ws22n" Mar 18 07:04:16 crc kubenswrapper[4917]: I0318 07:04:16.408864 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563624-ws22n" event={"ID":"ff1cc3bc-d171-4a57-9eba-cfc6da82fe52","Type":"ContainerDied","Data":"ee855327346f15a728524c1412bf29f7a05c70fc78b502b8594e36b56d1bbd0e"} Mar 18 07:04:16 crc kubenswrapper[4917]: I0318 07:04:16.409364 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee855327346f15a728524c1412bf29f7a05c70fc78b502b8594e36b56d1bbd0e" Mar 18 07:04:16 crc kubenswrapper[4917]: I0318 07:04:16.409520 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563624-ws22n" Mar 18 07:04:16 crc kubenswrapper[4917]: I0318 07:04:16.459212 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdl7h\" (UniqueName: \"kubernetes.io/projected/ff1cc3bc-d171-4a57-9eba-cfc6da82fe52-kube-api-access-vdl7h\") pod \"ff1cc3bc-d171-4a57-9eba-cfc6da82fe52\" (UID: \"ff1cc3bc-d171-4a57-9eba-cfc6da82fe52\") " Mar 18 07:04:16 crc kubenswrapper[4917]: I0318 07:04:16.488789 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1cc3bc-d171-4a57-9eba-cfc6da82fe52-kube-api-access-vdl7h" (OuterVolumeSpecName: "kube-api-access-vdl7h") pod "ff1cc3bc-d171-4a57-9eba-cfc6da82fe52" (UID: "ff1cc3bc-d171-4a57-9eba-cfc6da82fe52"). InnerVolumeSpecName "kube-api-access-vdl7h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:04:16 crc kubenswrapper[4917]: I0318 07:04:16.561262 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdl7h\" (UniqueName: \"kubernetes.io/projected/ff1cc3bc-d171-4a57-9eba-cfc6da82fe52-kube-api-access-vdl7h\") on node \"crc\" DevicePath \"\"" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.421271 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563618-78xd9"] Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.439358 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563618-78xd9"] Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.472202 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" event={"ID":"b1879bdf-322b-447c-964f-18eeda782b83","Type":"ContainerStarted","Data":"fbeb4fd4e177915a0d8813bf8048527b151ba5f98ef5201cf575b9479595c167"} Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.485326 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" event={"ID":"58ee89c8-9799-45bc-8851-9f0309c9fe9c","Type":"ContainerStarted","Data":"4e132c255004718b2ea4487cc71b1ca80aee8b4858b10ba61df56ba70a977bce"} Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.487757 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.493017 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pmp5r" podStartSLOduration=2.590452156 podStartE2EDuration="29.49299887s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:50.079980991 +0000 UTC m=+1015.021135705" lastFinishedPulling="2026-03-18 07:04:16.982527695 +0000 UTC m=+1041.923682419" observedRunningTime="2026-03-18 07:04:17.489974234 +0000 UTC m=+1042.431128938" watchObservedRunningTime="2026-03-18 07:04:17.49299887 +0000 UTC m=+1042.434153574" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.499963 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" event={"ID":"74a539c5-379b-44f2-ac68-c91ba6a7bafa","Type":"ContainerStarted","Data":"531668ea04f1e66207cf242150c10bb61d02432397dad41dd175e80920932cb0"} Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.500127 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.504939 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" 
event={"ID":"d6e0ddcc-f5b6-4201-8261-2f2b975a6532","Type":"ContainerStarted","Data":"76adf0319bacb4a5ea458fc75ba431a815dd1f5499ab22ebbdde4e7dd2508505"} Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.505407 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.507809 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" event={"ID":"31f7a213-0bfe-4a21-9f97-684d862587fe","Type":"ContainerStarted","Data":"def7c9ca84420e38c6dba7c5539d42caa2ff2f0c0929eefd9ed219efebc4efca"} Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.508152 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.510916 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" event={"ID":"7b8cbed9-9d0f-48dd-a497-910e2e1036ad","Type":"ContainerStarted","Data":"190ba12560b5a3d7a1b2f4df9325b6ebe4490f9fcd663109cc5ec9cb8a3bd6fd"} Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.511214 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.516689 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" podStartSLOduration=2.616763866 podStartE2EDuration="29.516670199s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:50.008741834 +0000 UTC m=+1014.949896548" lastFinishedPulling="2026-03-18 07:04:16.908648157 +0000 UTC m=+1041.849802881" 
observedRunningTime="2026-03-18 07:04:17.509297846 +0000 UTC m=+1042.450452560" watchObservedRunningTime="2026-03-18 07:04:17.516670199 +0000 UTC m=+1042.457824913" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.560718 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" podStartSLOduration=20.746204254 podStartE2EDuration="29.560699674s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:04:08.148790007 +0000 UTC m=+1033.089944721" lastFinishedPulling="2026-03-18 07:04:16.963285417 +0000 UTC m=+1041.904440141" observedRunningTime="2026-03-18 07:04:17.553775972 +0000 UTC m=+1042.494930686" watchObservedRunningTime="2026-03-18 07:04:17.560699674 +0000 UTC m=+1042.501854388" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.563729 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" podStartSLOduration=2.66937943 podStartE2EDuration="29.56372222s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:50.079166141 +0000 UTC m=+1015.020320855" lastFinishedPulling="2026-03-18 07:04:16.973508921 +0000 UTC m=+1041.914663645" observedRunningTime="2026-03-18 07:04:17.528833572 +0000 UTC m=+1042.469988286" watchObservedRunningTime="2026-03-18 07:04:17.56372222 +0000 UTC m=+1042.504876934" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.583482 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" podStartSLOduration=2.733878365 podStartE2EDuration="29.583465361s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:50.052906007 +0000 UTC m=+1014.994060721" lastFinishedPulling="2026-03-18 07:04:16.902493003 +0000 UTC m=+1041.843647717" 
observedRunningTime="2026-03-18 07:04:17.579422631 +0000 UTC m=+1042.520577345" watchObservedRunningTime="2026-03-18 07:04:17.583465361 +0000 UTC m=+1042.524620075" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.602960 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" podStartSLOduration=2.266540073 podStartE2EDuration="29.602946096s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.864143838 +0000 UTC m=+1014.805298552" lastFinishedPulling="2026-03-18 07:04:17.200549861 +0000 UTC m=+1042.141704575" observedRunningTime="2026-03-18 07:04:17.601803838 +0000 UTC m=+1042.542958552" watchObservedRunningTime="2026-03-18 07:04:17.602946096 +0000 UTC m=+1042.544100810" Mar 18 07:04:17 crc kubenswrapper[4917]: I0318 07:04:17.785290 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f96a1728-b2c0-42da-b1a0-8794cd50c63c" path="/var/lib/kubelet/pods/f96a1728-b2c0-42da-b1a0-8794cd50c63c/volumes" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.398436 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-jh9lg" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.423060 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-9xcfc" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.434975 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-ptnzg" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.518086 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" 
event={"ID":"33ac06ad-c83e-4881-85f4-c13e31f72ac9","Type":"ContainerStarted","Data":"011c27b4a3dc24cb3d511798253f5e392956fd57e125e455b691ae10b5c6c190"} Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.519068 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.522547 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" event={"ID":"52be5f50-56f7-4258-863d-998730dd87b7","Type":"ContainerStarted","Data":"03cbfabcdcab4cacc2d9758105befb1bd65bb90463460e4add3ac6c92d0ee6c9"} Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.522899 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.540848 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" podStartSLOduration=3.387311988 podStartE2EDuration="30.540830487s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:50.243285036 +0000 UTC m=+1015.184439750" lastFinishedPulling="2026-03-18 07:04:17.396803525 +0000 UTC m=+1042.337958249" observedRunningTime="2026-03-18 07:04:18.537000862 +0000 UTC m=+1043.478155576" watchObservedRunningTime="2026-03-18 07:04:18.540830487 +0000 UTC m=+1043.481985201" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.561511 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" podStartSLOduration=3.085127366 podStartE2EDuration="30.561489741s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.861409031 +0000 UTC m=+1014.802563745" 
lastFinishedPulling="2026-03-18 07:04:17.337771406 +0000 UTC m=+1042.278926120" observedRunningTime="2026-03-18 07:04:18.555719578 +0000 UTC m=+1043.496874292" watchObservedRunningTime="2026-03-18 07:04:18.561489741 +0000 UTC m=+1043.502644455" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.606118 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-ghz2b" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.771136 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-krlq9" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.808691 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-h845j" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.840690 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-blj7z" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.877452 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-72hk7" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.880502 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6gvpd" Mar 18 07:04:18 crc kubenswrapper[4917]: I0318 07:04:18.891485 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-n4sbn" Mar 18 07:04:19 crc kubenswrapper[4917]: I0318 07:04:19.543167 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" 
event={"ID":"46d6297c-23d8-40d4-8238-c4b5c4c8669c","Type":"ContainerStarted","Data":"e162922246df1adba2790aa3b418ea9da4c1e0b635406ceb983e237c99c05305"} Mar 18 07:04:19 crc kubenswrapper[4917]: I0318 07:04:19.544904 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" Mar 18 07:04:19 crc kubenswrapper[4917]: I0318 07:04:19.550945 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" event={"ID":"617f6235-ba96-4558-8bc4-bef2488095d2","Type":"ContainerStarted","Data":"72f2d0760f7d61fc7846e72a982e12f6a53599be31ac0cdd1cc6800d8bedb036"} Mar 18 07:04:19 crc kubenswrapper[4917]: I0318 07:04:19.577458 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" podStartSLOduration=2.773254004 podStartE2EDuration="31.577406545s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.618822023 +0000 UTC m=+1014.559976737" lastFinishedPulling="2026-03-18 07:04:18.422974524 +0000 UTC m=+1043.364129278" observedRunningTime="2026-03-18 07:04:19.567807546 +0000 UTC m=+1044.508962340" watchObservedRunningTime="2026-03-18 07:04:19.577406545 +0000 UTC m=+1044.518561299" Mar 18 07:04:19 crc kubenswrapper[4917]: I0318 07:04:19.587919 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" podStartSLOduration=1.98131601 podStartE2EDuration="31.587900946s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.57910527 +0000 UTC m=+1014.520259984" lastFinishedPulling="2026-03-18 07:04:19.185690206 +0000 UTC m=+1044.126844920" observedRunningTime="2026-03-18 07:04:19.584951243 +0000 UTC m=+1044.526105957" watchObservedRunningTime="2026-03-18 07:04:19.587900946 
+0000 UTC m=+1044.529055660" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.415951 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.421317 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4e8c2c8e-89ce-487f-b6f9-21c3af395094-cert\") pod \"openstack-baremetal-operator-controller-manager-c68874588-kjm55\" (UID: \"4e8c2c8e-89ce-487f-b6f9-21c3af395094\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.714806 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-56x8z" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.723021 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.820687 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.823052 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.829017 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-metrics-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.829719 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/93bce59e-ffb2-4962-9e6d-79d909c26899-webhook-certs\") pod \"openstack-operator-controller-manager-5b7b5fbd97-fbnbz\" (UID: \"93bce59e-ffb2-4962-9e6d-79d909c26899\") " pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.844906 4917 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-56bz7" Mar 18 07:04:20 crc kubenswrapper[4917]: I0318 07:04:20.854826 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:21 crc kubenswrapper[4917]: I0318 07:04:21.183564 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz"] Mar 18 07:04:21 crc kubenswrapper[4917]: W0318 07:04:21.188564 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93bce59e_ffb2_4962_9e6d_79d909c26899.slice/crio-683bcf37498ff2307014b50b85c0e887104dfa9fbdf82d1b1252a0ea9a63b7f4 WatchSource:0}: Error finding container 683bcf37498ff2307014b50b85c0e887104dfa9fbdf82d1b1252a0ea9a63b7f4: Status 404 returned error can't find the container with id 683bcf37498ff2307014b50b85c0e887104dfa9fbdf82d1b1252a0ea9a63b7f4 Mar 18 07:04:21 crc kubenswrapper[4917]: I0318 07:04:21.235345 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55"] Mar 18 07:04:21 crc kubenswrapper[4917]: W0318 07:04:21.237767 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e8c2c8e_89ce_487f_b6f9_21c3af395094.slice/crio-13bc907b4dba0336a0d39e2521047efdda7fab5528a0aa5f776032f3f7db2304 WatchSource:0}: Error finding container 13bc907b4dba0336a0d39e2521047efdda7fab5528a0aa5f776032f3f7db2304: Status 404 returned error can't find the container with id 13bc907b4dba0336a0d39e2521047efdda7fab5528a0aa5f776032f3f7db2304 Mar 18 07:04:21 crc kubenswrapper[4917]: I0318 07:04:21.565662 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" event={"ID":"93bce59e-ffb2-4962-9e6d-79d909c26899","Type":"ContainerStarted","Data":"683bcf37498ff2307014b50b85c0e887104dfa9fbdf82d1b1252a0ea9a63b7f4"} Mar 18 07:04:21 crc kubenswrapper[4917]: I0318 07:04:21.567150 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" event={"ID":"4e8c2c8e-89ce-487f-b6f9-21c3af395094","Type":"ContainerStarted","Data":"13bc907b4dba0336a0d39e2521047efdda7fab5528a0aa5f776032f3f7db2304"} Mar 18 07:04:24 crc kubenswrapper[4917]: I0318 07:04:24.165196 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-wjfbl" Mar 18 07:04:26 crc kubenswrapper[4917]: I0318 07:04:26.609490 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" event={"ID":"93bce59e-ffb2-4962-9e6d-79d909c26899","Type":"ContainerStarted","Data":"0aaca93ae28f0af6db980f41c64b6e8c962c6e1dfd9c9a49ba71c1e5237c8502"} Mar 18 07:04:27 crc kubenswrapper[4917]: I0318 07:04:27.621292 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:27 crc kubenswrapper[4917]: I0318 07:04:27.666569 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" podStartSLOduration=39.666551091 podStartE2EDuration="39.666551091s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:04:27.650352358 +0000 UTC m=+1052.591507112" watchObservedRunningTime="2026-03-18 07:04:27.666551091 +0000 UTC m=+1052.607705825" Mar 18 07:04:28 crc 
kubenswrapper[4917]: I0318 07:04:28.448078 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" Mar 18 07:04:28 crc kubenswrapper[4917]: I0318 07:04:28.451761 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-5d6d5" Mar 18 07:04:28 crc kubenswrapper[4917]: I0318 07:04:28.477554 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-d2748" Mar 18 07:04:28 crc kubenswrapper[4917]: I0318 07:04:28.642515 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-mbcrf" Mar 18 07:04:28 crc kubenswrapper[4917]: I0318 07:04:28.705099 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-dt54q" Mar 18 07:04:28 crc kubenswrapper[4917]: I0318 07:04:28.931351 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-fkdn9" Mar 18 07:04:28 crc kubenswrapper[4917]: I0318 07:04:28.957895 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hhtxp" Mar 18 07:04:28 crc kubenswrapper[4917]: I0318 07:04:28.965289 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbdwd" Mar 18 07:04:28 crc kubenswrapper[4917]: I0318 07:04:28.993380 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-b8bsp" Mar 18 07:04:29 crc kubenswrapper[4917]: I0318 07:04:29.636298 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" event={"ID":"4e8c2c8e-89ce-487f-b6f9-21c3af395094","Type":"ContainerStarted","Data":"5a3715d46863d984bd073f63e92233bccd0e4eae8a562c8980468ed89a5904d2"} Mar 18 07:04:29 crc kubenswrapper[4917]: I0318 07:04:29.636739 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:04:29 crc kubenswrapper[4917]: I0318 07:04:29.639028 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" event={"ID":"b29b952f-ff1c-4734-a43b-4b836d090108","Type":"ContainerStarted","Data":"f209be9e0f54143c106c1e80acd44a5a7c9ea2f25f2ef35684d28fdf538eedcb"} Mar 18 07:04:29 crc kubenswrapper[4917]: I0318 07:04:29.639427 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" Mar 18 07:04:29 crc kubenswrapper[4917]: I0318 07:04:29.670069 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" podStartSLOduration=33.460761656 podStartE2EDuration="41.670047673s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:04:21.240019092 +0000 UTC m=+1046.181173806" lastFinishedPulling="2026-03-18 07:04:29.449305109 +0000 UTC m=+1054.390459823" observedRunningTime="2026-03-18 07:04:29.665379257 +0000 UTC m=+1054.606534041" watchObservedRunningTime="2026-03-18 07:04:29.670047673 +0000 UTC m=+1054.611202397" Mar 18 07:04:38 crc kubenswrapper[4917]: I0318 07:04:38.825294 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" Mar 18 07:04:38 crc kubenswrapper[4917]: I0318 07:04:38.855566 
4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-f8nxt" podStartSLOduration=11.314860011 podStartE2EDuration="50.855539354s" podCreationTimestamp="2026-03-18 07:03:48 +0000 UTC" firstStartedPulling="2026-03-18 07:03:49.86379129 +0000 UTC m=+1014.804946004" lastFinishedPulling="2026-03-18 07:04:29.404470633 +0000 UTC m=+1054.345625347" observedRunningTime="2026-03-18 07:04:29.689088677 +0000 UTC m=+1054.630243401" watchObservedRunningTime="2026-03-18 07:04:38.855539354 +0000 UTC m=+1063.796694138" Mar 18 07:04:40 crc kubenswrapper[4917]: I0318 07:04:40.733237 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c68874588-kjm55" Mar 18 07:04:40 crc kubenswrapper[4917]: I0318 07:04:40.863236 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5b7b5fbd97-fbnbz" Mar 18 07:04:56 crc kubenswrapper[4917]: I0318 07:04:56.612310 4917 scope.go:117] "RemoveContainer" containerID="e987278e5355ddd1c2438c79c9c14123be9ee59a9274536d40cc2fb35b569400" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.607962 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-8hnnw"] Mar 18 07:04:57 crc kubenswrapper[4917]: E0318 07:04:57.608519 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1cc3bc-d171-4a57-9eba-cfc6da82fe52" containerName="oc" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.608530 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1cc3bc-d171-4a57-9eba-cfc6da82fe52" containerName="oc" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.608688 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff1cc3bc-d171-4a57-9eba-cfc6da82fe52" containerName="oc" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 
07:04:57.609483 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.613721 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.613938 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-vdbwt" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.614110 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.614244 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.623529 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-8hnnw"] Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.656750 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64696987c5-nkrcg"] Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.657762 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.664059 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.673062 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-nkrcg"] Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.704796 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jvmq\" (UniqueName: \"kubernetes.io/projected/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-kube-api-access-8jvmq\") pod \"dnsmasq-dns-5448ff6dc7-8hnnw\" (UID: \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\") " pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.704878 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-config\") pod \"dnsmasq-dns-5448ff6dc7-8hnnw\" (UID: \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\") " pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.806092 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-dns-svc\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.806468 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jvmq\" (UniqueName: \"kubernetes.io/projected/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-kube-api-access-8jvmq\") pod \"dnsmasq-dns-5448ff6dc7-8hnnw\" (UID: \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\") " pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" 
Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.806638 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-config\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.808442 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-config\") pod \"dnsmasq-dns-5448ff6dc7-8hnnw\" (UID: \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\") " pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.808516 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-config\") pod \"dnsmasq-dns-5448ff6dc7-8hnnw\" (UID: \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\") " pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.808577 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjzfx\" (UniqueName: \"kubernetes.io/projected/fe9c96ad-e63b-4401-8bf9-a52e177193e3-kube-api-access-wjzfx\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.839117 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jvmq\" (UniqueName: \"kubernetes.io/projected/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-kube-api-access-8jvmq\") pod \"dnsmasq-dns-5448ff6dc7-8hnnw\" (UID: \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\") " pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 
07:04:57.910105 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-config\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.910225 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjzfx\" (UniqueName: \"kubernetes.io/projected/fe9c96ad-e63b-4401-8bf9-a52e177193e3-kube-api-access-wjzfx\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.910318 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-dns-svc\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.911356 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-config\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.911494 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-dns-svc\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.929012 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjzfx\" 
(UniqueName: \"kubernetes.io/projected/fe9c96ad-e63b-4401-8bf9-a52e177193e3-kube-api-access-wjzfx\") pod \"dnsmasq-dns-64696987c5-nkrcg\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.933446 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:04:57 crc kubenswrapper[4917]: I0318 07:04:57.971721 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.245467 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-nkrcg"] Mar 18 07:04:58 crc kubenswrapper[4917]: W0318 07:04:58.261783 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe9c96ad_e63b_4401_8bf9_a52e177193e3.slice/crio-39311fe05500b2330c6dd9e8389860dde90a1e05f55edbe0d5ba98f5d33e3e1f WatchSource:0}: Error finding container 39311fe05500b2330c6dd9e8389860dde90a1e05f55edbe0d5ba98f5d33e3e1f: Status 404 returned error can't find the container with id 39311fe05500b2330c6dd9e8389860dde90a1e05f55edbe0d5ba98f5d33e3e1f Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.264003 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.389184 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-8hnnw"] Mar 18 07:04:58 crc kubenswrapper[4917]: W0318 07:04:58.391679 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d8013c7_f3c9_4196_acd2_c1a08cb6426e.slice/crio-bb596f9d83bb0b50ae511078e5b8bccfedcdc99d9ff6cbee5485b40c6561e15d WatchSource:0}: Error finding container 
bb596f9d83bb0b50ae511078e5b8bccfedcdc99d9ff6cbee5485b40c6561e15d: Status 404 returned error can't find the container with id bb596f9d83bb0b50ae511078e5b8bccfedcdc99d9ff6cbee5485b40c6561e15d Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.727280 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-8hnnw"] Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.746669 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-t89fr"] Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.748081 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.757006 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-t89fr"] Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.827248 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvxn\" (UniqueName: \"kubernetes.io/projected/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-kube-api-access-hpvxn\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.827302 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.827346 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-config\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: 
\"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.915479 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" event={"ID":"6d8013c7-f3c9-4196-acd2-c1a08cb6426e","Type":"ContainerStarted","Data":"bb596f9d83bb0b50ae511078e5b8bccfedcdc99d9ff6cbee5485b40c6561e15d"} Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.916663 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-nkrcg" event={"ID":"fe9c96ad-e63b-4401-8bf9-a52e177193e3","Type":"ContainerStarted","Data":"39311fe05500b2330c6dd9e8389860dde90a1e05f55edbe0d5ba98f5d33e3e1f"} Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.928489 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-config\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.928765 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvxn\" (UniqueName: \"kubernetes.io/projected/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-kube-api-access-hpvxn\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.929060 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.929362 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-config\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.929961 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-dns-svc\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:58 crc kubenswrapper[4917]: I0318 07:04:58.951953 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvxn\" (UniqueName: \"kubernetes.io/projected/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-kube-api-access-hpvxn\") pod \"dnsmasq-dns-658f55c9f5-t89fr\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.067193 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.552548 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-t89fr"] Mar 18 07:04:59 crc kubenswrapper[4917]: W0318 07:04:59.568334 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod002ecc9e_4964_4a4e_873c_2ddaadebc6fc.slice/crio-78f60bc1ba33b08e13de2aaa4817bf8b7aa55140fa453618366b53a921ed5de0 WatchSource:0}: Error finding container 78f60bc1ba33b08e13de2aaa4817bf8b7aa55140fa453618366b53a921ed5de0: Status 404 returned error can't find the container with id 78f60bc1ba33b08e13de2aaa4817bf8b7aa55140fa453618366b53a921ed5de0 Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.620826 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-nkrcg"] Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.640610 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-b9jzc"] Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.642728 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.649050 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-b9jzc"] Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.741843 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.741921 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbh6j\" (UniqueName: \"kubernetes.io/projected/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-kube-api-access-sbh6j\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.741957 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-config\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.843324 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbh6j\" (UniqueName: \"kubernetes.io/projected/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-kube-api-access-sbh6j\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.843367 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-config\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.843451 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.844379 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-config\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.845825 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-dns-svc\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.859418 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbh6j\" (UniqueName: \"kubernetes.io/projected/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-kube-api-access-sbh6j\") pod \"dnsmasq-dns-54b5dffb47-b9jzc\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.872653 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.874108 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.878532 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.878789 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.878956 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.879094 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.879274 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.879404 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.879540 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-8pzj2" Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.889855 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 07:04:59 crc kubenswrapper[4917]: I0318 07:04:59.927665 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" event={"ID":"002ecc9e-4964-4a4e-873c-2ddaadebc6fc","Type":"ContainerStarted","Data":"78f60bc1ba33b08e13de2aaa4817bf8b7aa55140fa453618366b53a921ed5de0"} Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.009960 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045281 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a2ed8f1-269d-45fb-a766-46c867bd0a91-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045331 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045357 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045378 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045398 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045417 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045455 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mnbs\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-kube-api-access-8mnbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045492 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045510 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045528 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.045543 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a2ed8f1-269d-45fb-a766-46c867bd0a91-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149307 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149348 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149372 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149391 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 
07:05:00.149411 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149450 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mnbs\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-kube-api-access-8mnbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149487 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149507 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149525 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149538 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a2ed8f1-269d-45fb-a766-46c867bd0a91-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.149557 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a2ed8f1-269d-45fb-a766-46c867bd0a91-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.151178 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.151404 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.151417 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.152962 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.154197 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.155542 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.157697 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.160861 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a2ed8f1-269d-45fb-a766-46c867bd0a91-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.166241 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a2ed8f1-269d-45fb-a766-46c867bd0a91-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.168378 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.175177 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mnbs\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-kube-api-access-8mnbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.184079 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.215306 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.275832 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-b9jzc"] Mar 18 07:05:00 crc kubenswrapper[4917]: W0318 07:05:00.279382 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4dea0e2_6a23_4d5f_91b9_c65f5c7da380.slice/crio-3c909bd787bebea8a5b7ac18b8aba1b5ac283c31a04e14ab2c515558557ad79c WatchSource:0}: Error finding container 3c909bd787bebea8a5b7ac18b8aba1b5ac283c31a04e14ab2c515558557ad79c: Status 404 returned error can't find the container with id 3c909bd787bebea8a5b7ac18b8aba1b5ac283c31a04e14ab2c515558557ad79c Mar 18 07:05:00 crc kubenswrapper[4917]: W0318 07:05:00.631329 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2ed8f1_269d_45fb_a766_46c867bd0a91.slice/crio-ea7460b86f89a147f264b57eef417205a660e20414287c6dc10f2350ca339469 WatchSource:0}: Error finding container ea7460b86f89a147f264b57eef417205a660e20414287c6dc10f2350ca339469: Status 404 returned error can't find the container with id ea7460b86f89a147f264b57eef417205a660e20414287c6dc10f2350ca339469 Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.641569 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.745858 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.747436 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.749425 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2zq5l" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.750187 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.791262 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.791528 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.791543 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.791669 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.798441 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.798805 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.895784 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.895850 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.895903 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.895951 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.895971 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11fb09df-78b6-44c6-a78f-2b720a98cfad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.895989 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11fb09df-78b6-44c6-a78f-2b720a98cfad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.896013 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-server-conf\") pod 
\"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.896030 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25j97\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-kube-api-access-25j97\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.896060 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.896082 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.896168 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-config-data\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.946768 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a2ed8f1-269d-45fb-a766-46c867bd0a91","Type":"ContainerStarted","Data":"ea7460b86f89a147f264b57eef417205a660e20414287c6dc10f2350ca339469"} Mar 18 07:05:00 crc 
kubenswrapper[4917]: I0318 07:05:00.952022 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" event={"ID":"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380","Type":"ContainerStarted","Data":"3c909bd787bebea8a5b7ac18b8aba1b5ac283c31a04e14ab2c515558557ad79c"} Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997716 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997770 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997797 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997827 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997850 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/11fb09df-78b6-44c6-a78f-2b720a98cfad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997875 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11fb09df-78b6-44c6-a78f-2b720a98cfad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997891 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-server-conf\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997909 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25j97\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-kube-api-access-25j97\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.997963 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.998004 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") 
" pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.998036 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-config-data\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.998618 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.999082 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.999215 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:00 crc kubenswrapper[4917]: I0318 07:05:00.999287 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-server-conf\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.000016 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.000417 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-config-data\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.005836 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11fb09df-78b6-44c6-a78f-2b720a98cfad-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.005910 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.006178 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.006204 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11fb09df-78b6-44c6-a78f-2b720a98cfad-pod-info\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " 
pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.017959 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25j97\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-kube-api-access-25j97\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.023080 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.112444 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.636066 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 07:05:01 crc kubenswrapper[4917]: I0318 07:05:01.959643 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11fb09df-78b6-44c6-a78f-2b720a98cfad","Type":"ContainerStarted","Data":"2ff3c9e276be85f73c62af181c5793a51b056b4ea2cb32371faa29a948b35ef0"} Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.160613 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.161714 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.165901 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.166316 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-zhq7w" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.166876 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.167777 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.171653 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.171959 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.316170 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.316463 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.316489 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/84500248-15f0-4049-8423-43502d6587cf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.316513 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92fvn\" (UniqueName: \"kubernetes.io/projected/84500248-15f0-4049-8423-43502d6587cf-kube-api-access-92fvn\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.316695 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.316780 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-config-data-default\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.316859 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-kolla-config\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.316933 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422242 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422310 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422329 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422347 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/84500248-15f0-4049-8423-43502d6587cf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422367 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92fvn\" (UniqueName: \"kubernetes.io/projected/84500248-15f0-4049-8423-43502d6587cf-kube-api-access-92fvn\") pod \"openstack-galera-0\" (UID: 
\"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422403 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422428 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-config-data-default\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422475 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-kolla-config\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.422946 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.423319 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/84500248-15f0-4049-8423-43502d6587cf-config-data-generated\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.424431 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-operator-scripts\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.425036 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-kolla-config\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.425568 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-config-data-default\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.438030 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.456823 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.458947 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92fvn\" (UniqueName: 
\"kubernetes.io/projected/84500248-15f0-4049-8423-43502d6587cf-kube-api-access-92fvn\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.460441 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-galera-0\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.480904 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.929413 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:05:02 crc kubenswrapper[4917]: I0318 07:05:02.929463 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.580178 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.586774 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.593210 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.593351 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-54t4b" Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.593452 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.594050 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.602315 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.742105 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcffm\" (UniqueName: \"kubernetes.io/projected/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kube-api-access-gcffm\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0" Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.742149 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0" Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.742170 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.742190 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.742212 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.742434 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.742506 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.742657 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.834360 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.835424 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.837299 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.837478 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.837555 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-gcsj8"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.843622 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.843708 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcffm\" (UniqueName: \"kubernetes.io/projected/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kube-api-access-gcffm\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.843731 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.843779 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.843797 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.843817 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.843880 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.843949 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.844158 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.844521 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.844772 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.845970 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.847050 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.847469 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.854410 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.857213 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.857449 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcffm\" (UniqueName: \"kubernetes.io/projected/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kube-api-access-gcffm\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.879605 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.909131 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.945270 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.945408 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kolla-config\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.945460 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-config-data\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.945480 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44kl\" (UniqueName: \"kubernetes.io/projected/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kube-api-access-g44kl\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:03 crc kubenswrapper[4917]: I0318 07:05:03.945522 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.047057 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-config-data\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.047872 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44kl\" (UniqueName: \"kubernetes.io/projected/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kube-api-access-g44kl\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.047824 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-config-data\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.047964 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.048037 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.048420 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kolla-config\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.048905 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kolla-config\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.059282 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-memcached-tls-certs\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.062768 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-combined-ca-bundle\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.066170 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44kl\" (UniqueName: \"kubernetes.io/projected/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kube-api-access-g44kl\") pod \"memcached-0\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " pod="openstack/memcached-0"
Mar 18 07:05:04 crc kubenswrapper[4917]: I0318 07:05:04.220918 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 18 07:05:06 crc kubenswrapper[4917]: I0318 07:05:06.062218 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 07:05:06 crc kubenswrapper[4917]: I0318 07:05:06.063607 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 07:05:06 crc kubenswrapper[4917]: I0318 07:05:06.071922 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 18 07:05:06 crc kubenswrapper[4917]: I0318 07:05:06.072870 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-x42v7"
Mar 18 07:05:06 crc kubenswrapper[4917]: I0318 07:05:06.185175 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2tpj\" (UniqueName: \"kubernetes.io/projected/d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9-kube-api-access-k2tpj\") pod \"kube-state-metrics-0\" (UID: \"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9\") " pod="openstack/kube-state-metrics-0"
Mar 18 07:05:06 crc kubenswrapper[4917]: I0318 07:05:06.288292 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2tpj\" (UniqueName: \"kubernetes.io/projected/d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9-kube-api-access-k2tpj\") pod \"kube-state-metrics-0\" (UID: \"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9\") " pod="openstack/kube-state-metrics-0"
Mar 18 07:05:06 crc kubenswrapper[4917]: I0318 07:05:06.315297 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2tpj\" (UniqueName: \"kubernetes.io/projected/d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9-kube-api-access-k2tpj\") pod \"kube-state-metrics-0\" (UID: \"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9\") " pod="openstack/kube-state-metrics-0"
Mar 18 07:05:06 crc kubenswrapper[4917]: I0318 07:05:06.393285 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.135856 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9gbdd"]
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.138017 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.140938 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.141151 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.142251 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-9bc9d"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.145303 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9gbdd"]
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.173868 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jwxq2"]
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.175332 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.193749 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jwxq2"]
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.228307 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-combined-ca-bundle\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.228356 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-log-ovn\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.228406 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33ddaa4d-48c1-4c81-b0a3-4225b6382496-scripts\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.228442 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run-ovn\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.228460 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnw6x\" (UniqueName: \"kubernetes.io/projected/33ddaa4d-48c1-4c81-b0a3-4225b6382496-kube-api-access-wnw6x\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.228500 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-ovn-controller-tls-certs\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.228527 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331302 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run-ovn\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331351 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnw6x\" (UniqueName: \"kubernetes.io/projected/33ddaa4d-48c1-4c81-b0a3-4225b6382496-kube-api-access-wnw6x\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331404 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-ovn-controller-tls-certs\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331427 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-lib\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331450 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331474 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-log\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331500 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-etc-ovs\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331522 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-combined-ca-bundle\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331538 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-log-ovn\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331558 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24kf9\" (UniqueName: \"kubernetes.io/projected/dbba9465-1e4b-4b42-b512-addd628093d3-kube-api-access-24kf9\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331600 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbba9465-1e4b-4b42-b512-addd628093d3-scripts\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331619 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33ddaa4d-48c1-4c81-b0a3-4225b6382496-scripts\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.331647 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-run\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.332443 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.332697 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-log-ovn\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.332739 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run-ovn\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.334490 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33ddaa4d-48c1-4c81-b0a3-4225b6382496-scripts\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.338819 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-combined-ca-bundle\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.339059 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-ovn-controller-tls-certs\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.370573 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnw6x\" (UniqueName: \"kubernetes.io/projected/33ddaa4d-48c1-4c81-b0a3-4225b6382496-kube-api-access-wnw6x\") pod \"ovn-controller-9gbdd\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432569 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-lib\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432640 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-log\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432669 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-etc-ovs\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432701 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24kf9\" (UniqueName: \"kubernetes.io/projected/dbba9465-1e4b-4b42-b512-addd628093d3-kube-api-access-24kf9\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432745 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbba9465-1e4b-4b42-b512-addd628093d3-scripts\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432779 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-run\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432879 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-run\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432913 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-log\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.432985 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-etc-ovs\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.433023 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-lib\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.434658 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbba9465-1e4b-4b42-b512-addd628093d3-scripts\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.451783 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24kf9\" (UniqueName: \"kubernetes.io/projected/dbba9465-1e4b-4b42-b512-addd628093d3-kube-api-access-24kf9\") pod \"ovn-controller-ovs-jwxq2\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") " pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.466936 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9gbdd"
Mar 18 07:05:09 crc kubenswrapper[4917]: I0318 07:05:09.510567 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.038527 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.039647 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.041326 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.042016 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.042615 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.042964 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.043153 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-c7xzr"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.052818 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.147396 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.147444 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.147492 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-config\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.147518 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.147536 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.147569 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4f7r\" (UniqueName: \"kubernetes.io/projected/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-kube-api-access-j4f7r\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.147668 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.147691 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249023 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249076 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249128 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-config\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249159 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249182 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249215 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4f7r\" (UniqueName: \"kubernetes.io/projected/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-kube-api-access-j4f7r\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249291 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249326 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249501 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.249555 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.250151 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-config\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.250619 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.254104 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.254842 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.260855 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.270513 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc 
kubenswrapper[4917]: I0318 07:05:10.270867 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4f7r\" (UniqueName: \"kubernetes.io/projected/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-kube-api-access-j4f7r\") pod \"ovsdbserver-sb-0\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:10 crc kubenswrapper[4917]: I0318 07:05:10.375810 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.830971 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.832815 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.843711 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-mwzkp" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.844007 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.844259 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.845900 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.885633 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.890669 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.890708 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.890751 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.890805 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.890854 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsqd8\" (UniqueName: \"kubernetes.io/projected/bf779ffa-140b-4c10-b42e-9c7568cebb01-kube-api-access-dsqd8\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.890880 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 
18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.890917 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.890940 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.992865 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-config\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.992924 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.992982 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.993016 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.993042 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsqd8\" (UniqueName: \"kubernetes.io/projected/bf779ffa-140b-4c10-b42e-9c7568cebb01-kube-api-access-dsqd8\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.993074 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.993127 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.993155 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.993552 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"bf779ffa-140b-4c10-b42e-9c7568cebb01\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.993969 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.994696 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-config\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:12 crc kubenswrapper[4917]: I0318 07:05:12.998497 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:13 crc kubenswrapper[4917]: I0318 07:05:13.010058 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:13 crc kubenswrapper[4917]: I0318 07:05:13.010334 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:13 crc kubenswrapper[4917]: I0318 07:05:13.012288 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:13 crc kubenswrapper[4917]: I0318 07:05:13.027408 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsqd8\" (UniqueName: \"kubernetes.io/projected/bf779ffa-140b-4c10-b42e-9c7568cebb01-kube-api-access-dsqd8\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:13 crc kubenswrapper[4917]: I0318 07:05:13.031728 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:13 crc kubenswrapper[4917]: I0318 07:05:13.174863 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.843265 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.843678 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jvmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities
{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5448ff6dc7-8hnnw_openstack(6d8013c7-f3c9-4196-acd2-c1a08cb6426e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.847531 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" podUID="6d8013c7-f3c9-4196-acd2-c1a08cb6426e" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.880511 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.880688 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hpvxn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-658f55c9f5-t89fr_openstack(002ecc9e-4964-4a4e-873c-2ddaadebc6fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.882064 4917 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.887773 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.888146 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wjzfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-64696987c5-nkrcg_openstack(fe9c96ad-e63b-4401-8bf9-a52e177193e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.889276 4917 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-64696987c5-nkrcg" podUID="fe9c96ad-e63b-4401-8bf9-a52e177193e3" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.916255 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.916453 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sbh6j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-54b5dffb47-b9jzc_openstack(f4dea0e2-6a23-4d5f-91b9-c65f5c7da380): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:05:15 crc kubenswrapper[4917]: E0318 07:05:15.917797 4917 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" Mar 18 07:05:16 crc kubenswrapper[4917]: E0318 07:05:16.090060 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" Mar 18 07:05:16 crc kubenswrapper[4917]: E0318 07:05:16.101970 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51\\\"\"" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.465202 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 07:05:16 crc kubenswrapper[4917]: W0318 07:05:16.469385 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod033e4d58_03e5_49fa_ad5b_169464bf7ba9.slice/crio-971f2be52f502a5e65d5e4a46858d48495433a69bd5670e36c859cff5013ff1c WatchSource:0}: Error finding container 971f2be52f502a5e65d5e4a46858d48495433a69bd5670e36c859cff5013ff1c: Status 404 returned error can't find the container with id 971f2be52f502a5e65d5e4a46858d48495433a69bd5670e36c859cff5013ff1c Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.477874 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 07:05:16 crc 
kubenswrapper[4917]: I0318 07:05:16.597711 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 07:05:16 crc kubenswrapper[4917]: W0318 07:05:16.600413 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9fe9ead_8bad_4e5d_8719_ba6e4a94dae9.slice/crio-0fd4500833042da0f4dec5b497927133b09d9854806eb8f23781610a96b872d1 WatchSource:0}: Error finding container 0fd4500833042da0f4dec5b497927133b09d9854806eb8f23781610a96b872d1: Status 404 returned error can't find the container with id 0fd4500833042da0f4dec5b497927133b09d9854806eb8f23781610a96b872d1 Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.607419 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9gbdd"] Mar 18 07:05:16 crc kubenswrapper[4917]: W0318 07:05:16.611132 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6976b4d3_75e6_4b74_99db_3fd9acb3a742.slice/crio-12851433605b3bb8094e3ad0300c14dc29a5305e806eb3886ac2bf25df3c97e0 WatchSource:0}: Error finding container 12851433605b3bb8094e3ad0300c14dc29a5305e806eb3886ac2bf25df3c97e0: Status 404 returned error can't find the container with id 12851433605b3bb8094e3ad0300c14dc29a5305e806eb3886ac2bf25df3c97e0 Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.614101 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:05:16 crc kubenswrapper[4917]: W0318 07:05:16.616939 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbba9465_1e4b_4b42_b512_addd628093d3.slice/crio-f6faece2d16c067365c6f3099d85ada197216bdcf072c28f76abfc592fa49775 WatchSource:0}: Error finding container f6faece2d16c067365c6f3099d85ada197216bdcf072c28f76abfc592fa49775: Status 404 returned error can't find the 
container with id f6faece2d16c067365c6f3099d85ada197216bdcf072c28f76abfc592fa49775 Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.624065 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jwxq2"] Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.646834 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.650381 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-config\") pod \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\" (UID: \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\") " Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.650513 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jvmq\" (UniqueName: \"kubernetes.io/projected/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-kube-api-access-8jvmq\") pod \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\" (UID: \"6d8013c7-f3c9-4196-acd2-c1a08cb6426e\") " Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.652308 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-config" (OuterVolumeSpecName: "config") pod "6d8013c7-f3c9-4196-acd2-c1a08cb6426e" (UID: "6d8013c7-f3c9-4196-acd2-c1a08cb6426e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.659723 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.731169 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-kube-api-access-8jvmq" (OuterVolumeSpecName: "kube-api-access-8jvmq") pod "6d8013c7-f3c9-4196-acd2-c1a08cb6426e" (UID: "6d8013c7-f3c9-4196-acd2-c1a08cb6426e"). InnerVolumeSpecName "kube-api-access-8jvmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.753002 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-dns-svc\") pod \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.753127 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjzfx\" (UniqueName: \"kubernetes.io/projected/fe9c96ad-e63b-4401-8bf9-a52e177193e3-kube-api-access-wjzfx\") pod \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.753158 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-config\") pod \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\" (UID: \"fe9c96ad-e63b-4401-8bf9-a52e177193e3\") " Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.753704 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe9c96ad-e63b-4401-8bf9-a52e177193e3" (UID: "fe9c96ad-e63b-4401-8bf9-a52e177193e3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.753776 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jvmq\" (UniqueName: \"kubernetes.io/projected/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-kube-api-access-8jvmq\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.753796 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d8013c7-f3c9-4196-acd2-c1a08cb6426e-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.754265 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-config" (OuterVolumeSpecName: "config") pod "fe9c96ad-e63b-4401-8bf9-a52e177193e3" (UID: "fe9c96ad-e63b-4401-8bf9-a52e177193e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.758906 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe9c96ad-e63b-4401-8bf9-a52e177193e3-kube-api-access-wjzfx" (OuterVolumeSpecName: "kube-api-access-wjzfx") pod "fe9c96ad-e63b-4401-8bf9-a52e177193e3" (UID: "fe9c96ad-e63b-4401-8bf9-a52e177193e3"). InnerVolumeSpecName "kube-api-access-wjzfx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.857473 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.857507 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjzfx\" (UniqueName: \"kubernetes.io/projected/fe9c96ad-e63b-4401-8bf9-a52e177193e3-kube-api-access-wjzfx\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.857521 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe9c96ad-e63b-4401-8bf9-a52e177193e3-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:16 crc kubenswrapper[4917]: I0318 07:05:16.871299 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.093932 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" event={"ID":"6d8013c7-f3c9-4196-acd2-c1a08cb6426e","Type":"ContainerDied","Data":"bb596f9d83bb0b50ae511078e5b8bccfedcdc99d9ff6cbee5485b40c6561e15d"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.094003 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5448ff6dc7-8hnnw" Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.099632 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84500248-15f0-4049-8423-43502d6587cf","Type":"ContainerStarted","Data":"e7ce4bd2ee2154314474fb5fa29a9d9c5cade19767d99235c00e067dfab7cf6c"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.120830 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9gbdd" event={"ID":"33ddaa4d-48c1-4c81-b0a3-4225b6382496","Type":"ContainerStarted","Data":"f6993223374c6901f85cb9d0491bb05d4720fb6709864af33102cac30541e4cc"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.122576 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9f5fe129-b4bd-40ac-a3f2-6eec0469308b","Type":"ContainerStarted","Data":"5cd4f04a6e279ab9756dac383c90c32a4bfedd644d6f51091e5311ffbbb73163"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.124480 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9","Type":"ContainerStarted","Data":"0fd4500833042da0f4dec5b497927133b09d9854806eb8f23781610a96b872d1"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.126226 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jwxq2" event={"ID":"dbba9465-1e4b-4b42-b512-addd628093d3","Type":"ContainerStarted","Data":"f6faece2d16c067365c6f3099d85ada197216bdcf072c28f76abfc592fa49775"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.127942 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"033e4d58-03e5-49fa-ad5b-169464bf7ba9","Type":"ContainerStarted","Data":"971f2be52f502a5e65d5e4a46858d48495433a69bd5670e36c859cff5013ff1c"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.130400 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-64696987c5-nkrcg" event={"ID":"fe9c96ad-e63b-4401-8bf9-a52e177193e3","Type":"ContainerDied","Data":"39311fe05500b2330c6dd9e8389860dde90a1e05f55edbe0d5ba98f5d33e3e1f"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.130445 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64696987c5-nkrcg" Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.138284 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6976b4d3-75e6-4b74-99db-3fd9acb3a742","Type":"ContainerStarted","Data":"12851433605b3bb8094e3ad0300c14dc29a5305e806eb3886ac2bf25df3c97e0"} Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.259300 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-8hnnw"] Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.270011 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5448ff6dc7-8hnnw"] Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.278064 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-nkrcg"] Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.282374 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64696987c5-nkrcg"] Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.685404 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.785666 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d8013c7-f3c9-4196-acd2-c1a08cb6426e" path="/var/lib/kubelet/pods/6d8013c7-f3c9-4196-acd2-c1a08cb6426e/volumes" Mar 18 07:05:17 crc kubenswrapper[4917]: I0318 07:05:17.786030 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe9c96ad-e63b-4401-8bf9-a52e177193e3" 
path="/var/lib/kubelet/pods/fe9c96ad-e63b-4401-8bf9-a52e177193e3/volumes" Mar 18 07:05:18 crc kubenswrapper[4917]: I0318 07:05:18.150757 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bf779ffa-140b-4c10-b42e-9c7568cebb01","Type":"ContainerStarted","Data":"0f6efa69f687cbe94c03ad5cad4367b781f219ac82985b49bba7132d0b878cbd"} Mar 18 07:05:19 crc kubenswrapper[4917]: I0318 07:05:19.171922 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11fb09df-78b6-44c6-a78f-2b720a98cfad","Type":"ContainerStarted","Data":"38f6a17087633473084cee1b528df5d10be62a5fb94b0daa33f536007711ed0e"} Mar 18 07:05:20 crc kubenswrapper[4917]: I0318 07:05:20.183062 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a2ed8f1-269d-45fb-a766-46c867bd0a91","Type":"ContainerStarted","Data":"89d1b01a410a58d6efadc5b7f3e235ac35c4482c50d32403ecd6e850c30e9231"} Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.226921 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9gbdd" event={"ID":"33ddaa4d-48c1-4c81-b0a3-4225b6382496","Type":"ContainerStarted","Data":"c67ceb625cdd5baa04a13e13da848059276036abb1326784ef355b6956226e2f"} Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.227701 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9gbdd" Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.228682 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9f5fe129-b4bd-40ac-a3f2-6eec0469308b","Type":"ContainerStarted","Data":"a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697"} Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.230775 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9","Type":"ContainerStarted","Data":"c0ced854d56d471c5e1933ff8991e4dd562acc177e136ffc7eb0c0cb1963647e"} Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.230911 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.232439 4917 generic.go:334] "Generic (PLEG): container finished" podID="dbba9465-1e4b-4b42-b512-addd628093d3" containerID="f1fb7e855791c24d686bd1d7eb986bfa5f8a90f822e71b3a82a95f7eccb12cb5" exitCode=0 Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.232534 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jwxq2" event={"ID":"dbba9465-1e4b-4b42-b512-addd628093d3","Type":"ContainerDied","Data":"f1fb7e855791c24d686bd1d7eb986bfa5f8a90f822e71b3a82a95f7eccb12cb5"} Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.234401 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"033e4d58-03e5-49fa-ad5b-169464bf7ba9","Type":"ContainerStarted","Data":"bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789"} Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.234526 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.236230 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84500248-15f0-4049-8423-43502d6587cf","Type":"ContainerStarted","Data":"b2b06add313005d06645803b1e8e5e8da09cb4fad220682987d99604261e8632"} Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.244109 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6976b4d3-75e6-4b74-99db-3fd9acb3a742","Type":"ContainerStarted","Data":"15c8e43db4edb4b54e790e0d0ebd8efd89d196a97667fc082ccce43971fca170"} Mar 18 07:05:26 crc kubenswrapper[4917]: 
I0318 07:05:26.247962 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bf779ffa-140b-4c10-b42e-9c7568cebb01","Type":"ContainerStarted","Data":"b1a2420de022669dd8a8a7601588a4eb6184dda29d746ad16041b6928f5afdfb"} Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.263915 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9gbdd" podStartSLOduration=9.200378808 podStartE2EDuration="17.263889354s" podCreationTimestamp="2026-03-18 07:05:09 +0000 UTC" firstStartedPulling="2026-03-18 07:05:16.60682062 +0000 UTC m=+1101.547975344" lastFinishedPulling="2026-03-18 07:05:24.670331176 +0000 UTC m=+1109.611485890" observedRunningTime="2026-03-18 07:05:26.24836976 +0000 UTC m=+1111.189524514" watchObservedRunningTime="2026-03-18 07:05:26.263889354 +0000 UTC m=+1111.205044148" Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.275824 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.536928336 podStartE2EDuration="23.27580454s" podCreationTimestamp="2026-03-18 07:05:03 +0000 UTC" firstStartedPulling="2026-03-18 07:05:16.474621479 +0000 UTC m=+1101.415776193" lastFinishedPulling="2026-03-18 07:05:24.213497683 +0000 UTC m=+1109.154652397" observedRunningTime="2026-03-18 07:05:26.268725875 +0000 UTC m=+1111.209880619" watchObservedRunningTime="2026-03-18 07:05:26.27580454 +0000 UTC m=+1111.216959264" Mar 18 07:05:26 crc kubenswrapper[4917]: I0318 07:05:26.317186 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.658173963 podStartE2EDuration="20.317149802s" podCreationTimestamp="2026-03-18 07:05:06 +0000 UTC" firstStartedPulling="2026-03-18 07:05:16.604298107 +0000 UTC m=+1101.545452811" lastFinishedPulling="2026-03-18 07:05:25.263273886 +0000 UTC m=+1110.204428650" observedRunningTime="2026-03-18 07:05:26.312350614 +0000 
UTC m=+1111.253505348" watchObservedRunningTime="2026-03-18 07:05:26.317149802 +0000 UTC m=+1111.258304526" Mar 18 07:05:27 crc kubenswrapper[4917]: I0318 07:05:27.258252 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jwxq2" event={"ID":"dbba9465-1e4b-4b42-b512-addd628093d3","Type":"ContainerStarted","Data":"5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8"} Mar 18 07:05:27 crc kubenswrapper[4917]: I0318 07:05:27.258665 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jwxq2" event={"ID":"dbba9465-1e4b-4b42-b512-addd628093d3","Type":"ContainerStarted","Data":"faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304"} Mar 18 07:05:27 crc kubenswrapper[4917]: I0318 07:05:27.284884 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jwxq2" podStartSLOduration=10.673850757 podStartE2EDuration="18.284867276s" podCreationTimestamp="2026-03-18 07:05:09 +0000 UTC" firstStartedPulling="2026-03-18 07:05:16.619338861 +0000 UTC m=+1101.560493575" lastFinishedPulling="2026-03-18 07:05:24.23035538 +0000 UTC m=+1109.171510094" observedRunningTime="2026-03-18 07:05:27.283223925 +0000 UTC m=+1112.224378659" watchObservedRunningTime="2026-03-18 07:05:27.284867276 +0000 UTC m=+1112.226021990" Mar 18 07:05:28 crc kubenswrapper[4917]: I0318 07:05:28.267256 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jwxq2" Mar 18 07:05:28 crc kubenswrapper[4917]: I0318 07:05:28.267492 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jwxq2" Mar 18 07:05:29 crc kubenswrapper[4917]: I0318 07:05:29.276776 4917 generic.go:334] "Generic (PLEG): container finished" podID="84500248-15f0-4049-8423-43502d6587cf" containerID="b2b06add313005d06645803b1e8e5e8da09cb4fad220682987d99604261e8632" exitCode=0 Mar 18 07:05:29 crc kubenswrapper[4917]: 
I0318 07:05:29.276911 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84500248-15f0-4049-8423-43502d6587cf","Type":"ContainerDied","Data":"b2b06add313005d06645803b1e8e5e8da09cb4fad220682987d99604261e8632"} Mar 18 07:05:29 crc kubenswrapper[4917]: I0318 07:05:29.284878 4917 generic.go:334] "Generic (PLEG): container finished" podID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" containerID="15c8e43db4edb4b54e790e0d0ebd8efd89d196a97667fc082ccce43971fca170" exitCode=0 Mar 18 07:05:29 crc kubenswrapper[4917]: I0318 07:05:29.284961 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6976b4d3-75e6-4b74-99db-3fd9acb3a742","Type":"ContainerDied","Data":"15c8e43db4edb4b54e790e0d0ebd8efd89d196a97667fc082ccce43971fca170"} Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.297849 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" containerID="fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee" exitCode=0 Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.297949 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" event={"ID":"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380","Type":"ContainerDied","Data":"fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee"} Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.305417 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84500248-15f0-4049-8423-43502d6587cf","Type":"ContainerStarted","Data":"2e38a061b090ac1a57bf820e3f8a49ce670631af9f0ea08d5ab50da29f72d9c6"} Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.314774 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"6976b4d3-75e6-4b74-99db-3fd9acb3a742","Type":"ContainerStarted","Data":"f1e89b7d2b258ebb604f705e899b6e2668a5b5bff0002dfc62fd05250dfb7183"} Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.320888 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bf779ffa-140b-4c10-b42e-9c7568cebb01","Type":"ContainerStarted","Data":"5c74ef922cf294c1354aeab44d02d26e32c40d47c6adfa40c3f157efa6666fe6"} Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.325676 4917 generic.go:334] "Generic (PLEG): container finished" podID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" containerID="aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662" exitCode=0 Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.325749 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" event={"ID":"002ecc9e-4964-4a4e-873c-2ddaadebc6fc","Type":"ContainerDied","Data":"aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662"} Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.329634 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9f5fe129-b4bd-40ac-a3f2-6eec0469308b","Type":"ContainerStarted","Data":"b15b4f0c051b063f63312e19c1b7a905ea40662728a9caa08205f2747d82f183"} Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.365765 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.194145579 podStartE2EDuration="29.365747025s" podCreationTimestamp="2026-03-18 07:05:01 +0000 UTC" firstStartedPulling="2026-03-18 07:05:16.499057108 +0000 UTC m=+1101.440211812" lastFinishedPulling="2026-03-18 07:05:24.670658534 +0000 UTC m=+1109.611813258" observedRunningTime="2026-03-18 07:05:30.362211987 +0000 UTC m=+1115.303366741" watchObservedRunningTime="2026-03-18 07:05:30.365747025 +0000 UTC m=+1115.306901739" Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 
07:05:30.376961 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.409168 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.585066048 podStartE2EDuration="28.409112808s" podCreationTimestamp="2026-03-18 07:05:02 +0000 UTC" firstStartedPulling="2026-03-18 07:05:16.616098331 +0000 UTC m=+1101.557253045" lastFinishedPulling="2026-03-18 07:05:24.440145091 +0000 UTC m=+1109.381299805" observedRunningTime="2026-03-18 07:05:30.403055247 +0000 UTC m=+1115.344210051" watchObservedRunningTime="2026-03-18 07:05:30.409112808 +0000 UTC m=+1115.350267542" Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.436551 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.387790894 podStartE2EDuration="21.436532286s" podCreationTimestamp="2026-03-18 07:05:09 +0000 UTC" firstStartedPulling="2026-03-18 07:05:16.877110876 +0000 UTC m=+1101.818265600" lastFinishedPulling="2026-03-18 07:05:28.925852278 +0000 UTC m=+1113.867006992" observedRunningTime="2026-03-18 07:05:30.432063235 +0000 UTC m=+1115.373217959" watchObservedRunningTime="2026-03-18 07:05:30.436532286 +0000 UTC m=+1115.377687010" Mar 18 07:05:30 crc kubenswrapper[4917]: I0318 07:05:30.466370 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.244951 podStartE2EDuration="19.466349194s" podCreationTimestamp="2026-03-18 07:05:11 +0000 UTC" firstStartedPulling="2026-03-18 07:05:17.689843503 +0000 UTC m=+1102.630998247" lastFinishedPulling="2026-03-18 07:05:28.911241707 +0000 UTC m=+1113.852396441" observedRunningTime="2026-03-18 07:05:30.459362661 +0000 UTC m=+1115.400517415" watchObservedRunningTime="2026-03-18 07:05:30.466349194 +0000 UTC m=+1115.407503918" Mar 18 07:05:31 crc 
kubenswrapper[4917]: I0318 07:05:31.175302 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.240564 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.345303 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" event={"ID":"002ecc9e-4964-4a4e-873c-2ddaadebc6fc","Type":"ContainerStarted","Data":"b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e"} Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.345698 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.348832 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" event={"ID":"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380","Type":"ContainerStarted","Data":"58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde"} Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.349657 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.376361 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.379957 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" podStartSLOduration=3.566816589 podStartE2EDuration="33.379925658s" podCreationTimestamp="2026-03-18 07:04:58 +0000 UTC" firstStartedPulling="2026-03-18 07:04:59.584796239 +0000 UTC m=+1084.525950953" lastFinishedPulling="2026-03-18 07:05:29.397905308 +0000 UTC m=+1114.339060022" observedRunningTime="2026-03-18 07:05:31.37151346 +0000 UTC 
m=+1116.312668244" watchObservedRunningTime="2026-03-18 07:05:31.379925658 +0000 UTC m=+1116.321080442" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.412088 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.415084 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" podStartSLOduration=3.523026799 podStartE2EDuration="32.415053547s" podCreationTimestamp="2026-03-18 07:04:59 +0000 UTC" firstStartedPulling="2026-03-18 07:05:00.282596045 +0000 UTC m=+1085.223750759" lastFinishedPulling="2026-03-18 07:05:29.174622753 +0000 UTC m=+1114.115777507" observedRunningTime="2026-03-18 07:05:31.407857219 +0000 UTC m=+1116.349012033" watchObservedRunningTime="2026-03-18 07:05:31.415053547 +0000 UTC m=+1116.356208321" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.449640 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.717945 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-t89fr"] Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.740053 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-fh8ss"] Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.741191 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.745988 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.763653 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-fh8ss"] Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.769411 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rlzz7"] Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.770305 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.788397 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.826720 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rlzz7"] Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.876501 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-config\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.876547 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.876686 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkpqh\" (UniqueName: \"kubernetes.io/projected/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-kube-api-access-nkpqh\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.876754 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovs-rundir\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.876790 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0e2915-1512-4288-8c43-7774fb90542f-config\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.877089 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.877215 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-combined-ca-bundle\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: 
I0318 07:05:31.877301 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.877450 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovn-rundir\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.877612 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2l9j\" (UniqueName: \"kubernetes.io/projected/7a0e2915-1512-4288-8c43-7774fb90542f-kube-api-access-h2l9j\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.981299 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-config\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.992169 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.993033 
4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkpqh\" (UniqueName: \"kubernetes.io/projected/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-kube-api-access-nkpqh\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.993152 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovs-rundir\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.993864 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0e2915-1512-4288-8c43-7774fb90542f-config\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.994064 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.994147 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-combined-ca-bundle\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.994222 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.994331 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovn-rundir\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.994440 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2l9j\" (UniqueName: \"kubernetes.io/projected/7a0e2915-1512-4288-8c43-7774fb90542f-kube-api-access-h2l9j\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:31 crc kubenswrapper[4917]: I0318 07:05:31.987599 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-config\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:31.999475 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0e2915-1512-4288-8c43-7774fb90542f-config\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:31.992969 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-dns-svc\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:31.999843 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovs-rundir\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.001381 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-ovsdbserver-nb\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.001424 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovn-rundir\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.019907 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-combined-ca-bundle\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.023100 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.028392 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkpqh\" (UniqueName: \"kubernetes.io/projected/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-kube-api-access-nkpqh\") pod \"dnsmasq-dns-84d7bcdf99-fh8ss\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.028930 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2l9j\" (UniqueName: \"kubernetes.io/projected/7a0e2915-1512-4288-8c43-7774fb90542f-kube-api-access-h2l9j\") pod \"ovn-controller-metrics-rlzz7\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.049389 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-b9jzc"] Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.062820 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.086666 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-zdmcf"] Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.088326 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.093528 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.101818 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.148663 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-zdmcf"] Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.209352 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-config\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.209440 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.209478 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xl7q\" (UniqueName: \"kubernetes.io/projected/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-kube-api-access-2xl7q\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.209521 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-dns-svc\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.209542 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.311194 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.311509 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-config\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.311570 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.311619 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xl7q\" (UniqueName: \"kubernetes.io/projected/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-kube-api-access-2xl7q\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.311667 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-dns-svc\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.312214 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-nb\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.312401 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-config\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.312435 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-dns-svc\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.312821 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-sb\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.330130 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xl7q\" (UniqueName: \"kubernetes.io/projected/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-kube-api-access-2xl7q\") pod \"dnsmasq-dns-f697c8bff-zdmcf\" (UID: 
\"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.359754 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.359789 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" containerName="dnsmasq-dns" containerID="cri-o://58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde" gracePeriod=10 Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.412341 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.482534 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.482606 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.526189 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.570170 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.572462 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.581311 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.581644 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.581666 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.581706 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.581797 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7thbx" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.653547 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rlzz7"] Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.676952 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-fh8ss"] Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.719267 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.719313 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-scripts\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 
07:05:32.719353 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nztcl\" (UniqueName: \"kubernetes.io/projected/2b83ec86-5c66-4dc6-9236-a437f37611a9-kube-api-access-nztcl\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.719444 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-config\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.719476 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.719503 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.719533 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.782023 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.821026 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.821105 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-scripts\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.821172 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nztcl\" (UniqueName: \"kubernetes.io/projected/2b83ec86-5c66-4dc6-9236-a437f37611a9-kube-api-access-nztcl\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.821561 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-config\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.821726 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.821810 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.821874 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.822179 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-scripts\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.823013 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-config\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.823176 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.826470 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc 
kubenswrapper[4917]: I0318 07:05:32.826869 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.826981 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.840573 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nztcl\" (UniqueName: \"kubernetes.io/projected/2b83ec86-5c66-4dc6-9236-a437f37611a9-kube-api-access-nztcl\") pod \"ovn-northd-0\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.891410 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.924855 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-config\") pod \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.925020 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbh6j\" (UniqueName: \"kubernetes.io/projected/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-kube-api-access-sbh6j\") pod \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.925873 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-dns-svc\") pod \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\" (UID: \"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380\") " Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.928543 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.928549 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-kube-api-access-sbh6j" (OuterVolumeSpecName: "kube-api-access-sbh6j") pod "f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" (UID: "f4dea0e2-6a23-4d5f-91b9-c65f5c7da380"). InnerVolumeSpecName "kube-api-access-sbh6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.928602 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.929947 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbh6j\" (UniqueName: \"kubernetes.io/projected/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-kube-api-access-sbh6j\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.962604 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-config" (OuterVolumeSpecName: "config") pod "f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" (UID: "f4dea0e2-6a23-4d5f-91b9-c65f5c7da380"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:32 crc kubenswrapper[4917]: I0318 07:05:32.964783 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" (UID: "f4dea0e2-6a23-4d5f-91b9-c65f5c7da380"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.031962 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.031996 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.043421 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-zdmcf"] Mar 18 07:05:33 crc kubenswrapper[4917]: W0318 07:05:33.050743 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaad3f8d_23ba_40e0_8ac6_bd02d8f9da49.slice/crio-3356b42123e7da022ebba157b27fbe792d1ed01118951d9e50fa6d3cab4dff8c WatchSource:0}: Error finding container 3356b42123e7da022ebba157b27fbe792d1ed01118951d9e50fa6d3cab4dff8c: Status 404 returned error can't find the container with id 3356b42123e7da022ebba157b27fbe792d1ed01118951d9e50fa6d3cab4dff8c Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.299003 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 07:05:33 crc kubenswrapper[4917]: W0318 07:05:33.321641 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b83ec86_5c66_4dc6_9236_a437f37611a9.slice/crio-8c7c705eacdffc731a020742f6f8804ced9faa1236fe70754fd0f7c9b8c9d714 WatchSource:0}: Error finding container 8c7c705eacdffc731a020742f6f8804ced9faa1236fe70754fd0f7c9b8c9d714: Status 404 returned error can't find the container with id 8c7c705eacdffc731a020742f6f8804ced9faa1236fe70754fd0f7c9b8c9d714 Mar 18 07:05:33 crc kubenswrapper[4917]: 
I0318 07:05:33.368919 4917 generic.go:334] "Generic (PLEG): container finished" podID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" containerID="2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc" exitCode=0 Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.369018 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" event={"ID":"9a5b397e-e4e6-48ad-9d42-f1d852fb6740","Type":"ContainerDied","Data":"2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.369094 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" event={"ID":"9a5b397e-e4e6-48ad-9d42-f1d852fb6740","Type":"ContainerStarted","Data":"47dd0786a4cb566d7d9ce913452e0dce9313d81acc2d12d8ea93bb6b702d31a1"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.374172 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" containerID="58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde" exitCode=0 Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.374246 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" event={"ID":"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380","Type":"ContainerDied","Data":"58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.374270 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" event={"ID":"f4dea0e2-6a23-4d5f-91b9-c65f5c7da380","Type":"ContainerDied","Data":"3c909bd787bebea8a5b7ac18b8aba1b5ac283c31a04e14ab2c515558557ad79c"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.374287 4917 scope.go:117] "RemoveContainer" containerID="58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.374443 4917 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54b5dffb47-b9jzc" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.380299 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b83ec86-5c66-4dc6-9236-a437f37611a9","Type":"ContainerStarted","Data":"8c7c705eacdffc731a020742f6f8804ced9faa1236fe70754fd0f7c9b8c9d714"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.384369 4917 generic.go:334] "Generic (PLEG): container finished" podID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerID="4a4b407d0996ca15e5daedfecffb0771f7f4f289b45da6dfb3ded6941b45af61" exitCode=0 Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.384474 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" event={"ID":"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49","Type":"ContainerDied","Data":"4a4b407d0996ca15e5daedfecffb0771f7f4f289b45da6dfb3ded6941b45af61"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.384505 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" event={"ID":"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49","Type":"ContainerStarted","Data":"3356b42123e7da022ebba157b27fbe792d1ed01118951d9e50fa6d3cab4dff8c"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.394066 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rlzz7" event={"ID":"7a0e2915-1512-4288-8c43-7774fb90542f","Type":"ContainerStarted","Data":"33af6cfc095e81af7f21997b2ef1a0bd3c1b9021392541c1e2f4d234da292135"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.394122 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rlzz7" event={"ID":"7a0e2915-1512-4288-8c43-7774fb90542f","Type":"ContainerStarted","Data":"b5da64509ead283bebb0ae0ae3133bf167261dfac60ba6362521c58a855654f9"} Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.394267 4917 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" containerName="dnsmasq-dns" containerID="cri-o://b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e" gracePeriod=10 Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.425853 4917 scope.go:117] "RemoveContainer" containerID="fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.427737 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rlzz7" podStartSLOduration=2.427718876 podStartE2EDuration="2.427718876s" podCreationTimestamp="2026-03-18 07:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:05:33.425335437 +0000 UTC m=+1118.366490171" watchObservedRunningTime="2026-03-18 07:05:33.427718876 +0000 UTC m=+1118.368873610" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.470000 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-b9jzc"] Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.476265 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54b5dffb47-b9jzc"] Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.488303 4917 scope.go:117] "RemoveContainer" containerID="58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde" Mar 18 07:05:33 crc kubenswrapper[4917]: E0318 07:05:33.489193 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde\": container with ID starting with 58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde not found: ID does not exist" containerID="58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde" Mar 18 07:05:33 
crc kubenswrapper[4917]: I0318 07:05:33.489226 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde"} err="failed to get container status \"58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde\": rpc error: code = NotFound desc = could not find container \"58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde\": container with ID starting with 58a589bbf850c818d5f5991756d688d6540e0ab0b0a1569b60e118362e4c1dde not found: ID does not exist" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.489245 4917 scope.go:117] "RemoveContainer" containerID="fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee" Mar 18 07:05:33 crc kubenswrapper[4917]: E0318 07:05:33.490411 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee\": container with ID starting with fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee not found: ID does not exist" containerID="fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.490434 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee"} err="failed to get container status \"fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee\": rpc error: code = NotFound desc = could not find container \"fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee\": container with ID starting with fac907a1dc34eb9eac90225cde569310011ce927a9dbab30d919521b51f519ee not found: ID does not exist" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.730418 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.782854 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" path="/var/lib/kubelet/pods/f4dea0e2-6a23-4d5f-91b9-c65f5c7da380/volumes" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.846947 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-dns-svc\") pod \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.847061 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvxn\" (UniqueName: \"kubernetes.io/projected/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-kube-api-access-hpvxn\") pod \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.847137 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-config\") pod \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\" (UID: \"002ecc9e-4964-4a4e-873c-2ddaadebc6fc\") " Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.850824 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-kube-api-access-hpvxn" (OuterVolumeSpecName: "kube-api-access-hpvxn") pod "002ecc9e-4964-4a4e-873c-2ddaadebc6fc" (UID: "002ecc9e-4964-4a4e-873c-2ddaadebc6fc"). InnerVolumeSpecName "kube-api-access-hpvxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.885312 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "002ecc9e-4964-4a4e-873c-2ddaadebc6fc" (UID: "002ecc9e-4964-4a4e-873c-2ddaadebc6fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.902027 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-config" (OuterVolumeSpecName: "config") pod "002ecc9e-4964-4a4e-873c-2ddaadebc6fc" (UID: "002ecc9e-4964-4a4e-873c-2ddaadebc6fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.910279 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.914542 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.948883 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.948910 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:33 crc kubenswrapper[4917]: I0318 07:05:33.948920 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpvxn\" (UniqueName: 
\"kubernetes.io/projected/002ecc9e-4964-4a4e-873c-2ddaadebc6fc-kube-api-access-hpvxn\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.222146 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.405129 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" event={"ID":"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49","Type":"ContainerStarted","Data":"818c39450c69a154f30388b122f1a3401360ee3bb97f0b2a226c8d7926d3b096"} Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.405537 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.407836 4917 generic.go:334] "Generic (PLEG): container finished" podID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" containerID="b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e" exitCode=0 Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.407918 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" event={"ID":"002ecc9e-4964-4a4e-873c-2ddaadebc6fc","Type":"ContainerDied","Data":"b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e"} Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.407947 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" event={"ID":"002ecc9e-4964-4a4e-873c-2ddaadebc6fc","Type":"ContainerDied","Data":"78f60bc1ba33b08e13de2aaa4817bf8b7aa55140fa453618366b53a921ed5de0"} Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.407978 4917 scope.go:117] "RemoveContainer" containerID="b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.408079 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-658f55c9f5-t89fr" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.416701 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" event={"ID":"9a5b397e-e4e6-48ad-9d42-f1d852fb6740","Type":"ContainerStarted","Data":"2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772"} Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.416830 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.426554 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" podStartSLOduration=2.4265388789999998 podStartE2EDuration="2.426538879s" podCreationTimestamp="2026-03-18 07:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:05:34.424449377 +0000 UTC m=+1119.365604101" watchObservedRunningTime="2026-03-18 07:05:34.426538879 +0000 UTC m=+1119.367693593" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.448028 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" podStartSLOduration=3.44800812 podStartE2EDuration="3.44800812s" podCreationTimestamp="2026-03-18 07:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:05:34.441993151 +0000 UTC m=+1119.383147875" watchObservedRunningTime="2026-03-18 07:05:34.44800812 +0000 UTC m=+1119.389162834" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.456508 4917 scope.go:117] "RemoveContainer" containerID="aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.462851 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-658f55c9f5-t89fr"] Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.468264 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-658f55c9f5-t89fr"] Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.486748 4917 scope.go:117] "RemoveContainer" containerID="b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e" Mar 18 07:05:34 crc kubenswrapper[4917]: E0318 07:05:34.487410 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e\": container with ID starting with b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e not found: ID does not exist" containerID="b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.487452 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e"} err="failed to get container status \"b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e\": rpc error: code = NotFound desc = could not find container \"b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e\": container with ID starting with b45d1ea4e3a2e286f005e7978cd085ea920958573660fc07c40281b0abe88e0e not found: ID does not exist" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.487509 4917 scope.go:117] "RemoveContainer" containerID="aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662" Mar 18 07:05:34 crc kubenswrapper[4917]: E0318 07:05:34.487827 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662\": container with ID starting with aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662 not found: ID 
does not exist" containerID="aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.487859 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662"} err="failed to get container status \"aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662\": rpc error: code = NotFound desc = could not find container \"aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662\": container with ID starting with aa1f93cb13eb8354aa1e402161aea55a2ae1cf860fc4df40239a50d312573662 not found: ID does not exist" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.646466 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 07:05:34 crc kubenswrapper[4917]: I0318 07:05:34.708265 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.289944 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-168c-account-create-update-jq6gc"] Mar 18 07:05:35 crc kubenswrapper[4917]: E0318 07:05:35.298108 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" containerName="init" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.298137 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" containerName="init" Mar 18 07:05:35 crc kubenswrapper[4917]: E0318 07:05:35.298173 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" containerName="dnsmasq-dns" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.298181 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" containerName="dnsmasq-dns" Mar 18 07:05:35 crc 
kubenswrapper[4917]: E0318 07:05:35.298219 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" containerName="init" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.298225 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" containerName="init" Mar 18 07:05:35 crc kubenswrapper[4917]: E0318 07:05:35.298256 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" containerName="dnsmasq-dns" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.298282 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" containerName="dnsmasq-dns" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.298669 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" containerName="dnsmasq-dns" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.298689 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4dea0e2-6a23-4d5f-91b9-c65f5c7da380" containerName="dnsmasq-dns" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.299439 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.300506 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-168c-account-create-update-jq6gc"] Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.301631 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.342182 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-8xnsx"] Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.344000 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.349831 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8xnsx"] Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.388694 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qwxh\" (UniqueName: \"kubernetes.io/projected/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-kube-api-access-4qwxh\") pod \"keystone-168c-account-create-update-jq6gc\" (UID: \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\") " pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.388851 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-operator-scripts\") pod \"keystone-168c-account-create-update-jq6gc\" (UID: \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\") " pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.418813 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-5kn5p"] Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.420099 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.427186 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b83ec86-5c66-4dc6-9236-a437f37611a9","Type":"ContainerStarted","Data":"8b2c13a58a1120455d3b9c627b8cf69e354661c43f72f0bcc57f23e360c0f81f"} Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.427232 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b83ec86-5c66-4dc6-9236-a437f37611a9","Type":"ContainerStarted","Data":"ef4b723c7e5e12825354609c888ba3e920eaa69afb8a6ad4df7b13d783acb2c3"} Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.428181 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.444913 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5kn5p"] Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.461897 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.952368516 podStartE2EDuration="3.461270401s" podCreationTimestamp="2026-03-18 07:05:32 +0000 UTC" firstStartedPulling="2026-03-18 07:05:33.323558048 +0000 UTC m=+1118.264712762" lastFinishedPulling="2026-03-18 07:05:34.832459933 +0000 UTC m=+1119.773614647" observedRunningTime="2026-03-18 07:05:35.453553239 +0000 UTC m=+1120.394707973" watchObservedRunningTime="2026-03-18 07:05:35.461270401 +0000 UTC m=+1120.402425125" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.489754 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-operator-scripts\") pod \"keystone-168c-account-create-update-jq6gc\" (UID: \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\") " 
pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.489878 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx7v2\" (UniqueName: \"kubernetes.io/projected/6608e620-c6ed-439f-9a07-67eef576a390-kube-api-access-nx7v2\") pod \"placement-db-create-5kn5p\" (UID: \"6608e620-c6ed-439f-9a07-67eef576a390\") " pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.489926 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qwxh\" (UniqueName: \"kubernetes.io/projected/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-kube-api-access-4qwxh\") pod \"keystone-168c-account-create-update-jq6gc\" (UID: \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\") " pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.489974 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c53afd0-a49e-4af6-a08d-181a6227a31e-operator-scripts\") pod \"keystone-db-create-8xnsx\" (UID: \"1c53afd0-a49e-4af6-a08d-181a6227a31e\") " pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.490079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rxcw\" (UniqueName: \"kubernetes.io/projected/1c53afd0-a49e-4af6-a08d-181a6227a31e-kube-api-access-4rxcw\") pod \"keystone-db-create-8xnsx\" (UID: \"1c53afd0-a49e-4af6-a08d-181a6227a31e\") " pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.490148 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6608e620-c6ed-439f-9a07-67eef576a390-operator-scripts\") pod \"placement-db-create-5kn5p\" (UID: \"6608e620-c6ed-439f-9a07-67eef576a390\") " pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.491371 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-operator-scripts\") pod \"keystone-168c-account-create-update-jq6gc\" (UID: \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\") " pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.511702 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qwxh\" (UniqueName: \"kubernetes.io/projected/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-kube-api-access-4qwxh\") pod \"keystone-168c-account-create-update-jq6gc\" (UID: \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\") " pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.522191 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e169-account-create-update-gcjpw"] Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.523489 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.528715 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.532445 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e169-account-create-update-gcjpw"] Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.592070 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rxcw\" (UniqueName: \"kubernetes.io/projected/1c53afd0-a49e-4af6-a08d-181a6227a31e-kube-api-access-4rxcw\") pod \"keystone-db-create-8xnsx\" (UID: \"1c53afd0-a49e-4af6-a08d-181a6227a31e\") " pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.592141 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6608e620-c6ed-439f-9a07-67eef576a390-operator-scripts\") pod \"placement-db-create-5kn5p\" (UID: \"6608e620-c6ed-439f-9a07-67eef576a390\") " pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.592197 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vtpc\" (UniqueName: \"kubernetes.io/projected/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-kube-api-access-8vtpc\") pod \"placement-e169-account-create-update-gcjpw\" (UID: \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\") " pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.592256 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx7v2\" (UniqueName: \"kubernetes.io/projected/6608e620-c6ed-439f-9a07-67eef576a390-kube-api-access-nx7v2\") pod \"placement-db-create-5kn5p\" (UID: 
\"6608e620-c6ed-439f-9a07-67eef576a390\") " pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.592317 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c53afd0-a49e-4af6-a08d-181a6227a31e-operator-scripts\") pod \"keystone-db-create-8xnsx\" (UID: \"1c53afd0-a49e-4af6-a08d-181a6227a31e\") " pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.592358 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-operator-scripts\") pod \"placement-e169-account-create-update-gcjpw\" (UID: \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\") " pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.593038 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c53afd0-a49e-4af6-a08d-181a6227a31e-operator-scripts\") pod \"keystone-db-create-8xnsx\" (UID: \"1c53afd0-a49e-4af6-a08d-181a6227a31e\") " pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.593089 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6608e620-c6ed-439f-9a07-67eef576a390-operator-scripts\") pod \"placement-db-create-5kn5p\" (UID: \"6608e620-c6ed-439f-9a07-67eef576a390\") " pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.608246 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rxcw\" (UniqueName: \"kubernetes.io/projected/1c53afd0-a49e-4af6-a08d-181a6227a31e-kube-api-access-4rxcw\") pod \"keystone-db-create-8xnsx\" (UID: 
\"1c53afd0-a49e-4af6-a08d-181a6227a31e\") " pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.613259 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx7v2\" (UniqueName: \"kubernetes.io/projected/6608e620-c6ed-439f-9a07-67eef576a390-kube-api-access-nx7v2\") pod \"placement-db-create-5kn5p\" (UID: \"6608e620-c6ed-439f-9a07-67eef576a390\") " pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.639305 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.661733 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.693778 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vtpc\" (UniqueName: \"kubernetes.io/projected/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-kube-api-access-8vtpc\") pod \"placement-e169-account-create-update-gcjpw\" (UID: \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\") " pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.693889 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-operator-scripts\") pod \"placement-e169-account-create-update-gcjpw\" (UID: \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\") " pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.694575 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-operator-scripts\") pod 
\"placement-e169-account-create-update-gcjpw\" (UID: \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\") " pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.710097 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vtpc\" (UniqueName: \"kubernetes.io/projected/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-kube-api-access-8vtpc\") pod \"placement-e169-account-create-update-gcjpw\" (UID: \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\") " pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.738743 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.782358 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="002ecc9e-4964-4a4e-873c-2ddaadebc6fc" path="/var/lib/kubelet/pods/002ecc9e-4964-4a4e-873c-2ddaadebc6fc/volumes" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.863274 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:35 crc kubenswrapper[4917]: I0318 07:05:35.931037 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-168c-account-create-update-jq6gc"] Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.058343 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.143747 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.188751 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-8xnsx"] Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.300726 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-5kn5p"] Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.386810 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e169-account-create-update-gcjpw"] Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.409190 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.461437 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-fh8ss"] Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.476721 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-168c-account-create-update-jq6gc" event={"ID":"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3","Type":"ContainerStarted","Data":"7087316d4bc5f762711dcb6e2fd729e279467784e67e4f93f5ce5dbba379f5dc"} Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.476764 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-168c-account-create-update-jq6gc" 
event={"ID":"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3","Type":"ContainerStarted","Data":"587bc9f47c3d8b591d25260943bdf95f14c762d349bb86540612bef6b11cdd7a"} Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.496685 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e169-account-create-update-gcjpw" event={"ID":"f24e450c-1619-42db-92b0-8f8aa6d9a1ab","Type":"ContainerStarted","Data":"2f9c80722813e74e208b9575be74a616bc2d6ef523e3ed7f00f5b18a41f65f33"} Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.502281 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-qhw5r"] Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.504297 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.512022 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5kn5p" event={"ID":"6608e620-c6ed-439f-9a07-67eef576a390","Type":"ContainerStarted","Data":"29c0491c0efc48ae34a748f477b748dc9bdc0aadf27a1d0104b78fa494fbcd8b"} Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.512487 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-168c-account-create-update-jq6gc" podStartSLOduration=1.51246027 podStartE2EDuration="1.51246027s" podCreationTimestamp="2026-03-18 07:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:05:36.50436522 +0000 UTC m=+1121.445519944" watchObservedRunningTime="2026-03-18 07:05:36.51246027 +0000 UTC m=+1121.453614984" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.525574 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8xnsx" 
event={"ID":"1c53afd0-a49e-4af6-a08d-181a6227a31e","Type":"ContainerStarted","Data":"ff0ef75abe5c87311980a1e7fa1390190d50b4bcfc403826e45436a106232403"} Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.527919 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" podUID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" containerName="dnsmasq-dns" containerID="cri-o://2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772" gracePeriod=10 Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.564685 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-qhw5r"] Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.602085 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-8xnsx" podStartSLOduration=1.6020663069999999 podStartE2EDuration="1.602066307s" podCreationTimestamp="2026-03-18 07:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:05:36.595060934 +0000 UTC m=+1121.536215648" watchObservedRunningTime="2026-03-18 07:05:36.602066307 +0000 UTC m=+1121.543221021" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.611675 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.611853 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-config\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " 
pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.611872 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.611908 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.611931 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km4g7\" (UniqueName: \"kubernetes.io/projected/939755c5-26d3-460b-af09-4b24f10fc9d0-kube-api-access-km4g7\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.717048 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.717331 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km4g7\" (UniqueName: \"kubernetes.io/projected/939755c5-26d3-460b-af09-4b24f10fc9d0-kube-api-access-km4g7\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: 
\"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.717381 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.717446 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-config\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.717465 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.718190 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-dns-svc\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.718237 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-sb\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc 
kubenswrapper[4917]: I0318 07:05:36.718432 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-nb\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.718766 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-config\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.737654 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km4g7\" (UniqueName: \"kubernetes.io/projected/939755c5-26d3-460b-af09-4b24f10fc9d0-kube-api-access-km4g7\") pod \"dnsmasq-dns-b4ddd5fb7-qhw5r\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:36 crc kubenswrapper[4917]: I0318 07:05:36.922795 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.012149 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.025181 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-dns-svc\") pod \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.027143 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-ovsdbserver-nb\") pod \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.027227 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-config\") pod \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.028140 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkpqh\" (UniqueName: \"kubernetes.io/projected/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-kube-api-access-nkpqh\") pod \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\" (UID: \"9a5b397e-e4e6-48ad-9d42-f1d852fb6740\") " Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.032761 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-kube-api-access-nkpqh" (OuterVolumeSpecName: "kube-api-access-nkpqh") pod "9a5b397e-e4e6-48ad-9d42-f1d852fb6740" (UID: "9a5b397e-e4e6-48ad-9d42-f1d852fb6740"). InnerVolumeSpecName "kube-api-access-nkpqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.079269 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9a5b397e-e4e6-48ad-9d42-f1d852fb6740" (UID: "9a5b397e-e4e6-48ad-9d42-f1d852fb6740"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.085095 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9a5b397e-e4e6-48ad-9d42-f1d852fb6740" (UID: "9a5b397e-e4e6-48ad-9d42-f1d852fb6740"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.089637 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-config" (OuterVolumeSpecName: "config") pod "9a5b397e-e4e6-48ad-9d42-f1d852fb6740" (UID: "9a5b397e-e4e6-48ad-9d42-f1d852fb6740"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.130595 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkpqh\" (UniqueName: \"kubernetes.io/projected/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-kube-api-access-nkpqh\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.130930 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.130940 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.130951 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a5b397e-e4e6-48ad-9d42-f1d852fb6740-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.458482 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-qhw5r"] Mar 18 07:05:37 crc kubenswrapper[4917]: W0318 07:05:37.468944 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod939755c5_26d3_460b_af09_4b24f10fc9d0.slice/crio-45855cb792b9089c2d031f42bbfc7c4b4b195851635dc1e2e442ecc793c69d7e WatchSource:0}: Error finding container 45855cb792b9089c2d031f42bbfc7c4b4b195851635dc1e2e442ecc793c69d7e: Status 404 returned error can't find the container with id 45855cb792b9089c2d031f42bbfc7c4b4b195851635dc1e2e442ecc793c69d7e Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.534932 4917 generic.go:334] "Generic (PLEG): container finished" podID="1c53afd0-a49e-4af6-a08d-181a6227a31e" 
containerID="b569ea190b5d3209285e816291e69308cddca6c166c4343e82c025cda3198375" exitCode=0 Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.535045 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8xnsx" event={"ID":"1c53afd0-a49e-4af6-a08d-181a6227a31e","Type":"ContainerDied","Data":"b569ea190b5d3209285e816291e69308cddca6c166c4343e82c025cda3198375"} Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.541158 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" event={"ID":"939755c5-26d3-460b-af09-4b24f10fc9d0","Type":"ContainerStarted","Data":"45855cb792b9089c2d031f42bbfc7c4b4b195851635dc1e2e442ecc793c69d7e"} Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.545247 4917 generic.go:334] "Generic (PLEG): container finished" podID="5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3" containerID="7087316d4bc5f762711dcb6e2fd729e279467784e67e4f93f5ce5dbba379f5dc" exitCode=0 Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.545319 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-168c-account-create-update-jq6gc" event={"ID":"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3","Type":"ContainerDied","Data":"7087316d4bc5f762711dcb6e2fd729e279467784e67e4f93f5ce5dbba379f5dc"} Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.548281 4917 generic.go:334] "Generic (PLEG): container finished" podID="f24e450c-1619-42db-92b0-8f8aa6d9a1ab" containerID="a6b7dc9e685c315ce6e8cbdd47d2f213c0a46ab7945b7ae81297a06be7d1900b" exitCode=0 Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.548335 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e169-account-create-update-gcjpw" event={"ID":"f24e450c-1619-42db-92b0-8f8aa6d9a1ab","Type":"ContainerDied","Data":"a6b7dc9e685c315ce6e8cbdd47d2f213c0a46ab7945b7ae81297a06be7d1900b"} Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.562758 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="6608e620-c6ed-439f-9a07-67eef576a390" containerID="804617347f439740e66792317b763d4d4061eab3918a6e260b4ba919d324b619" exitCode=0 Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.562867 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5kn5p" event={"ID":"6608e620-c6ed-439f-9a07-67eef576a390","Type":"ContainerDied","Data":"804617347f439740e66792317b763d4d4061eab3918a6e260b4ba919d324b619"} Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.568019 4917 generic.go:334] "Generic (PLEG): container finished" podID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" containerID="2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772" exitCode=0 Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.568108 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.568156 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" event={"ID":"9a5b397e-e4e6-48ad-9d42-f1d852fb6740","Type":"ContainerDied","Data":"2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772"} Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.568220 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84d7bcdf99-fh8ss" event={"ID":"9a5b397e-e4e6-48ad-9d42-f1d852fb6740","Type":"ContainerDied","Data":"47dd0786a4cb566d7d9ce913452e0dce9313d81acc2d12d8ea93bb6b702d31a1"} Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.568255 4917 scope.go:117] "RemoveContainer" containerID="2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.610099 4917 scope.go:117] "RemoveContainer" containerID="2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.621376 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/swift-storage-0"] Mar 18 07:05:37 crc kubenswrapper[4917]: E0318 07:05:37.621788 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" containerName="init" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.621802 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" containerName="init" Mar 18 07:05:37 crc kubenswrapper[4917]: E0318 07:05:37.621822 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" containerName="dnsmasq-dns" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.621830 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" containerName="dnsmasq-dns" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.622034 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" containerName="dnsmasq-dns" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.631422 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.633230 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.633663 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.633804 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.634055 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-r48wn" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.669646 4917 scope.go:117] "RemoveContainer" containerID="2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772" Mar 18 07:05:37 crc kubenswrapper[4917]: E0318 07:05:37.670018 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772\": container with ID starting with 2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772 not found: ID does not exist" containerID="2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.670057 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772"} err="failed to get container status \"2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772\": rpc error: code = NotFound desc = could not find container \"2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772\": container with ID starting with 2c3ebdfe16552ebbb26a18ccf3c05d9462f18436f93e4324920de666476d1772 not found: ID does not exist" Mar 18 07:05:37 crc kubenswrapper[4917]: 
I0318 07:05:37.670075 4917 scope.go:117] "RemoveContainer" containerID="2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc" Mar 18 07:05:37 crc kubenswrapper[4917]: E0318 07:05:37.673493 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc\": container with ID starting with 2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc not found: ID does not exist" containerID="2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.673532 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc"} err="failed to get container status \"2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc\": rpc error: code = NotFound desc = could not find container \"2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc\": container with ID starting with 2ab73fcfcaf4aee510313db6db21487a7461d85615707fc42a816e809115a0fc not found: ID does not exist" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.678226 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.683972 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-fh8ss"] Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.688612 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84d7bcdf99-fh8ss"] Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.758085 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") 
" pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.759114 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-lock\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.759139 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-cache\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.759208 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.759237 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385bce6-e9e7-4f8b-84be-9afb342f7134-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.759354 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg8sq\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-kube-api-access-lg8sq\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.790536 4917 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a5b397e-e4e6-48ad-9d42-f1d852fb6740" path="/var/lib/kubelet/pods/9a5b397e-e4e6-48ad-9d42-f1d852fb6740/volumes" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.861256 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-lock\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.861292 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-cache\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.861341 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.861371 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385bce6-e9e7-4f8b-84be-9afb342f7134-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.861431 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg8sq\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-kube-api-access-lg8sq\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:37 crc 
kubenswrapper[4917]: I0318 07:05:37.861499 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0"
Mar 18 07:05:37 crc kubenswrapper[4917]: E0318 07:05:37.861614 4917 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 18 07:05:37 crc kubenswrapper[4917]: E0318 07:05:37.861656 4917 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 18 07:05:37 crc kubenswrapper[4917]: E0318 07:05:37.861718 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift podName:7385bce6-e9e7-4f8b-84be-9afb342f7134 nodeName:}" failed. No retries permitted until 2026-03-18 07:05:38.361697183 +0000 UTC m=+1123.302851907 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift") pod "swift-storage-0" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134") : configmap "swift-ring-files" not found
Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.861769 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0"
Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.862636 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-lock\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0"
Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.862841 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-cache\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0"
Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.867132 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385bce6-e9e7-4f8b-84be-9afb342f7134-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0"
Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.880420 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg8sq\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-kube-api-access-lg8sq\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0"
Mar 18 07:05:37 crc kubenswrapper[4917]: I0318 07:05:37.886176 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.067145 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-bgqb9"]
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.069426 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.073004 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.076475 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.076836 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.127419 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-bgqb9"]
Mar 18 07:05:38 crc kubenswrapper[4917]: E0318 07:05:38.128065 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-n88c5 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-n88c5 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-bgqb9" podUID="c9d3cf91-9429-4802-9c92-8827b3c3535a"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.138213 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-wr662"]
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.153211 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.201644 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wr662"]
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.207699 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-bgqb9"]
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270141 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/80378da5-5294-4d7e-92c9-2aba37eb64a1-etc-swift\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270203 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-swiftconf\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270231 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-swiftconf\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270309 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-ring-data-devices\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270505 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c9d3cf91-9429-4802-9c92-8827b3c3535a-etc-swift\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270645 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-scripts\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270671 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-dispersionconf\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270703 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-scripts\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270806 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-dispersionconf\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270841 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88c5\" (UniqueName: \"kubernetes.io/projected/c9d3cf91-9429-4802-9c92-8827b3c3535a-kube-api-access-n88c5\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270864 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-ring-data-devices\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.270899 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ks94\" (UniqueName: \"kubernetes.io/projected/80378da5-5294-4d7e-92c9-2aba37eb64a1-kube-api-access-5ks94\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.271092 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-combined-ca-bundle\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.271197 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-combined-ca-bundle\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372174 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-ring-data-devices\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372224 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372253 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c9d3cf91-9429-4802-9c92-8827b3c3535a-etc-swift\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372286 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-scripts\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372302 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-dispersionconf\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372320 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-scripts\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372344 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-dispersionconf\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372364 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88c5\" (UniqueName: \"kubernetes.io/projected/c9d3cf91-9429-4802-9c92-8827b3c3535a-kube-api-access-n88c5\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372383 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-ring-data-devices\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372408 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ks94\" (UniqueName: \"kubernetes.io/projected/80378da5-5294-4d7e-92c9-2aba37eb64a1-kube-api-access-5ks94\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372437 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-combined-ca-bundle\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: E0318 07:05:38.372445 4917 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 18 07:05:38 crc kubenswrapper[4917]: E0318 07:05:38.372480 4917 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 18 07:05:38 crc kubenswrapper[4917]: E0318 07:05:38.372552 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift podName:7385bce6-e9e7-4f8b-84be-9afb342f7134 nodeName:}" failed. No retries permitted until 2026-03-18 07:05:39.372524332 +0000 UTC m=+1124.313679076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift") pod "swift-storage-0" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134") : configmap "swift-ring-files" not found
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372661 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c9d3cf91-9429-4802-9c92-8827b3c3535a-etc-swift\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.372455 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-combined-ca-bundle\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.373159 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/80378da5-5294-4d7e-92c9-2aba37eb64a1-etc-swift\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.373680 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-scripts\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.373854 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-scripts\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.373950 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/80378da5-5294-4d7e-92c9-2aba37eb64a1-etc-swift\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.373212 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-swiftconf\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.374088 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-ring-data-devices\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.374120 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-swiftconf\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.374366 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-ring-data-devices\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.376746 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-dispersionconf\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.376807 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-swiftconf\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.377525 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-dispersionconf\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.389233 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-combined-ca-bundle\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.389526 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-combined-ca-bundle\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.389691 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ks94\" (UniqueName: \"kubernetes.io/projected/80378da5-5294-4d7e-92c9-2aba37eb64a1-kube-api-access-5ks94\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.398814 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-swiftconf\") pod \"swift-ring-rebalance-wr662\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.403519 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88c5\" (UniqueName: \"kubernetes.io/projected/c9d3cf91-9429-4802-9c92-8827b3c3535a-kube-api-access-n88c5\") pod \"swift-ring-rebalance-bgqb9\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") " pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.507311 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-wr662"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.592362 4917 generic.go:334] "Generic (PLEG): container finished" podID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerID="05650b436727ef2eb235496e69080df30e3464cd282dff22a92af0c2f1a65404" exitCode=0
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.592464 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.592503 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" event={"ID":"939755c5-26d3-460b-af09-4b24f10fc9d0","Type":"ContainerDied","Data":"05650b436727ef2eb235496e69080df30e3464cd282dff22a92af0c2f1a65404"}
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.640495 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bgqb9"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.687673 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88c5\" (UniqueName: \"kubernetes.io/projected/c9d3cf91-9429-4802-9c92-8827b3c3535a-kube-api-access-n88c5\") pod \"c9d3cf91-9429-4802-9c92-8827b3c3535a\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.687755 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-dispersionconf\") pod \"c9d3cf91-9429-4802-9c92-8827b3c3535a\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.687776 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-swiftconf\") pod \"c9d3cf91-9429-4802-9c92-8827b3c3535a\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.687795 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c9d3cf91-9429-4802-9c92-8827b3c3535a-etc-swift\") pod \"c9d3cf91-9429-4802-9c92-8827b3c3535a\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.687819 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-combined-ca-bundle\") pod \"c9d3cf91-9429-4802-9c92-8827b3c3535a\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.688239 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-scripts\") pod \"c9d3cf91-9429-4802-9c92-8827b3c3535a\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.688274 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-ring-data-devices\") pod \"c9d3cf91-9429-4802-9c92-8827b3c3535a\" (UID: \"c9d3cf91-9429-4802-9c92-8827b3c3535a\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.688554 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d3cf91-9429-4802-9c92-8827b3c3535a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c9d3cf91-9429-4802-9c92-8827b3c3535a" (UID: "c9d3cf91-9429-4802-9c92-8827b3c3535a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.688847 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-scripts" (OuterVolumeSpecName: "scripts") pod "c9d3cf91-9429-4802-9c92-8827b3c3535a" (UID: "c9d3cf91-9429-4802-9c92-8827b3c3535a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.689099 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c9d3cf91-9429-4802-9c92-8827b3c3535a" (UID: "c9d3cf91-9429-4802-9c92-8827b3c3535a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.689386 4917 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c9d3cf91-9429-4802-9c92-8827b3c3535a-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.689403 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.689411 4917 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c9d3cf91-9429-4802-9c92-8827b3c3535a-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.692344 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c9d3cf91-9429-4802-9c92-8827b3c3535a" (UID: "c9d3cf91-9429-4802-9c92-8827b3c3535a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.692386 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c9d3cf91-9429-4802-9c92-8827b3c3535a" (UID: "c9d3cf91-9429-4802-9c92-8827b3c3535a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.694224 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d3cf91-9429-4802-9c92-8827b3c3535a-kube-api-access-n88c5" (OuterVolumeSpecName: "kube-api-access-n88c5") pod "c9d3cf91-9429-4802-9c92-8827b3c3535a" (UID: "c9d3cf91-9429-4802-9c92-8827b3c3535a"). InnerVolumeSpecName "kube-api-access-n88c5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.697783 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9d3cf91-9429-4802-9c92-8827b3c3535a" (UID: "c9d3cf91-9429-4802-9c92-8827b3c3535a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.792070 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88c5\" (UniqueName: \"kubernetes.io/projected/c9d3cf91-9429-4802-9c92-8827b3c3535a-kube-api-access-n88c5\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.792384 4917 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.792402 4917 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.792413 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9d3cf91-9429-4802-9c92-8827b3c3535a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.930768 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-5kn5p"
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.995471 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx7v2\" (UniqueName: \"kubernetes.io/projected/6608e620-c6ed-439f-9a07-67eef576a390-kube-api-access-nx7v2\") pod \"6608e620-c6ed-439f-9a07-67eef576a390\" (UID: \"6608e620-c6ed-439f-9a07-67eef576a390\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.995605 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6608e620-c6ed-439f-9a07-67eef576a390-operator-scripts\") pod \"6608e620-c6ed-439f-9a07-67eef576a390\" (UID: \"6608e620-c6ed-439f-9a07-67eef576a390\") "
Mar 18 07:05:38 crc kubenswrapper[4917]: I0318 07:05:38.996404 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6608e620-c6ed-439f-9a07-67eef576a390-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6608e620-c6ed-439f-9a07-67eef576a390" (UID: "6608e620-c6ed-439f-9a07-67eef576a390"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.002053 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6608e620-c6ed-439f-9a07-67eef576a390-kube-api-access-nx7v2" (OuterVolumeSpecName: "kube-api-access-nx7v2") pod "6608e620-c6ed-439f-9a07-67eef576a390" (UID: "6608e620-c6ed-439f-9a07-67eef576a390"). InnerVolumeSpecName "kube-api-access-nx7v2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.071577 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-8xnsx"
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.088929 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-168c-account-create-update-jq6gc"
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.096784 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c53afd0-a49e-4af6-a08d-181a6227a31e-operator-scripts\") pod \"1c53afd0-a49e-4af6-a08d-181a6227a31e\" (UID: \"1c53afd0-a49e-4af6-a08d-181a6227a31e\") "
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.097041 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rxcw\" (UniqueName: \"kubernetes.io/projected/1c53afd0-a49e-4af6-a08d-181a6227a31e-kube-api-access-4rxcw\") pod \"1c53afd0-a49e-4af6-a08d-181a6227a31e\" (UID: \"1c53afd0-a49e-4af6-a08d-181a6227a31e\") "
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.097361 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6608e620-c6ed-439f-9a07-67eef576a390-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.097372 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx7v2\" (UniqueName: \"kubernetes.io/projected/6608e620-c6ed-439f-9a07-67eef576a390-kube-api-access-nx7v2\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.107393 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c53afd0-a49e-4af6-a08d-181a6227a31e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1c53afd0-a49e-4af6-a08d-181a6227a31e" (UID: "1c53afd0-a49e-4af6-a08d-181a6227a31e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.112931 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c53afd0-a49e-4af6-a08d-181a6227a31e-kube-api-access-4rxcw" (OuterVolumeSpecName: "kube-api-access-4rxcw") pod "1c53afd0-a49e-4af6-a08d-181a6227a31e" (UID: "1c53afd0-a49e-4af6-a08d-181a6227a31e"). InnerVolumeSpecName "kube-api-access-4rxcw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.136250 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e169-account-create-update-gcjpw"
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.198822 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vtpc\" (UniqueName: \"kubernetes.io/projected/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-kube-api-access-8vtpc\") pod \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\" (UID: \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\") "
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.199043 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-operator-scripts\") pod \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\" (UID: \"f24e450c-1619-42db-92b0-8f8aa6d9a1ab\") "
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.199128 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-operator-scripts\") pod \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\" (UID: \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\") "
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.200516 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qwxh\" (UniqueName: \"kubernetes.io/projected/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-kube-api-access-4qwxh\") pod \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\" (UID: \"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3\") "
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.201098 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rxcw\" (UniqueName: \"kubernetes.io/projected/1c53afd0-a49e-4af6-a08d-181a6227a31e-kube-api-access-4rxcw\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.201118 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1c53afd0-a49e-4af6-a08d-181a6227a31e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.201256 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-wr662"]
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.201564 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3" (UID: "5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.201562 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f24e450c-1619-42db-92b0-8f8aa6d9a1ab" (UID: "f24e450c-1619-42db-92b0-8f8aa6d9a1ab"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.204125 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-kube-api-access-8vtpc" (OuterVolumeSpecName: "kube-api-access-8vtpc") pod "f24e450c-1619-42db-92b0-8f8aa6d9a1ab" (UID: "f24e450c-1619-42db-92b0-8f8aa6d9a1ab"). InnerVolumeSpecName "kube-api-access-8vtpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.204563 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-kube-api-access-4qwxh" (OuterVolumeSpecName: "kube-api-access-4qwxh") pod "5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3" (UID: "5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3"). InnerVolumeSpecName "kube-api-access-4qwxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:39 crc kubenswrapper[4917]: W0318 07:05:39.214506 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80378da5_5294_4d7e_92c9_2aba37eb64a1.slice/crio-fa99d5b1858e806fdea7ea649797ebe557cb09548a775e8393a5746a2c62cd5c WatchSource:0}: Error finding container fa99d5b1858e806fdea7ea649797ebe557cb09548a775e8393a5746a2c62cd5c: Status 404 returned error can't find the container with id fa99d5b1858e806fdea7ea649797ebe557cb09548a775e8393a5746a2c62cd5c Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.302188 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.302219 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.302229 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qwxh\" (UniqueName: \"kubernetes.io/projected/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3-kube-api-access-4qwxh\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.302238 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vtpc\" (UniqueName: \"kubernetes.io/projected/f24e450c-1619-42db-92b0-8f8aa6d9a1ab-kube-api-access-8vtpc\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.404298 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:39 crc kubenswrapper[4917]: E0318 07:05:39.404508 4917 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 07:05:39 crc kubenswrapper[4917]: E0318 07:05:39.404748 4917 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 07:05:39 crc kubenswrapper[4917]: E0318 07:05:39.404811 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift podName:7385bce6-e9e7-4f8b-84be-9afb342f7134 nodeName:}" failed. No retries permitted until 2026-03-18 07:05:41.404794174 +0000 UTC m=+1126.345948888 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift") pod "swift-storage-0" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134") : configmap "swift-ring-files" not found Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.601849 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wr662" event={"ID":"80378da5-5294-4d7e-92c9-2aba37eb64a1","Type":"ContainerStarted","Data":"fa99d5b1858e806fdea7ea649797ebe557cb09548a775e8393a5746a2c62cd5c"} Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.604310 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-8xnsx" event={"ID":"1c53afd0-a49e-4af6-a08d-181a6227a31e","Type":"ContainerDied","Data":"ff0ef75abe5c87311980a1e7fa1390190d50b4bcfc403826e45436a106232403"} Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.604360 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff0ef75abe5c87311980a1e7fa1390190d50b4bcfc403826e45436a106232403" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.604332 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-8xnsx" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.606544 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" event={"ID":"939755c5-26d3-460b-af09-4b24f10fc9d0","Type":"ContainerStarted","Data":"977fe887b50dce97fe001d1efee118fc531726fa077498324d1f7a6b367dce07"} Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.606745 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.609146 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-168c-account-create-update-jq6gc" event={"ID":"5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3","Type":"ContainerDied","Data":"587bc9f47c3d8b591d25260943bdf95f14c762d349bb86540612bef6b11cdd7a"} Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.609195 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="587bc9f47c3d8b591d25260943bdf95f14c762d349bb86540612bef6b11cdd7a" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.609260 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-168c-account-create-update-jq6gc" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.613115 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e169-account-create-update-gcjpw" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.613787 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e169-account-create-update-gcjpw" event={"ID":"f24e450c-1619-42db-92b0-8f8aa6d9a1ab","Type":"ContainerDied","Data":"2f9c80722813e74e208b9575be74a616bc2d6ef523e3ed7f00f5b18a41f65f33"} Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.613816 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f9c80722813e74e208b9575be74a616bc2d6ef523e3ed7f00f5b18a41f65f33" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.615444 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-bgqb9" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.615511 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-5kn5p" event={"ID":"6608e620-c6ed-439f-9a07-67eef576a390","Type":"ContainerDied","Data":"29c0491c0efc48ae34a748f477b748dc9bdc0aadf27a1d0104b78fa494fbcd8b"} Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.615553 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29c0491c0efc48ae34a748f477b748dc9bdc0aadf27a1d0104b78fa494fbcd8b" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.615669 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-5kn5p" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.629609 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" podStartSLOduration=3.629592646 podStartE2EDuration="3.629592646s" podCreationTimestamp="2026-03-18 07:05:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:05:39.626749295 +0000 UTC m=+1124.567904079" watchObservedRunningTime="2026-03-18 07:05:39.629592646 +0000 UTC m=+1124.570747350" Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.720939 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-bgqb9"] Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.728925 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-bgqb9"] Mar 18 07:05:39 crc kubenswrapper[4917]: I0318 07:05:39.793074 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d3cf91-9429-4802-9c92-8827b3c3535a" path="/var/lib/kubelet/pods/c9d3cf91-9429-4802-9c92-8827b3c3535a/volumes" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.115168 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-flpk2"] Mar 18 07:05:41 crc kubenswrapper[4917]: E0318 07:05:41.115786 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c53afd0-a49e-4af6-a08d-181a6227a31e" containerName="mariadb-database-create" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.115798 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c53afd0-a49e-4af6-a08d-181a6227a31e" containerName="mariadb-database-create" Mar 18 07:05:41 crc kubenswrapper[4917]: E0318 07:05:41.115831 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6608e620-c6ed-439f-9a07-67eef576a390" 
containerName="mariadb-database-create" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.115837 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6608e620-c6ed-439f-9a07-67eef576a390" containerName="mariadb-database-create" Mar 18 07:05:41 crc kubenswrapper[4917]: E0318 07:05:41.115844 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f24e450c-1619-42db-92b0-8f8aa6d9a1ab" containerName="mariadb-account-create-update" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.115849 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f24e450c-1619-42db-92b0-8f8aa6d9a1ab" containerName="mariadb-account-create-update" Mar 18 07:05:41 crc kubenswrapper[4917]: E0318 07:05:41.115857 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3" containerName="mariadb-account-create-update" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.115862 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3" containerName="mariadb-account-create-update" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.116010 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f24e450c-1619-42db-92b0-8f8aa6d9a1ab" containerName="mariadb-account-create-update" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.116024 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c53afd0-a49e-4af6-a08d-181a6227a31e" containerName="mariadb-database-create" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.116033 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3" containerName="mariadb-account-create-update" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.116044 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6608e620-c6ed-439f-9a07-67eef576a390" containerName="mariadb-database-create" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 
07:05:41.116551 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.119499 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.137292 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-flpk2"] Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.138156 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbrgs\" (UniqueName: \"kubernetes.io/projected/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-kube-api-access-sbrgs\") pod \"root-account-create-update-flpk2\" (UID: \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\") " pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.138519 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-operator-scripts\") pod \"root-account-create-update-flpk2\" (UID: \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\") " pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.240461 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-operator-scripts\") pod \"root-account-create-update-flpk2\" (UID: \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\") " pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.240620 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbrgs\" (UniqueName: 
\"kubernetes.io/projected/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-kube-api-access-sbrgs\") pod \"root-account-create-update-flpk2\" (UID: \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\") " pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.243382 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-operator-scripts\") pod \"root-account-create-update-flpk2\" (UID: \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\") " pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.260872 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbrgs\" (UniqueName: \"kubernetes.io/projected/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-kube-api-access-sbrgs\") pod \"root-account-create-update-flpk2\" (UID: \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\") " pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.445801 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:41 crc kubenswrapper[4917]: E0318 07:05:41.446078 4917 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 07:05:41 crc kubenswrapper[4917]: E0318 07:05:41.446145 4917 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 07:05:41 crc kubenswrapper[4917]: E0318 07:05:41.446258 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift 
podName:7385bce6-e9e7-4f8b-84be-9afb342f7134 nodeName:}" failed. No retries permitted until 2026-03-18 07:05:45.446224214 +0000 UTC m=+1130.387378968 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift") pod "swift-storage-0" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134") : configmap "swift-ring-files" not found Mar 18 07:05:41 crc kubenswrapper[4917]: I0318 07:05:41.468340 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:42 crc kubenswrapper[4917]: I0318 07:05:42.531809 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:42 crc kubenswrapper[4917]: I0318 07:05:42.932734 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-flpk2"] Mar 18 07:05:42 crc kubenswrapper[4917]: W0318 07:05:42.935780 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod294f43c9_010d_4de1_8cdd_8d10f29d0e6f.slice/crio-76056c0445c1a6d361ca284cef9196dba72820a746a90c5fee2655092ed7c8e9 WatchSource:0}: Error finding container 76056c0445c1a6d361ca284cef9196dba72820a746a90c5fee2655092ed7c8e9: Status 404 returned error can't find the container with id 76056c0445c1a6d361ca284cef9196dba72820a746a90c5fee2655092ed7c8e9 Mar 18 07:05:43 crc kubenswrapper[4917]: I0318 07:05:43.658422 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-flpk2" event={"ID":"294f43c9-010d-4de1-8cdd-8d10f29d0e6f","Type":"ContainerStarted","Data":"a5c8dcc8b00fb61a998e2de57eda4651c17c337a9d22a5003f5ddff672299ec8"} Mar 18 07:05:43 crc kubenswrapper[4917]: I0318 07:05:43.658773 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-flpk2" 
event={"ID":"294f43c9-010d-4de1-8cdd-8d10f29d0e6f","Type":"ContainerStarted","Data":"76056c0445c1a6d361ca284cef9196dba72820a746a90c5fee2655092ed7c8e9"} Mar 18 07:05:43 crc kubenswrapper[4917]: I0318 07:05:43.660716 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wr662" event={"ID":"80378da5-5294-4d7e-92c9-2aba37eb64a1","Type":"ContainerStarted","Data":"229ed9b75a6c6ab364d819b8ba387d93d101381df8c8e058d0975c3e44b2ee17"} Mar 18 07:05:43 crc kubenswrapper[4917]: I0318 07:05:43.684746 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-flpk2" podStartSLOduration=2.684720079 podStartE2EDuration="2.684720079s" podCreationTimestamp="2026-03-18 07:05:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:05:43.677081511 +0000 UTC m=+1128.618236235" watchObservedRunningTime="2026-03-18 07:05:43.684720079 +0000 UTC m=+1128.625874803" Mar 18 07:05:43 crc kubenswrapper[4917]: I0318 07:05:43.702005 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-wr662" podStartSLOduration=2.393268522 podStartE2EDuration="5.701979387s" podCreationTimestamp="2026-03-18 07:05:38 +0000 UTC" firstStartedPulling="2026-03-18 07:05:39.216637718 +0000 UTC m=+1124.157792432" lastFinishedPulling="2026-03-18 07:05:42.525348593 +0000 UTC m=+1127.466503297" observedRunningTime="2026-03-18 07:05:43.694760898 +0000 UTC m=+1128.635915652" watchObservedRunningTime="2026-03-18 07:05:43.701979387 +0000 UTC m=+1128.643134091" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.353790 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9pm58"] Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.355029 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9pm58" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.368911 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9pm58"] Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.468988 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b7da-account-create-update-v97z6"] Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.470336 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.472922 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.481107 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b7da-account-create-update-v97z6"] Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.508982 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-operator-scripts\") pod \"glance-b7da-account-create-update-v97z6\" (UID: \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\") " pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.509044 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38903d42-d383-4a16-99a9-a252e9238bb2-operator-scripts\") pod \"glance-db-create-9pm58\" (UID: \"38903d42-d383-4a16-99a9-a252e9238bb2\") " pod="openstack/glance-db-create-9pm58" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.509081 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8cd6\" (UniqueName: 
\"kubernetes.io/projected/38903d42-d383-4a16-99a9-a252e9238bb2-kube-api-access-c8cd6\") pod \"glance-db-create-9pm58\" (UID: \"38903d42-d383-4a16-99a9-a252e9238bb2\") " pod="openstack/glance-db-create-9pm58" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.509101 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkd5c\" (UniqueName: \"kubernetes.io/projected/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-kube-api-access-qkd5c\") pod \"glance-b7da-account-create-update-v97z6\" (UID: \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\") " pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.610325 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38903d42-d383-4a16-99a9-a252e9238bb2-operator-scripts\") pod \"glance-db-create-9pm58\" (UID: \"38903d42-d383-4a16-99a9-a252e9238bb2\") " pod="openstack/glance-db-create-9pm58" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.610424 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8cd6\" (UniqueName: \"kubernetes.io/projected/38903d42-d383-4a16-99a9-a252e9238bb2-kube-api-access-c8cd6\") pod \"glance-db-create-9pm58\" (UID: \"38903d42-d383-4a16-99a9-a252e9238bb2\") " pod="openstack/glance-db-create-9pm58" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.610462 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkd5c\" (UniqueName: \"kubernetes.io/projected/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-kube-api-access-qkd5c\") pod \"glance-b7da-account-create-update-v97z6\" (UID: \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\") " pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.610678 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-operator-scripts\") pod \"glance-b7da-account-create-update-v97z6\" (UID: \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\") " pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.611892 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-operator-scripts\") pod \"glance-b7da-account-create-update-v97z6\" (UID: \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\") " pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.612042 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38903d42-d383-4a16-99a9-a252e9238bb2-operator-scripts\") pod \"glance-db-create-9pm58\" (UID: \"38903d42-d383-4a16-99a9-a252e9238bb2\") " pod="openstack/glance-db-create-9pm58" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.630644 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkd5c\" (UniqueName: \"kubernetes.io/projected/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-kube-api-access-qkd5c\") pod \"glance-b7da-account-create-update-v97z6\" (UID: \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\") " pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.634701 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8cd6\" (UniqueName: \"kubernetes.io/projected/38903d42-d383-4a16-99a9-a252e9238bb2-kube-api-access-c8cd6\") pod \"glance-db-create-9pm58\" (UID: \"38903d42-d383-4a16-99a9-a252e9238bb2\") " pod="openstack/glance-db-create-9pm58" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.676207 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="294f43c9-010d-4de1-8cdd-8d10f29d0e6f" containerID="a5c8dcc8b00fb61a998e2de57eda4651c17c337a9d22a5003f5ddff672299ec8" exitCode=0 Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.676751 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-flpk2" event={"ID":"294f43c9-010d-4de1-8cdd-8d10f29d0e6f","Type":"ContainerDied","Data":"a5c8dcc8b00fb61a998e2de57eda4651c17c337a9d22a5003f5ddff672299ec8"} Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.685125 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9pm58" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.837003 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:44 crc kubenswrapper[4917]: I0318 07:05:44.974386 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9pm58"] Mar 18 07:05:44 crc kubenswrapper[4917]: W0318 07:05:44.980553 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38903d42_d383_4a16_99a9_a252e9238bb2.slice/crio-173717e46cf7df4dfdf5bf822b29b35035dc496c080a583cfb5c044670ffdcae WatchSource:0}: Error finding container 173717e46cf7df4dfdf5bf822b29b35035dc496c080a583cfb5c044670ffdcae: Status 404 returned error can't find the container with id 173717e46cf7df4dfdf5bf822b29b35035dc496c080a583cfb5c044670ffdcae Mar 18 07:05:45 crc kubenswrapper[4917]: I0318 07:05:45.300332 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b7da-account-create-update-v97z6"] Mar 18 07:05:45 crc kubenswrapper[4917]: I0318 07:05:45.525551 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"swift-storage-0\" (UID: 
\"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:45 crc kubenswrapper[4917]: E0318 07:05:45.525804 4917 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 18 07:05:45 crc kubenswrapper[4917]: E0318 07:05:45.526125 4917 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 18 07:05:45 crc kubenswrapper[4917]: E0318 07:05:45.526213 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift podName:7385bce6-e9e7-4f8b-84be-9afb342f7134 nodeName:}" failed. No retries permitted until 2026-03-18 07:05:53.526184053 +0000 UTC m=+1138.467338807 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift") pod "swift-storage-0" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134") : configmap "swift-ring-files" not found Mar 18 07:05:45 crc kubenswrapper[4917]: I0318 07:05:45.705382 4917 generic.go:334] "Generic (PLEG): container finished" podID="926d49c5-2eb9-4bd6-8a38-31a1d02e6b47" containerID="0affa86d9ad40088e59a3f25bf18df604583664b66c3e9037725dd7d8fb8047d" exitCode=0 Mar 18 07:05:45 crc kubenswrapper[4917]: I0318 07:05:45.706036 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b7da-account-create-update-v97z6" event={"ID":"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47","Type":"ContainerDied","Data":"0affa86d9ad40088e59a3f25bf18df604583664b66c3e9037725dd7d8fb8047d"} Mar 18 07:05:45 crc kubenswrapper[4917]: I0318 07:05:45.706074 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b7da-account-create-update-v97z6" event={"ID":"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47","Type":"ContainerStarted","Data":"d4a54b2e9cd0b10e01e1cfb54e5815675305959a2a90538778d8184fbf8db478"} Mar 
18 07:05:45 crc kubenswrapper[4917]: I0318 07:05:45.711086 4917 generic.go:334] "Generic (PLEG): container finished" podID="38903d42-d383-4a16-99a9-a252e9238bb2" containerID="a6b74816997731ee88b4c0a3507aa5ccade9c70b7ede35bfa632b3f05414456c" exitCode=0 Mar 18 07:05:45 crc kubenswrapper[4917]: I0318 07:05:45.711274 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9pm58" event={"ID":"38903d42-d383-4a16-99a9-a252e9238bb2","Type":"ContainerDied","Data":"a6b74816997731ee88b4c0a3507aa5ccade9c70b7ede35bfa632b3f05414456c"} Mar 18 07:05:45 crc kubenswrapper[4917]: I0318 07:05:45.711344 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9pm58" event={"ID":"38903d42-d383-4a16-99a9-a252e9238bb2","Type":"ContainerStarted","Data":"173717e46cf7df4dfdf5bf822b29b35035dc496c080a583cfb5c044670ffdcae"} Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.085347 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.240617 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-operator-scripts\") pod \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\" (UID: \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\") " Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.241030 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbrgs\" (UniqueName: \"kubernetes.io/projected/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-kube-api-access-sbrgs\") pod \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\" (UID: \"294f43c9-010d-4de1-8cdd-8d10f29d0e6f\") " Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.242031 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "294f43c9-010d-4de1-8cdd-8d10f29d0e6f" (UID: "294f43c9-010d-4de1-8cdd-8d10f29d0e6f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.247802 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-kube-api-access-sbrgs" (OuterVolumeSpecName: "kube-api-access-sbrgs") pod "294f43c9-010d-4de1-8cdd-8d10f29d0e6f" (UID: "294f43c9-010d-4de1-8cdd-8d10f29d0e6f"). InnerVolumeSpecName "kube-api-access-sbrgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.344086 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbrgs\" (UniqueName: \"kubernetes.io/projected/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-kube-api-access-sbrgs\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.344148 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/294f43c9-010d-4de1-8cdd-8d10f29d0e6f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.727119 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-flpk2" event={"ID":"294f43c9-010d-4de1-8cdd-8d10f29d0e6f","Type":"ContainerDied","Data":"76056c0445c1a6d361ca284cef9196dba72820a746a90c5fee2655092ed7c8e9"} Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.730029 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76056c0445c1a6d361ca284cef9196dba72820a746a90c5fee2655092ed7c8e9" Mar 18 07:05:46 crc kubenswrapper[4917]: I0318 07:05:46.727381 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-flpk2" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.014765 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.081759 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-zdmcf"] Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.081998 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" podUID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerName="dnsmasq-dns" containerID="cri-o://818c39450c69a154f30388b122f1a3401360ee3bb97f0b2a226c8d7926d3b096" gracePeriod=10 Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.261940 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.267071 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-operator-scripts\") pod \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\" (UID: \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\") " Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.267558 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkd5c\" (UniqueName: \"kubernetes.io/projected/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-kube-api-access-qkd5c\") pod \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\" (UID: \"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47\") " Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.267634 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47" (UID: "926d49c5-2eb9-4bd6-8a38-31a1d02e6b47"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.268654 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.269772 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9pm58" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.272725 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-kube-api-access-qkd5c" (OuterVolumeSpecName: "kube-api-access-qkd5c") pod "926d49c5-2eb9-4bd6-8a38-31a1d02e6b47" (UID: "926d49c5-2eb9-4bd6-8a38-31a1d02e6b47"). InnerVolumeSpecName "kube-api-access-qkd5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.370438 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8cd6\" (UniqueName: \"kubernetes.io/projected/38903d42-d383-4a16-99a9-a252e9238bb2-kube-api-access-c8cd6\") pod \"38903d42-d383-4a16-99a9-a252e9238bb2\" (UID: \"38903d42-d383-4a16-99a9-a252e9238bb2\") " Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.370553 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38903d42-d383-4a16-99a9-a252e9238bb2-operator-scripts\") pod \"38903d42-d383-4a16-99a9-a252e9238bb2\" (UID: \"38903d42-d383-4a16-99a9-a252e9238bb2\") " Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.370973 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38903d42-d383-4a16-99a9-a252e9238bb2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38903d42-d383-4a16-99a9-a252e9238bb2" (UID: "38903d42-d383-4a16-99a9-a252e9238bb2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.371050 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkd5c\" (UniqueName: \"kubernetes.io/projected/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47-kube-api-access-qkd5c\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.473036 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38903d42-d383-4a16-99a9-a252e9238bb2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.527436 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" podUID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.739000 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b7da-account-create-update-v97z6" event={"ID":"926d49c5-2eb9-4bd6-8a38-31a1d02e6b47","Type":"ContainerDied","Data":"d4a54b2e9cd0b10e01e1cfb54e5815675305959a2a90538778d8184fbf8db478"} Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.739038 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4a54b2e9cd0b10e01e1cfb54e5815675305959a2a90538778d8184fbf8db478" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.739104 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b7da-account-create-update-v97z6" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.742125 4917 generic.go:334] "Generic (PLEG): container finished" podID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerID="818c39450c69a154f30388b122f1a3401360ee3bb97f0b2a226c8d7926d3b096" exitCode=0 Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.742178 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" event={"ID":"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49","Type":"ContainerDied","Data":"818c39450c69a154f30388b122f1a3401360ee3bb97f0b2a226c8d7926d3b096"} Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.744064 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9pm58" event={"ID":"38903d42-d383-4a16-99a9-a252e9238bb2","Type":"ContainerDied","Data":"173717e46cf7df4dfdf5bf822b29b35035dc496c080a583cfb5c044670ffdcae"} Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.744084 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="173717e46cf7df4dfdf5bf822b29b35035dc496c080a583cfb5c044670ffdcae" Mar 18 07:05:47 crc kubenswrapper[4917]: I0318 07:05:47.744137 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9pm58" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.275783 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38903d42-d383-4a16-99a9-a252e9238bb2-kube-api-access-c8cd6" (OuterVolumeSpecName: "kube-api-access-c8cd6") pod "38903d42-d383-4a16-99a9-a252e9238bb2" (UID: "38903d42-d383-4a16-99a9-a252e9238bb2"). InnerVolumeSpecName "kube-api-access-c8cd6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.300125 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8cd6\" (UniqueName: \"kubernetes.io/projected/38903d42-d383-4a16-99a9-a252e9238bb2-kube-api-access-c8cd6\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.423633 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-flpk2"] Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.433743 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-flpk2"] Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.735229 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.773316 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" event={"ID":"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49","Type":"ContainerDied","Data":"3356b42123e7da022ebba157b27fbe792d1ed01118951d9e50fa6d3cab4dff8c"} Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.773354 4917 scope.go:117] "RemoveContainer" containerID="818c39450c69a154f30388b122f1a3401360ee3bb97f0b2a226c8d7926d3b096" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.773488 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f697c8bff-zdmcf" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.822646 4917 scope.go:117] "RemoveContainer" containerID="4a4b407d0996ca15e5daedfecffb0771f7f4f289b45da6dfb3ded6941b45af61" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.914502 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-dns-svc\") pod \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.914571 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-sb\") pod \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.914626 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xl7q\" (UniqueName: \"kubernetes.io/projected/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-kube-api-access-2xl7q\") pod \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.914688 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-config\") pod \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") " Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.914750 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-nb\") pod \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\" (UID: \"aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49\") 
" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.919519 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-kube-api-access-2xl7q" (OuterVolumeSpecName: "kube-api-access-2xl7q") pod "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" (UID: "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49"). InnerVolumeSpecName "kube-api-access-2xl7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.951151 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-config" (OuterVolumeSpecName: "config") pod "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" (UID: "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.954420 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" (UID: "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.955614 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" (UID: "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:48 crc kubenswrapper[4917]: I0318 07:05:48.959794 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" (UID: "aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:49 crc kubenswrapper[4917]: I0318 07:05:49.016611 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:49 crc kubenswrapper[4917]: I0318 07:05:49.016871 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:49 crc kubenswrapper[4917]: I0318 07:05:49.016966 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:49 crc kubenswrapper[4917]: I0318 07:05:49.017030 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:49 crc kubenswrapper[4917]: I0318 07:05:49.017086 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xl7q\" (UniqueName: \"kubernetes.io/projected/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49-kube-api-access-2xl7q\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:49 crc kubenswrapper[4917]: I0318 07:05:49.117841 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-zdmcf"] Mar 18 07:05:49 crc 
kubenswrapper[4917]: I0318 07:05:49.123264 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f697c8bff-zdmcf"] Mar 18 07:05:49 crc kubenswrapper[4917]: I0318 07:05:49.788530 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="294f43c9-010d-4de1-8cdd-8d10f29d0e6f" path="/var/lib/kubelet/pods/294f43c9-010d-4de1-8cdd-8d10f29d0e6f/volumes" Mar 18 07:05:49 crc kubenswrapper[4917]: I0318 07:05:49.790306 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" path="/var/lib/kubelet/pods/aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49/volumes" Mar 18 07:05:50 crc kubenswrapper[4917]: I0318 07:05:50.798166 4917 generic.go:334] "Generic (PLEG): container finished" podID="80378da5-5294-4d7e-92c9-2aba37eb64a1" containerID="229ed9b75a6c6ab364d819b8ba387d93d101381df8c8e058d0975c3e44b2ee17" exitCode=0 Mar 18 07:05:50 crc kubenswrapper[4917]: I0318 07:05:50.798216 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wr662" event={"ID":"80378da5-5294-4d7e-92c9-2aba37eb64a1","Type":"ContainerDied","Data":"229ed9b75a6c6ab364d819b8ba387d93d101381df8c8e058d0975c3e44b2ee17"} Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.192235 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tp4f7"] Mar 18 07:05:51 crc kubenswrapper[4917]: E0318 07:05:51.192839 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerName="dnsmasq-dns" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.192869 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerName="dnsmasq-dns" Mar 18 07:05:51 crc kubenswrapper[4917]: E0318 07:05:51.192905 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926d49c5-2eb9-4bd6-8a38-31a1d02e6b47" containerName="mariadb-account-create-update" Mar 18 07:05:51 
crc kubenswrapper[4917]: I0318 07:05:51.192918 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="926d49c5-2eb9-4bd6-8a38-31a1d02e6b47" containerName="mariadb-account-create-update" Mar 18 07:05:51 crc kubenswrapper[4917]: E0318 07:05:51.192953 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerName="init" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.192967 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerName="init" Mar 18 07:05:51 crc kubenswrapper[4917]: E0318 07:05:51.192990 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="294f43c9-010d-4de1-8cdd-8d10f29d0e6f" containerName="mariadb-account-create-update" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.193003 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="294f43c9-010d-4de1-8cdd-8d10f29d0e6f" containerName="mariadb-account-create-update" Mar 18 07:05:51 crc kubenswrapper[4917]: E0318 07:05:51.193024 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38903d42-d383-4a16-99a9-a252e9238bb2" containerName="mariadb-database-create" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.193036 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="38903d42-d383-4a16-99a9-a252e9238bb2" containerName="mariadb-database-create" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.193316 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaad3f8d-23ba-40e0-8ac6-bd02d8f9da49" containerName="dnsmasq-dns" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.193342 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="294f43c9-010d-4de1-8cdd-8d10f29d0e6f" containerName="mariadb-account-create-update" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.193374 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="38903d42-d383-4a16-99a9-a252e9238bb2" 
containerName="mariadb-database-create" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.193395 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="926d49c5-2eb9-4bd6-8a38-31a1d02e6b47" containerName="mariadb-account-create-update" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.194197 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.198789 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.202100 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tp4f7"] Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.290716 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44z2g\" (UniqueName: \"kubernetes.io/projected/51316588-92e7-41dc-a1a5-6cfe2dc70632-kube-api-access-44z2g\") pod \"root-account-create-update-tp4f7\" (UID: \"51316588-92e7-41dc-a1a5-6cfe2dc70632\") " pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.290913 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51316588-92e7-41dc-a1a5-6cfe2dc70632-operator-scripts\") pod \"root-account-create-update-tp4f7\" (UID: \"51316588-92e7-41dc-a1a5-6cfe2dc70632\") " pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.392904 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51316588-92e7-41dc-a1a5-6cfe2dc70632-operator-scripts\") pod \"root-account-create-update-tp4f7\" (UID: \"51316588-92e7-41dc-a1a5-6cfe2dc70632\") " 
pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.392998 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44z2g\" (UniqueName: \"kubernetes.io/projected/51316588-92e7-41dc-a1a5-6cfe2dc70632-kube-api-access-44z2g\") pod \"root-account-create-update-tp4f7\" (UID: \"51316588-92e7-41dc-a1a5-6cfe2dc70632\") " pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.394493 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51316588-92e7-41dc-a1a5-6cfe2dc70632-operator-scripts\") pod \"root-account-create-update-tp4f7\" (UID: \"51316588-92e7-41dc-a1a5-6cfe2dc70632\") " pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.430566 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44z2g\" (UniqueName: \"kubernetes.io/projected/51316588-92e7-41dc-a1a5-6cfe2dc70632-kube-api-access-44z2g\") pod \"root-account-create-update-tp4f7\" (UID: \"51316588-92e7-41dc-a1a5-6cfe2dc70632\") " pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.529125 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.810779 4917 generic.go:334] "Generic (PLEG): container finished" podID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" containerID="89d1b01a410a58d6efadc5b7f3e235ac35c4482c50d32403ecd6e850c30e9231" exitCode=0 Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.810897 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a2ed8f1-269d-45fb-a766-46c867bd0a91","Type":"ContainerDied","Data":"89d1b01a410a58d6efadc5b7f3e235ac35c4482c50d32403ecd6e850c30e9231"} Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.818228 4917 generic.go:334] "Generic (PLEG): container finished" podID="11fb09df-78b6-44c6-a78f-2b720a98cfad" containerID="38f6a17087633473084cee1b528df5d10be62a5fb94b0daa33f536007711ed0e" exitCode=0 Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.818789 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11fb09df-78b6-44c6-a78f-2b720a98cfad","Type":"ContainerDied","Data":"38f6a17087633473084cee1b528df5d10be62a5fb94b0daa33f536007711ed0e"} Mar 18 07:05:51 crc kubenswrapper[4917]: I0318 07:05:51.876965 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tp4f7"] Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.108666 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wr662" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.205987 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-scripts\") pod \"80378da5-5294-4d7e-92c9-2aba37eb64a1\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.206083 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-combined-ca-bundle\") pod \"80378da5-5294-4d7e-92c9-2aba37eb64a1\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.206167 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ks94\" (UniqueName: \"kubernetes.io/projected/80378da5-5294-4d7e-92c9-2aba37eb64a1-kube-api-access-5ks94\") pod \"80378da5-5294-4d7e-92c9-2aba37eb64a1\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.206196 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-swiftconf\") pod \"80378da5-5294-4d7e-92c9-2aba37eb64a1\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.206223 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-ring-data-devices\") pod \"80378da5-5294-4d7e-92c9-2aba37eb64a1\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.206287 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-dispersionconf\") pod \"80378da5-5294-4d7e-92c9-2aba37eb64a1\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.206322 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/80378da5-5294-4d7e-92c9-2aba37eb64a1-etc-swift\") pod \"80378da5-5294-4d7e-92c9-2aba37eb64a1\" (UID: \"80378da5-5294-4d7e-92c9-2aba37eb64a1\") " Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.208074 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "80378da5-5294-4d7e-92c9-2aba37eb64a1" (UID: "80378da5-5294-4d7e-92c9-2aba37eb64a1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.208549 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80378da5-5294-4d7e-92c9-2aba37eb64a1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "80378da5-5294-4d7e-92c9-2aba37eb64a1" (UID: "80378da5-5294-4d7e-92c9-2aba37eb64a1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.212080 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80378da5-5294-4d7e-92c9-2aba37eb64a1-kube-api-access-5ks94" (OuterVolumeSpecName: "kube-api-access-5ks94") pod "80378da5-5294-4d7e-92c9-2aba37eb64a1" (UID: "80378da5-5294-4d7e-92c9-2aba37eb64a1"). InnerVolumeSpecName "kube-api-access-5ks94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.223506 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "80378da5-5294-4d7e-92c9-2aba37eb64a1" (UID: "80378da5-5294-4d7e-92c9-2aba37eb64a1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.230504 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80378da5-5294-4d7e-92c9-2aba37eb64a1" (UID: "80378da5-5294-4d7e-92c9-2aba37eb64a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.235159 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-scripts" (OuterVolumeSpecName: "scripts") pod "80378da5-5294-4d7e-92c9-2aba37eb64a1" (UID: "80378da5-5294-4d7e-92c9-2aba37eb64a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.235157 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "80378da5-5294-4d7e-92c9-2aba37eb64a1" (UID: "80378da5-5294-4d7e-92c9-2aba37eb64a1"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.308260 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.308323 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ks94\" (UniqueName: \"kubernetes.io/projected/80378da5-5294-4d7e-92c9-2aba37eb64a1-kube-api-access-5ks94\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.308344 4917 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.308363 4917 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.308393 4917 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/80378da5-5294-4d7e-92c9-2aba37eb64a1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.308410 4917 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/80378da5-5294-4d7e-92c9-2aba37eb64a1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.308428 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80378da5-5294-4d7e-92c9-2aba37eb64a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.830518 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11fb09df-78b6-44c6-a78f-2b720a98cfad","Type":"ContainerStarted","Data":"256f8ce82bcbadac34767fc05b95a7249fd4500c71a2a020dd30ba54395951e5"} Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.831136 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.833781 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-wr662" event={"ID":"80378da5-5294-4d7e-92c9-2aba37eb64a1","Type":"ContainerDied","Data":"fa99d5b1858e806fdea7ea649797ebe557cb09548a775e8393a5746a2c62cd5c"} Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.833871 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa99d5b1858e806fdea7ea649797ebe557cb09548a775e8393a5746a2c62cd5c" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.833798 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-wr662" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.835202 4917 generic.go:334] "Generic (PLEG): container finished" podID="51316588-92e7-41dc-a1a5-6cfe2dc70632" containerID="e8321e394102520dcb88380357df3d2f6b7fb4eebed49b73947cce669ba85bdb" exitCode=0 Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.835284 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tp4f7" event={"ID":"51316588-92e7-41dc-a1a5-6cfe2dc70632","Type":"ContainerDied","Data":"e8321e394102520dcb88380357df3d2f6b7fb4eebed49b73947cce669ba85bdb"} Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.835311 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tp4f7" event={"ID":"51316588-92e7-41dc-a1a5-6cfe2dc70632","Type":"ContainerStarted","Data":"829b335463998ffcfda186a605212acf420df5b0a26e19324d528bbc0354c69c"} Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.837381 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a2ed8f1-269d-45fb-a766-46c867bd0a91","Type":"ContainerStarted","Data":"a60676e9973b2a6031a6ea6dba53de6eade7182dfd3e15a64da35230283bab60"} Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.837710 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.870534 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.624841952 podStartE2EDuration="53.870510519s" podCreationTimestamp="2026-03-18 07:04:59 +0000 UTC" firstStartedPulling="2026-03-18 07:05:01.68196971 +0000 UTC m=+1086.623124424" lastFinishedPulling="2026-03-18 07:05:15.927638267 +0000 UTC m=+1100.868792991" observedRunningTime="2026-03-18 07:05:52.856704777 +0000 UTC m=+1137.797859501" 
watchObservedRunningTime="2026-03-18 07:05:52.870510519 +0000 UTC m=+1137.811665243" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.960784 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 07:05:52 crc kubenswrapper[4917]: I0318 07:05:52.992720 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.626866477 podStartE2EDuration="54.992703462s" podCreationTimestamp="2026-03-18 07:04:58 +0000 UTC" firstStartedPulling="2026-03-18 07:05:00.642192165 +0000 UTC m=+1085.583346879" lastFinishedPulling="2026-03-18 07:05:15.00802914 +0000 UTC m=+1099.949183864" observedRunningTime="2026-03-18 07:05:52.920777293 +0000 UTC m=+1137.861932017" watchObservedRunningTime="2026-03-18 07:05:52.992703462 +0000 UTC m=+1137.933858176" Mar 18 07:05:53 crc kubenswrapper[4917]: I0318 07:05:53.531026 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:53 crc kubenswrapper[4917]: I0318 07:05:53.538187 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"swift-storage-0\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") " pod="openstack/swift-storage-0" Mar 18 07:05:53 crc kubenswrapper[4917]: I0318 07:05:53.634328 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.166303 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.244002 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51316588-92e7-41dc-a1a5-6cfe2dc70632-operator-scripts\") pod \"51316588-92e7-41dc-a1a5-6cfe2dc70632\" (UID: \"51316588-92e7-41dc-a1a5-6cfe2dc70632\") " Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.244079 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44z2g\" (UniqueName: \"kubernetes.io/projected/51316588-92e7-41dc-a1a5-6cfe2dc70632-kube-api-access-44z2g\") pod \"51316588-92e7-41dc-a1a5-6cfe2dc70632\" (UID: \"51316588-92e7-41dc-a1a5-6cfe2dc70632\") " Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.244908 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51316588-92e7-41dc-a1a5-6cfe2dc70632-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "51316588-92e7-41dc-a1a5-6cfe2dc70632" (UID: "51316588-92e7-41dc-a1a5-6cfe2dc70632"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.248483 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51316588-92e7-41dc-a1a5-6cfe2dc70632-kube-api-access-44z2g" (OuterVolumeSpecName: "kube-api-access-44z2g") pod "51316588-92e7-41dc-a1a5-6cfe2dc70632" (UID: "51316588-92e7-41dc-a1a5-6cfe2dc70632"). InnerVolumeSpecName "kube-api-access-44z2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:05:54 crc kubenswrapper[4917]: W0318 07:05:54.254462 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7385bce6_e9e7_4f8b_84be_9afb342f7134.slice/crio-04516f85c5dee38cd77003da96b1ab2262b612ddd5b119b0d301a50d9f551d59 WatchSource:0}: Error finding container 04516f85c5dee38cd77003da96b1ab2262b612ddd5b119b0d301a50d9f551d59: Status 404 returned error can't find the container with id 04516f85c5dee38cd77003da96b1ab2262b612ddd5b119b0d301a50d9f551d59 Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.254746 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.345859 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/51316588-92e7-41dc-a1a5-6cfe2dc70632-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.345890 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44z2g\" (UniqueName: \"kubernetes.io/projected/51316588-92e7-41dc-a1a5-6cfe2dc70632-kube-api-access-44z2g\") on node \"crc\" DevicePath \"\"" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.606946 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-xqwbz"] Mar 18 07:05:54 crc kubenswrapper[4917]: E0318 07:05:54.607277 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80378da5-5294-4d7e-92c9-2aba37eb64a1" containerName="swift-ring-rebalance" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.607293 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="80378da5-5294-4d7e-92c9-2aba37eb64a1" containerName="swift-ring-rebalance" Mar 18 07:05:54 crc kubenswrapper[4917]: E0318 07:05:54.607312 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="51316588-92e7-41dc-a1a5-6cfe2dc70632" containerName="mariadb-account-create-update" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.607319 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="51316588-92e7-41dc-a1a5-6cfe2dc70632" containerName="mariadb-account-create-update" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.607508 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="80378da5-5294-4d7e-92c9-2aba37eb64a1" containerName="swift-ring-rebalance" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.607528 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="51316588-92e7-41dc-a1a5-6cfe2dc70632" containerName="mariadb-account-create-update" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.608166 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.611579 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bstgs" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.611971 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.616274 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xqwbz"] Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.752836 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jw8x\" (UniqueName: \"kubernetes.io/projected/cd14ce63-7434-4646-9995-5cc41d2a4c6c-kube-api-access-9jw8x\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.753125 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-db-sync-config-data\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.753414 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-config-data\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.753472 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-combined-ca-bundle\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.854552 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-config-data\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.854617 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-combined-ca-bundle\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.854651 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jw8x\" (UniqueName: \"kubernetes.io/projected/cd14ce63-7434-4646-9995-5cc41d2a4c6c-kube-api-access-9jw8x\") pod 
\"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.854792 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-db-sync-config-data\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.857663 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tp4f7" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.860394 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tp4f7" event={"ID":"51316588-92e7-41dc-a1a5-6cfe2dc70632","Type":"ContainerDied","Data":"829b335463998ffcfda186a605212acf420df5b0a26e19324d528bbc0354c69c"} Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.860444 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829b335463998ffcfda186a605212acf420df5b0a26e19324d528bbc0354c69c" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.862704 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-combined-ca-bundle\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.862868 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-config-data\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 
07:05:54.863752 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-db-sync-config-data\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.870494 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"04516f85c5dee38cd77003da96b1ab2262b612ddd5b119b0d301a50d9f551d59"} Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.879735 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jw8x\" (UniqueName: \"kubernetes.io/projected/cd14ce63-7434-4646-9995-5cc41d2a4c6c-kube-api-access-9jw8x\") pod \"glance-db-sync-xqwbz\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:54 crc kubenswrapper[4917]: I0318 07:05:54.925209 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xqwbz" Mar 18 07:05:55 crc kubenswrapper[4917]: I0318 07:05:55.580601 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-xqwbz"] Mar 18 07:05:55 crc kubenswrapper[4917]: W0318 07:05:55.582929 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd14ce63_7434_4646_9995_5cc41d2a4c6c.slice/crio-186baace6668c11f1b7322cb1eb0d8efcfdf18b9c228ec5dfa191f6445e86bea WatchSource:0}: Error finding container 186baace6668c11f1b7322cb1eb0d8efcfdf18b9c228ec5dfa191f6445e86bea: Status 404 returned error can't find the container with id 186baace6668c11f1b7322cb1eb0d8efcfdf18b9c228ec5dfa191f6445e86bea Mar 18 07:05:55 crc kubenswrapper[4917]: I0318 07:05:55.883581 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xqwbz" event={"ID":"cd14ce63-7434-4646-9995-5cc41d2a4c6c","Type":"ContainerStarted","Data":"186baace6668c11f1b7322cb1eb0d8efcfdf18b9c228ec5dfa191f6445e86bea"} Mar 18 07:05:57 crc kubenswrapper[4917]: I0318 07:05:57.905155 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a"} Mar 18 07:05:57 crc kubenswrapper[4917]: I0318 07:05:57.905709 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b"} Mar 18 07:05:58 crc kubenswrapper[4917]: I0318 07:05:58.450627 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tp4f7"] Mar 18 07:05:58 crc kubenswrapper[4917]: I0318 07:05:58.461865 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/root-account-create-update-tp4f7"] Mar 18 07:05:58 crc kubenswrapper[4917]: I0318 07:05:58.918132 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850"} Mar 18 07:05:58 crc kubenswrapper[4917]: I0318 07:05:58.918415 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62"} Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.527848 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9gbdd" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerName="ovn-controller" probeResult="failure" output=< Mar 18 07:05:59 crc kubenswrapper[4917]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 07:05:59 crc kubenswrapper[4917]: > Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.547816 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jwxq2" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.551767 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jwxq2" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.798427 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51316588-92e7-41dc-a1a5-6cfe2dc70632" path="/var/lib/kubelet/pods/51316588-92e7-41dc-a1a5-6cfe2dc70632/volumes" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.799257 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9gbdd-config-x68ll"] Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.800397 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.802438 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.802697 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9gbdd-config-x68ll"] Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.934166 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-log-ovn\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.934216 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-scripts\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.934290 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-additional-scripts\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.934315 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwnn6\" (UniqueName: \"kubernetes.io/projected/bd66746b-8963-443e-9372-8402ff2ee2b1-kube-api-access-fwnn6\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: 
\"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.934366 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:05:59 crc kubenswrapper[4917]: I0318 07:05:59.934400 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run-ovn\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.035659 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-log-ovn\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.035938 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-scripts\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.036002 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-additional-scripts\") pod \"ovn-controller-9gbdd-config-x68ll\" 
(UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.036023 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwnn6\" (UniqueName: \"kubernetes.io/projected/bd66746b-8963-443e-9372-8402ff2ee2b1-kube-api-access-fwnn6\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.036056 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.036098 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run-ovn\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.035996 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-log-ovn\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.036836 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: 
\"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.036851 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run-ovn\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.036903 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-additional-scripts\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.038410 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-scripts\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.068662 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwnn6\" (UniqueName: \"kubernetes.io/projected/bd66746b-8963-443e-9372-8402ff2ee2b1-kube-api-access-fwnn6\") pod \"ovn-controller-9gbdd-config-x68ll\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.127233 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.138634 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563626-vlkxh"] Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.141885 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.151162 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.151505 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.151702 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.154791 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563626-vlkxh"] Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.239481 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpch4\" (UniqueName: \"kubernetes.io/projected/c1f16f1b-109d-44a6-9ac8-a954e43c41c5-kube-api-access-xpch4\") pod \"auto-csr-approver-29563626-vlkxh\" (UID: \"c1f16f1b-109d-44a6-9ac8-a954e43c41c5\") " pod="openshift-infra/auto-csr-approver-29563626-vlkxh" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.340717 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpch4\" (UniqueName: \"kubernetes.io/projected/c1f16f1b-109d-44a6-9ac8-a954e43c41c5-kube-api-access-xpch4\") pod \"auto-csr-approver-29563626-vlkxh\" (UID: \"c1f16f1b-109d-44a6-9ac8-a954e43c41c5\") " pod="openshift-infra/auto-csr-approver-29563626-vlkxh" 
Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.371611 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpch4\" (UniqueName: \"kubernetes.io/projected/c1f16f1b-109d-44a6-9ac8-a954e43c41c5-kube-api-access-xpch4\") pod \"auto-csr-approver-29563626-vlkxh\" (UID: \"c1f16f1b-109d-44a6-9ac8-a954e43c41c5\") " pod="openshift-infra/auto-csr-approver-29563626-vlkxh" Mar 18 07:06:00 crc kubenswrapper[4917]: I0318 07:06:00.493528 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" Mar 18 07:06:02 crc kubenswrapper[4917]: I0318 07:06:02.928726 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:06:02 crc kubenswrapper[4917]: I0318 07:06:02.929348 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:06:02 crc kubenswrapper[4917]: I0318 07:06:02.929394 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:06:02 crc kubenswrapper[4917]: I0318 07:06:02.929937 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0854ed049fe9b7cfdc3675b0efe75d5c58d0e2de88a3782f41bc4ebb18b9f74"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 
07:06:02 crc kubenswrapper[4917]: I0318 07:06:02.929997 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://d0854ed049fe9b7cfdc3675b0efe75d5c58d0e2de88a3782f41bc4ebb18b9f74" gracePeriod=600 Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.214632 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563626-vlkxh"] Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.223205 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9gbdd-config-x68ll"] Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.458624 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-pd424"] Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.460397 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pd424" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.462447 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.468288 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pd424"] Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.596519 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn5rr\" (UniqueName: \"kubernetes.io/projected/9caf5978-14cb-440d-bb9b-9ad1cc0590af-kube-api-access-hn5rr\") pod \"root-account-create-update-pd424\" (UID: \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\") " pod="openstack/root-account-create-update-pd424" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.596618 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9caf5978-14cb-440d-bb9b-9ad1cc0590af-operator-scripts\") pod \"root-account-create-update-pd424\" (UID: \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\") " pod="openstack/root-account-create-update-pd424" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.698203 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9caf5978-14cb-440d-bb9b-9ad1cc0590af-operator-scripts\") pod \"root-account-create-update-pd424\" (UID: \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\") " pod="openstack/root-account-create-update-pd424" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.698442 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn5rr\" (UniqueName: \"kubernetes.io/projected/9caf5978-14cb-440d-bb9b-9ad1cc0590af-kube-api-access-hn5rr\") pod \"root-account-create-update-pd424\" (UID: 
\"9caf5978-14cb-440d-bb9b-9ad1cc0590af\") " pod="openstack/root-account-create-update-pd424" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.699275 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9caf5978-14cb-440d-bb9b-9ad1cc0590af-operator-scripts\") pod \"root-account-create-update-pd424\" (UID: \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\") " pod="openstack/root-account-create-update-pd424" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.726217 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn5rr\" (UniqueName: \"kubernetes.io/projected/9caf5978-14cb-440d-bb9b-9ad1cc0590af-kube-api-access-hn5rr\") pod \"root-account-create-update-pd424\" (UID: \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\") " pod="openstack/root-account-create-update-pd424" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.789001 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-pd424" Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.974440 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="d0854ed049fe9b7cfdc3675b0efe75d5c58d0e2de88a3782f41bc4ebb18b9f74" exitCode=0 Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.974490 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"d0854ed049fe9b7cfdc3675b0efe75d5c58d0e2de88a3782f41bc4ebb18b9f74"} Mar 18 07:06:03 crc kubenswrapper[4917]: I0318 07:06:03.974525 4917 scope.go:117] "RemoveContainer" containerID="d1fa455e15b1a756345723f1e179413cfc4b43062d137c44fce060567289753f" Mar 18 07:06:04 crc kubenswrapper[4917]: I0318 07:06:04.525414 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9gbdd" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerName="ovn-controller" probeResult="failure" output=< Mar 18 07:06:04 crc kubenswrapper[4917]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 07:06:04 crc kubenswrapper[4917]: > Mar 18 07:06:09 crc kubenswrapper[4917]: I0318 07:06:09.511394 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9gbdd" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerName="ovn-controller" probeResult="failure" output=< Mar 18 07:06:09 crc kubenswrapper[4917]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 18 07:06:09 crc kubenswrapper[4917]: > Mar 18 07:06:09 crc kubenswrapper[4917]: W0318 07:06:09.708203 4917 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1f16f1b_109d_44a6_9ac8_a954e43c41c5.slice/crio-e1eb13802812207fd0e30fd941d6335b40d60504fca1a9ae0663696b2f1bb99b WatchSource:0}: Error finding container e1eb13802812207fd0e30fd941d6335b40d60504fca1a9ae0663696b2f1bb99b: Status 404 returned error can't find the container with id e1eb13802812207fd0e30fd941d6335b40d60504fca1a9ae0663696b2f1bb99b Mar 18 07:06:09 crc kubenswrapper[4917]: W0318 07:06:09.722040 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd66746b_8963_443e_9372_8402ff2ee2b1.slice/crio-84081a640bfa3615c9a7ce4b14a3858ff56341690d06d8069ff51d82e535aa42 WatchSource:0}: Error finding container 84081a640bfa3615c9a7ce4b14a3858ff56341690d06d8069ff51d82e535aa42: Status 404 returned error can't find the container with id 84081a640bfa3615c9a7ce4b14a3858ff56341690d06d8069ff51d82e535aa42 Mar 18 07:06:10 crc kubenswrapper[4917]: I0318 07:06:10.137659 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e"} Mar 18 07:06:10 crc kubenswrapper[4917]: I0318 07:06:10.153988 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"ed205158cfd88c24a2618c8398681343eec6e1ff531ca763ff821abed75c51f1"} Mar 18 07:06:10 crc kubenswrapper[4917]: I0318 07:06:10.161558 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9gbdd-config-x68ll" event={"ID":"bd66746b-8963-443e-9372-8402ff2ee2b1","Type":"ContainerStarted","Data":"84081a640bfa3615c9a7ce4b14a3858ff56341690d06d8069ff51d82e535aa42"} Mar 18 07:06:10 crc kubenswrapper[4917]: I0318 07:06:10.167886 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" event={"ID":"c1f16f1b-109d-44a6-9ac8-a954e43c41c5","Type":"ContainerStarted","Data":"e1eb13802812207fd0e30fd941d6335b40d60504fca1a9ae0663696b2f1bb99b"} Mar 18 07:06:10 crc kubenswrapper[4917]: I0318 07:06:10.196851 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9gbdd-config-x68ll" podStartSLOduration=11.196831996 podStartE2EDuration="11.196831996s" podCreationTimestamp="2026-03-18 07:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:10.194598351 +0000 UTC m=+1155.135753085" watchObservedRunningTime="2026-03-18 07:06:10.196831996 +0000 UTC m=+1155.137986700" Mar 18 07:06:10 crc kubenswrapper[4917]: I0318 07:06:10.219516 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:06:10 crc kubenswrapper[4917]: I0318 07:06:10.240502 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-pd424"] Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.114832 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.175543 4917 generic.go:334] "Generic (PLEG): container finished" podID="bd66746b-8963-443e-9372-8402ff2ee2b1" containerID="f12fd537a1df880673bc6d68c046c68708eca4a3e38f151f3ad0288bc357683d" exitCode=0 Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.175645 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9gbdd-config-x68ll" event={"ID":"bd66746b-8963-443e-9372-8402ff2ee2b1","Type":"ContainerDied","Data":"f12fd537a1df880673bc6d68c046c68708eca4a3e38f151f3ad0288bc357683d"} Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.176724 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" event={"ID":"c1f16f1b-109d-44a6-9ac8-a954e43c41c5","Type":"ContainerStarted","Data":"86cf9a99fc294f76759213f08ff613566913303572b79c76fb1e75d3604f58f3"} Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.180378 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84"} Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.180409 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a"} Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.180418 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5"} Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.181505 4917 generic.go:334] "Generic (PLEG): container finished" podID="9caf5978-14cb-440d-bb9b-9ad1cc0590af" containerID="d627fc6ae8a31ea609ee3334c418704d6c2c5ee092724d1282be71d8044d788b" exitCode=0 Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.181539 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pd424" event={"ID":"9caf5978-14cb-440d-bb9b-9ad1cc0590af","Type":"ContainerDied","Data":"d627fc6ae8a31ea609ee3334c418704d6c2c5ee092724d1282be71d8044d788b"} Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.181553 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pd424" 
event={"ID":"9caf5978-14cb-440d-bb9b-9ad1cc0590af","Type":"ContainerStarted","Data":"df0ac666042bdc8d389afd2ee091a66aec022d3c96e2c3867ad5edf4546045a5"} Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.183701 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xqwbz" event={"ID":"cd14ce63-7434-4646-9995-5cc41d2a4c6c","Type":"ContainerStarted","Data":"041ca039bbadb98cfa9331d8184c76abf835c6ed1fca960800be969281f8d5cf"} Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.229397 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" podStartSLOduration=10.368496335 podStartE2EDuration="11.229380324s" podCreationTimestamp="2026-03-18 07:06:00 +0000 UTC" firstStartedPulling="2026-03-18 07:06:09.755666471 +0000 UTC m=+1154.696821185" lastFinishedPulling="2026-03-18 07:06:10.61655046 +0000 UTC m=+1155.557705174" observedRunningTime="2026-03-18 07:06:11.222541634 +0000 UTC m=+1156.163696358" watchObservedRunningTime="2026-03-18 07:06:11.229380324 +0000 UTC m=+1156.170535038" Mar 18 07:06:11 crc kubenswrapper[4917]: I0318 07:06:11.261640 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-xqwbz" podStartSLOduration=3.01002041 podStartE2EDuration="17.261622381s" podCreationTimestamp="2026-03-18 07:05:54 +0000 UTC" firstStartedPulling="2026-03-18 07:05:55.585397772 +0000 UTC m=+1140.526552506" lastFinishedPulling="2026-03-18 07:06:09.836999753 +0000 UTC m=+1154.778154477" observedRunningTime="2026-03-18 07:06:11.258621007 +0000 UTC m=+1156.199775741" watchObservedRunningTime="2026-03-18 07:06:11.261622381 +0000 UTC m=+1156.202777095" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.193382 4917 generic.go:334] "Generic (PLEG): container finished" podID="c1f16f1b-109d-44a6-9ac8-a954e43c41c5" containerID="86cf9a99fc294f76759213f08ff613566913303572b79c76fb1e75d3604f58f3" exitCode=0 Mar 18 07:06:12 crc 
kubenswrapper[4917]: I0318 07:06:12.193614 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" event={"ID":"c1f16f1b-109d-44a6-9ac8-a954e43c41c5","Type":"ContainerDied","Data":"86cf9a99fc294f76759213f08ff613566913303572b79c76fb1e75d3604f58f3"} Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.438461 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-nb2kk"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.439771 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.453786 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nb2kk"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.457456 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edbd50ea-b9fc-479c-8702-82df627467bf-operator-scripts\") pod \"cinder-db-create-nb2kk\" (UID: \"edbd50ea-b9fc-479c-8702-82df627467bf\") " pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.457561 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf2z9\" (UniqueName: \"kubernetes.io/projected/edbd50ea-b9fc-479c-8702-82df627467bf-kube-api-access-cf2z9\") pod \"cinder-db-create-nb2kk\" (UID: \"edbd50ea-b9fc-479c-8702-82df627467bf\") " pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.550274 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db3f-account-create-update-9dqtt"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.551372 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.553638 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.558992 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqg7c\" (UniqueName: \"kubernetes.io/projected/55d4f4e3-0908-410e-a4ec-e40c5550d370-kube-api-access-fqg7c\") pod \"cinder-db3f-account-create-update-9dqtt\" (UID: \"55d4f4e3-0908-410e-a4ec-e40c5550d370\") " pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.559050 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf2z9\" (UniqueName: \"kubernetes.io/projected/edbd50ea-b9fc-479c-8702-82df627467bf-kube-api-access-cf2z9\") pod \"cinder-db-create-nb2kk\" (UID: \"edbd50ea-b9fc-479c-8702-82df627467bf\") " pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.559075 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d4f4e3-0908-410e-a4ec-e40c5550d370-operator-scripts\") pod \"cinder-db3f-account-create-update-9dqtt\" (UID: \"55d4f4e3-0908-410e-a4ec-e40c5550d370\") " pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.559235 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edbd50ea-b9fc-479c-8702-82df627467bf-operator-scripts\") pod \"cinder-db-create-nb2kk\" (UID: \"edbd50ea-b9fc-479c-8702-82df627467bf\") " pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.559758 4917 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cinder-db3f-account-create-update-9dqtt"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.560017 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edbd50ea-b9fc-479c-8702-82df627467bf-operator-scripts\") pod \"cinder-db-create-nb2kk\" (UID: \"edbd50ea-b9fc-479c-8702-82df627467bf\") " pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.587037 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf2z9\" (UniqueName: \"kubernetes.io/projected/edbd50ea-b9fc-479c-8702-82df627467bf-kube-api-access-cf2z9\") pod \"cinder-db-create-nb2kk\" (UID: \"edbd50ea-b9fc-479c-8702-82df627467bf\") " pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.641627 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-frlcc"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.642546 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.658196 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-frlcc"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.664482 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d4f4e3-0908-410e-a4ec-e40c5550d370-operator-scripts\") pod \"cinder-db3f-account-create-update-9dqtt\" (UID: \"55d4f4e3-0908-410e-a4ec-e40c5550d370\") " pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.664535 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73ea0479-f6bc-4136-af63-8e92f46e3891-operator-scripts\") pod \"neutron-db-create-frlcc\" (UID: \"73ea0479-f6bc-4136-af63-8e92f46e3891\") " pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.664672 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtf9s\" (UniqueName: \"kubernetes.io/projected/73ea0479-f6bc-4136-af63-8e92f46e3891-kube-api-access-rtf9s\") pod \"neutron-db-create-frlcc\" (UID: \"73ea0479-f6bc-4136-af63-8e92f46e3891\") " pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.664707 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqg7c\" (UniqueName: \"kubernetes.io/projected/55d4f4e3-0908-410e-a4ec-e40c5550d370-kube-api-access-fqg7c\") pod \"cinder-db3f-account-create-update-9dqtt\" (UID: \"55d4f4e3-0908-410e-a4ec-e40c5550d370\") " pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.665315 4917 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d4f4e3-0908-410e-a4ec-e40c5550d370-operator-scripts\") pod \"cinder-db3f-account-create-update-9dqtt\" (UID: \"55d4f4e3-0908-410e-a4ec-e40c5550d370\") " pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.679314 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.684827 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqg7c\" (UniqueName: \"kubernetes.io/projected/55d4f4e3-0908-410e-a4ec-e40c5550d370-kube-api-access-fqg7c\") pod \"cinder-db3f-account-create-update-9dqtt\" (UID: \"55d4f4e3-0908-410e-a4ec-e40c5550d370\") " pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.702295 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-qnf9f"] Mar 18 07:06:12 crc kubenswrapper[4917]: E0318 07:06:12.702776 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd66746b-8963-443e-9372-8402ff2ee2b1" containerName="ovn-config" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.702805 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd66746b-8963-443e-9372-8402ff2ee2b1" containerName="ovn-config" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.702987 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd66746b-8963-443e-9372-8402ff2ee2b1" containerName="ovn-config" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.703528 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.717913 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.718119 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.718356 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.718513 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5j76w" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.719629 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pd424" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.749682 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qnf9f"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766062 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwnn6\" (UniqueName: \"kubernetes.io/projected/bd66746b-8963-443e-9372-8402ff2ee2b1-kube-api-access-fwnn6\") pod \"bd66746b-8963-443e-9372-8402ff2ee2b1\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766186 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn5rr\" (UniqueName: \"kubernetes.io/projected/9caf5978-14cb-440d-bb9b-9ad1cc0590af-kube-api-access-hn5rr\") pod \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\" (UID: \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\") " Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766257 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9caf5978-14cb-440d-bb9b-9ad1cc0590af-operator-scripts\") pod \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\" (UID: \"9caf5978-14cb-440d-bb9b-9ad1cc0590af\") " Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766274 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-log-ovn\") pod \"bd66746b-8963-443e-9372-8402ff2ee2b1\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766310 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-additional-scripts\") pod \"bd66746b-8963-443e-9372-8402ff2ee2b1\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766333 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-scripts\") pod \"bd66746b-8963-443e-9372-8402ff2ee2b1\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766363 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run-ovn\") pod \"bd66746b-8963-443e-9372-8402ff2ee2b1\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766409 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run\") pod \"bd66746b-8963-443e-9372-8402ff2ee2b1\" (UID: \"bd66746b-8963-443e-9372-8402ff2ee2b1\") " Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766678 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-combined-ca-bundle\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766709 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2c6w\" (UniqueName: \"kubernetes.io/projected/9cbf9993-0d83-499a-8cc5-11662e0641e1-kube-api-access-n2c6w\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766748 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtf9s\" (UniqueName: \"kubernetes.io/projected/73ea0479-f6bc-4136-af63-8e92f46e3891-kube-api-access-rtf9s\") pod \"neutron-db-create-frlcc\" (UID: \"73ea0479-f6bc-4136-af63-8e92f46e3891\") " pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766803 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73ea0479-f6bc-4136-af63-8e92f46e3891-operator-scripts\") pod \"neutron-db-create-frlcc\" (UID: \"73ea0479-f6bc-4136-af63-8e92f46e3891\") " pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.766833 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-config-data\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.769670 4917 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "bd66746b-8963-443e-9372-8402ff2ee2b1" (UID: "bd66746b-8963-443e-9372-8402ff2ee2b1"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.769722 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run" (OuterVolumeSpecName: "var-run") pod "bd66746b-8963-443e-9372-8402ff2ee2b1" (UID: "bd66746b-8963-443e-9372-8402ff2ee2b1"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.769785 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "bd66746b-8963-443e-9372-8402ff2ee2b1" (UID: "bd66746b-8963-443e-9372-8402ff2ee2b1"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.770129 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9caf5978-14cb-440d-bb9b-9ad1cc0590af-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9caf5978-14cb-440d-bb9b-9ad1cc0590af" (UID: "9caf5978-14cb-440d-bb9b-9ad1cc0590af"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.770751 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "bd66746b-8963-443e-9372-8402ff2ee2b1" (UID: "bd66746b-8963-443e-9372-8402ff2ee2b1"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.771232 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73ea0479-f6bc-4136-af63-8e92f46e3891-operator-scripts\") pod \"neutron-db-create-frlcc\" (UID: \"73ea0479-f6bc-4136-af63-8e92f46e3891\") " pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.771976 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-scripts" (OuterVolumeSpecName: "scripts") pod "bd66746b-8963-443e-9372-8402ff2ee2b1" (UID: "bd66746b-8963-443e-9372-8402ff2ee2b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.773133 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd66746b-8963-443e-9372-8402ff2ee2b1-kube-api-access-fwnn6" (OuterVolumeSpecName: "kube-api-access-fwnn6") pod "bd66746b-8963-443e-9372-8402ff2ee2b1" (UID: "bd66746b-8963-443e-9372-8402ff2ee2b1"). InnerVolumeSpecName "kube-api-access-fwnn6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.774818 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9caf5978-14cb-440d-bb9b-9ad1cc0590af-kube-api-access-hn5rr" (OuterVolumeSpecName: "kube-api-access-hn5rr") pod "9caf5978-14cb-440d-bb9b-9ad1cc0590af" (UID: "9caf5978-14cb-440d-bb9b-9ad1cc0590af"). InnerVolumeSpecName "kube-api-access-hn5rr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.775152 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.780076 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-jkt4f"] Mar 18 07:06:12 crc kubenswrapper[4917]: E0318 07:06:12.780425 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9caf5978-14cb-440d-bb9b-9ad1cc0590af" containerName="mariadb-account-create-update" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.780437 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9caf5978-14cb-440d-bb9b-9ad1cc0590af" containerName="mariadb-account-create-update" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.780613 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9caf5978-14cb-440d-bb9b-9ad1cc0590af" containerName="mariadb-account-create-update" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.781137 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.785227 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtf9s\" (UniqueName: \"kubernetes.io/projected/73ea0479-f6bc-4136-af63-8e92f46e3891-kube-api-access-rtf9s\") pod \"neutron-db-create-frlcc\" (UID: \"73ea0479-f6bc-4136-af63-8e92f46e3891\") " pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.788660 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jkt4f"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.820518 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-801b-account-create-update-kx4dz"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.821507 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.824148 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.843629 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-801b-account-create-update-kx4dz"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.863476 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-0f82-account-create-update-95z7q"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.869867 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.870947 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-combined-ca-bundle\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871024 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2c6w\" (UniqueName: \"kubernetes.io/projected/9cbf9993-0d83-499a-8cc5-11662e0641e1-kube-api-access-n2c6w\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871110 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18360b12-be17-4e63-ba90-8afb66e879e4-operator-scripts\") pod \"barbican-db-create-jkt4f\" (UID: \"18360b12-be17-4e63-ba90-8afb66e879e4\") " pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871150 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwnrb\" (UniqueName: \"kubernetes.io/projected/9abcd564-199b-4481-bb54-c1904431bce2-kube-api-access-cwnrb\") pod \"barbican-801b-account-create-update-kx4dz\" (UID: \"9abcd564-199b-4481-bb54-c1904431bce2\") " pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871233 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9abcd564-199b-4481-bb54-c1904431bce2-operator-scripts\") pod 
\"barbican-801b-account-create-update-kx4dz\" (UID: \"9abcd564-199b-4481-bb54-c1904431bce2\") " pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871344 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-config-data\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871422 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlk6h\" (UniqueName: \"kubernetes.io/projected/18360b12-be17-4e63-ba90-8afb66e879e4-kube-api-access-rlk6h\") pod \"barbican-db-create-jkt4f\" (UID: \"18360b12-be17-4e63-ba90-8afb66e879e4\") " pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871499 4917 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871512 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwnn6\" (UniqueName: \"kubernetes.io/projected/bd66746b-8963-443e-9372-8402ff2ee2b1-kube-api-access-fwnn6\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871521 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn5rr\" (UniqueName: \"kubernetes.io/projected/9caf5978-14cb-440d-bb9b-9ad1cc0590af-kube-api-access-hn5rr\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871530 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9caf5978-14cb-440d-bb9b-9ad1cc0590af-operator-scripts\") on 
node \"crc\" DevicePath \"\"" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871538 4917 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871549 4917 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871571 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd66746b-8963-443e-9372-8402ff2ee2b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.871590 4917 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/bd66746b-8963-443e-9372-8402ff2ee2b1-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.874657 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0f82-account-create-update-95z7q"] Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.874672 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.879445 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-config-data\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.888805 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-combined-ca-bundle\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.893154 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.895165 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2c6w\" (UniqueName: \"kubernetes.io/projected/9cbf9993-0d83-499a-8cc5-11662e0641e1-kube-api-access-n2c6w\") pod \"keystone-db-sync-qnf9f\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.967985 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.972528 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18360b12-be17-4e63-ba90-8afb66e879e4-operator-scripts\") pod \"barbican-db-create-jkt4f\" (UID: \"18360b12-be17-4e63-ba90-8afb66e879e4\") " pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.972903 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwnrb\" (UniqueName: \"kubernetes.io/projected/9abcd564-199b-4481-bb54-c1904431bce2-kube-api-access-cwnrb\") pod \"barbican-801b-account-create-update-kx4dz\" (UID: \"9abcd564-199b-4481-bb54-c1904431bce2\") " pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.972952 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9abcd564-199b-4481-bb54-c1904431bce2-operator-scripts\") pod \"barbican-801b-account-create-update-kx4dz\" (UID: \"9abcd564-199b-4481-bb54-c1904431bce2\") " pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.972990 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khnp7\" (UniqueName: \"kubernetes.io/projected/3861ad88-ced8-438a-8456-9a432a8bd828-kube-api-access-khnp7\") pod \"neutron-0f82-account-create-update-95z7q\" (UID: \"3861ad88-ced8-438a-8456-9a432a8bd828\") " pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.973039 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlk6h\" (UniqueName: \"kubernetes.io/projected/18360b12-be17-4e63-ba90-8afb66e879e4-kube-api-access-rlk6h\") pod \"barbican-db-create-jkt4f\" (UID: \"18360b12-be17-4e63-ba90-8afb66e879e4\") " pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.973087 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3861ad88-ced8-438a-8456-9a432a8bd828-operator-scripts\") pod \"neutron-0f82-account-create-update-95z7q\" (UID: \"3861ad88-ced8-438a-8456-9a432a8bd828\") " pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.974366 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9abcd564-199b-4481-bb54-c1904431bce2-operator-scripts\") pod \"barbican-801b-account-create-update-kx4dz\" (UID: \"9abcd564-199b-4481-bb54-c1904431bce2\") " pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.974523 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18360b12-be17-4e63-ba90-8afb66e879e4-operator-scripts\") pod \"barbican-db-create-jkt4f\" (UID: \"18360b12-be17-4e63-ba90-8afb66e879e4\") " pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:12 crc kubenswrapper[4917]: I0318 07:06:12.991655 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlk6h\" (UniqueName: \"kubernetes.io/projected/18360b12-be17-4e63-ba90-8afb66e879e4-kube-api-access-rlk6h\") pod \"barbican-db-create-jkt4f\" (UID: \"18360b12-be17-4e63-ba90-8afb66e879e4\") " pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.018930 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwnrb\" (UniqueName: \"kubernetes.io/projected/9abcd564-199b-4481-bb54-c1904431bce2-kube-api-access-cwnrb\") pod \"barbican-801b-account-create-update-kx4dz\" (UID: \"9abcd564-199b-4481-bb54-c1904431bce2\") " pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.037174 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.083759 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3861ad88-ced8-438a-8456-9a432a8bd828-operator-scripts\") pod \"neutron-0f82-account-create-update-95z7q\" (UID: \"3861ad88-ced8-438a-8456-9a432a8bd828\") " pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.084202 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khnp7\" (UniqueName: \"kubernetes.io/projected/3861ad88-ced8-438a-8456-9a432a8bd828-kube-api-access-khnp7\") pod \"neutron-0f82-account-create-update-95z7q\" (UID: \"3861ad88-ced8-438a-8456-9a432a8bd828\") " pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.085094 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3861ad88-ced8-438a-8456-9a432a8bd828-operator-scripts\") pod \"neutron-0f82-account-create-update-95z7q\" (UID: \"3861ad88-ced8-438a-8456-9a432a8bd828\") " pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.130243 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.140858 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khnp7\" (UniqueName: \"kubernetes.io/projected/3861ad88-ced8-438a-8456-9a432a8bd828-kube-api-access-khnp7\") pod \"neutron-0f82-account-create-update-95z7q\" (UID: \"3861ad88-ced8-438a-8456-9a432a8bd828\") " pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.215483 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.217797 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.372283 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182"} Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.372525 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0"} Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.372536 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056"} Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.383818 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9gbdd-config-x68ll"] Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 
07:06:13.398470 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9gbdd-config-x68ll"] Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.406156 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-pd424" event={"ID":"9caf5978-14cb-440d-bb9b-9ad1cc0590af","Type":"ContainerDied","Data":"df0ac666042bdc8d389afd2ee091a66aec022d3c96e2c3867ad5edf4546045a5"} Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.406196 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0ac666042bdc8d389afd2ee091a66aec022d3c96e2c3867ad5edf4546045a5" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.406273 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-pd424" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.414576 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9gbdd-config-x68ll" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.424009 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84081a640bfa3615c9a7ce4b14a3858ff56341690d06d8069ff51d82e535aa42" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.434106 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-nb2kk"] Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.632521 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db3f-account-create-update-9dqtt"] Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.785651 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd66746b-8963-443e-9372-8402ff2ee2b1" path="/var/lib/kubelet/pods/bd66746b-8963-443e-9372-8402ff2ee2b1/volumes" Mar 18 07:06:13 crc kubenswrapper[4917]: I0318 07:06:13.930843 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.054384 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpch4\" (UniqueName: \"kubernetes.io/projected/c1f16f1b-109d-44a6-9ac8-a954e43c41c5-kube-api-access-xpch4\") pod \"c1f16f1b-109d-44a6-9ac8-a954e43c41c5\" (UID: \"c1f16f1b-109d-44a6-9ac8-a954e43c41c5\") " Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.058674 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f16f1b-109d-44a6-9ac8-a954e43c41c5-kube-api-access-xpch4" (OuterVolumeSpecName: "kube-api-access-xpch4") pod "c1f16f1b-109d-44a6-9ac8-a954e43c41c5" (UID: "c1f16f1b-109d-44a6-9ac8-a954e43c41c5"). InnerVolumeSpecName "kube-api-access-xpch4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.071224 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-qnf9f"] Mar 18 07:06:14 crc kubenswrapper[4917]: W0318 07:06:14.076807 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cbf9993_0d83_499a_8cc5_11662e0641e1.slice/crio-81710eceb56c622aba04229100eb1c9b3e6cfb02dc83af6db5eae5029857a040 WatchSource:0}: Error finding container 81710eceb56c622aba04229100eb1c9b3e6cfb02dc83af6db5eae5029857a040: Status 404 returned error can't find the container with id 81710eceb56c622aba04229100eb1c9b3e6cfb02dc83af6db5eae5029857a040 Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.104393 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-frlcc"] Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.156757 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpch4\" (UniqueName: 
\"kubernetes.io/projected/c1f16f1b-109d-44a6-9ac8-a954e43c41c5-kube-api-access-xpch4\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.212672 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jkt4f"] Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.222574 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-0f82-account-create-update-95z7q"] Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.258312 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-801b-account-create-update-kx4dz"] Mar 18 07:06:14 crc kubenswrapper[4917]: W0318 07:06:14.273406 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9abcd564_199b_4481_bb54_c1904431bce2.slice/crio-dc0f091e8e727ce9febc0a33a472ab268d1b5123027486b1a916d0873c875c98 WatchSource:0}: Error finding container dc0f091e8e727ce9febc0a33a472ab268d1b5123027486b1a916d0873c875c98: Status 404 returned error can't find the container with id dc0f091e8e727ce9febc0a33a472ab268d1b5123027486b1a916d0873c875c98 Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.425562 4917 generic.go:334] "Generic (PLEG): container finished" podID="edbd50ea-b9fc-479c-8702-82df627467bf" containerID="396f23d5b17e5e096dc0c629319f3621212491bb20b57e78e011422165082e9d" exitCode=0 Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.425653 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nb2kk" event={"ID":"edbd50ea-b9fc-479c-8702-82df627467bf","Type":"ContainerDied","Data":"396f23d5b17e5e096dc0c629319f3621212491bb20b57e78e011422165082e9d"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.425677 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nb2kk" 
event={"ID":"edbd50ea-b9fc-479c-8702-82df627467bf","Type":"ContainerStarted","Data":"06dab392241cde937f8a1a3ccad1e2a9d5d9740fd1ade6190caa570f60d13838"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.428715 4917 generic.go:334] "Generic (PLEG): container finished" podID="55d4f4e3-0908-410e-a4ec-e40c5550d370" containerID="50901c43b0a391f980b67cc6a8fed1a750a0bb81ba22d72473b2eb1c0bc04435" exitCode=0 Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.428756 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db3f-account-create-update-9dqtt" event={"ID":"55d4f4e3-0908-410e-a4ec-e40c5550d370","Type":"ContainerDied","Data":"50901c43b0a391f980b67cc6a8fed1a750a0bb81ba22d72473b2eb1c0bc04435"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.428811 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db3f-account-create-update-9dqtt" event={"ID":"55d4f4e3-0908-410e-a4ec-e40c5550d370","Type":"ContainerStarted","Data":"e19fa5cb692026450c89b6a2386e10a62380d4b1d2b748b0440a174fd8d4eb1c"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.433301 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qnf9f" event={"ID":"9cbf9993-0d83-499a-8cc5-11662e0641e1","Type":"ContainerStarted","Data":"81710eceb56c622aba04229100eb1c9b3e6cfb02dc83af6db5eae5029857a040"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.438869 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-frlcc" event={"ID":"73ea0479-f6bc-4136-af63-8e92f46e3891","Type":"ContainerStarted","Data":"7ac1a2d7faea106e84f19c26e3b566ab583c3f762d3854030133338c142969fb"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.438914 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-frlcc" event={"ID":"73ea0479-f6bc-4136-af63-8e92f46e3891","Type":"ContainerStarted","Data":"277ed026e4836822b35eb0aec5a451a317312fdacbfa38cd72b257dd33923d27"} Mar 18 
07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.459320 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.459551 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.459560 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.462872 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0f82-account-create-update-95z7q" event={"ID":"3861ad88-ced8-438a-8456-9a432a8bd828","Type":"ContainerStarted","Data":"c79c25a7b1ab434e0e2b228ea1065ba159ae133e58f734cf2b000c20fcdc13f2"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.468993 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jkt4f" event={"ID":"18360b12-be17-4e63-ba90-8afb66e879e4","Type":"ContainerStarted","Data":"a288766e3a1305ad024592342ddae92158dd0132529036ba3ba6737afab5c433"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.470453 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-frlcc" podStartSLOduration=2.4704373459999998 podStartE2EDuration="2.470437346s" podCreationTimestamp="2026-03-18 07:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:14.467947384 
+0000 UTC m=+1159.409102098" watchObservedRunningTime="2026-03-18 07:06:14.470437346 +0000 UTC m=+1159.411592060" Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.472034 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.474391 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563626-vlkxh" event={"ID":"c1f16f1b-109d-44a6-9ac8-a954e43c41c5","Type":"ContainerDied","Data":"e1eb13802812207fd0e30fd941d6335b40d60504fca1a9ae0663696b2f1bb99b"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.474424 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1eb13802812207fd0e30fd941d6335b40d60504fca1a9ae0663696b2f1bb99b" Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.478424 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-801b-account-create-update-kx4dz" event={"ID":"9abcd564-199b-4481-bb54-c1904431bce2","Type":"ContainerStarted","Data":"dc0f091e8e727ce9febc0a33a472ab268d1b5123027486b1a916d0873c875c98"} Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.499835 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-0f82-account-create-update-95z7q" podStartSLOduration=2.499812613 podStartE2EDuration="2.499812613s" podCreationTimestamp="2026-03-18 07:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:14.486703029 +0000 UTC m=+1159.427857763" watchObservedRunningTime="2026-03-18 07:06:14.499812613 +0000 UTC m=+1159.440967327" Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.507209 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-jkt4f" podStartSLOduration=2.507186445 
podStartE2EDuration="2.507186445s" podCreationTimestamp="2026-03-18 07:06:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:14.50133551 +0000 UTC m=+1159.442490224" watchObservedRunningTime="2026-03-18 07:06:14.507186445 +0000 UTC m=+1159.448341159" Mar 18 07:06:14 crc kubenswrapper[4917]: I0318 07:06:14.615529 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9gbdd" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.013346 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563620-rvlqw"] Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.028253 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563620-rvlqw"] Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.489934 4917 generic.go:334] "Generic (PLEG): container finished" podID="3861ad88-ced8-438a-8456-9a432a8bd828" containerID="16287dc8eac10673799a08556004c46778aa8c41292bde808ec3d9bab0a363e5" exitCode=0 Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.490023 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0f82-account-create-update-95z7q" event={"ID":"3861ad88-ced8-438a-8456-9a432a8bd828","Type":"ContainerDied","Data":"16287dc8eac10673799a08556004c46778aa8c41292bde808ec3d9bab0a363e5"} Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.495327 4917 generic.go:334] "Generic (PLEG): container finished" podID="18360b12-be17-4e63-ba90-8afb66e879e4" containerID="42786c541794c50119f11de4d6878499b506c8fd49a01bf9b4c397fce1a43ced" exitCode=0 Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.495391 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jkt4f" 
event={"ID":"18360b12-be17-4e63-ba90-8afb66e879e4","Type":"ContainerDied","Data":"42786c541794c50119f11de4d6878499b506c8fd49a01bf9b4c397fce1a43ced"} Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.497560 4917 generic.go:334] "Generic (PLEG): container finished" podID="73ea0479-f6bc-4136-af63-8e92f46e3891" containerID="7ac1a2d7faea106e84f19c26e3b566ab583c3f762d3854030133338c142969fb" exitCode=0 Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.497622 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-frlcc" event={"ID":"73ea0479-f6bc-4136-af63-8e92f46e3891","Type":"ContainerDied","Data":"7ac1a2d7faea106e84f19c26e3b566ab583c3f762d3854030133338c142969fb"} Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.499168 4917 generic.go:334] "Generic (PLEG): container finished" podID="9abcd564-199b-4481-bb54-c1904431bce2" containerID="bf71597a3ae9eb5e1990277fba0cd9d1929f0579a69880e7a39dc9d56f71c9e7" exitCode=0 Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.499262 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-801b-account-create-update-kx4dz" event={"ID":"9abcd564-199b-4481-bb54-c1904431bce2","Type":"ContainerDied","Data":"bf71597a3ae9eb5e1990277fba0cd9d1929f0579a69880e7a39dc9d56f71c9e7"} Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.506759 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerStarted","Data":"44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00"} Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.548724 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.853035319 podStartE2EDuration="39.548705675s" podCreationTimestamp="2026-03-18 07:05:36 +0000 UTC" firstStartedPulling="2026-03-18 07:05:54.256805619 +0000 UTC m=+1139.197960333" lastFinishedPulling="2026-03-18 
07:06:11.952475975 +0000 UTC m=+1156.893630689" observedRunningTime="2026-03-18 07:06:15.546332916 +0000 UTC m=+1160.487487660" watchObservedRunningTime="2026-03-18 07:06:15.548705675 +0000 UTC m=+1160.489860389" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.814781 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1438dd56-616a-4ae5-98eb-451e7e0349db" path="/var/lib/kubelet/pods/1438dd56-616a-4ae5-98eb-451e7e0349db/volumes" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.822342 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-2xxch"] Mar 18 07:06:15 crc kubenswrapper[4917]: E0318 07:06:15.822761 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f16f1b-109d-44a6-9ac8-a954e43c41c5" containerName="oc" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.827172 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f16f1b-109d-44a6-9ac8-a954e43c41c5" containerName="oc" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.827569 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1f16f1b-109d-44a6-9ac8-a954e43c41c5" containerName="oc" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.828463 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.833900 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.844817 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-2xxch"] Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.910495 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rngzd\" (UniqueName: \"kubernetes.io/projected/ade4fee9-4c58-4961-9ffc-ff0153ca2837-kube-api-access-rngzd\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.910570 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-swift-storage-0\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.910622 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-svc\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.910657 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-sb\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " 
pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.911109 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-nb\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.911304 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-config\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.928630 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:15 crc kubenswrapper[4917]: I0318 07:06:15.935057 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.013117 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d4f4e3-0908-410e-a4ec-e40c5550d370-operator-scripts\") pod \"55d4f4e3-0908-410e-a4ec-e40c5550d370\" (UID: \"55d4f4e3-0908-410e-a4ec-e40c5550d370\") " Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.013293 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf2z9\" (UniqueName: \"kubernetes.io/projected/edbd50ea-b9fc-479c-8702-82df627467bf-kube-api-access-cf2z9\") pod \"edbd50ea-b9fc-479c-8702-82df627467bf\" (UID: \"edbd50ea-b9fc-479c-8702-82df627467bf\") " Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.013347 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqg7c\" (UniqueName: \"kubernetes.io/projected/55d4f4e3-0908-410e-a4ec-e40c5550d370-kube-api-access-fqg7c\") pod \"55d4f4e3-0908-410e-a4ec-e40c5550d370\" (UID: \"55d4f4e3-0908-410e-a4ec-e40c5550d370\") " Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.013369 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edbd50ea-b9fc-479c-8702-82df627467bf-operator-scripts\") pod \"edbd50ea-b9fc-479c-8702-82df627467bf\" (UID: \"edbd50ea-b9fc-479c-8702-82df627467bf\") " Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.013936 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d4f4e3-0908-410e-a4ec-e40c5550d370-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55d4f4e3-0908-410e-a4ec-e40c5550d370" (UID: "55d4f4e3-0908-410e-a4ec-e40c5550d370"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.014238 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edbd50ea-b9fc-479c-8702-82df627467bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "edbd50ea-b9fc-479c-8702-82df627467bf" (UID: "edbd50ea-b9fc-479c-8702-82df627467bf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.014416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-config\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.015223 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-config\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.015361 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rngzd\" (UniqueName: \"kubernetes.io/projected/ade4fee9-4c58-4961-9ffc-ff0153ca2837-kube-api-access-rngzd\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.015471 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-swift-storage-0\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " 
pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.016118 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-svc\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.016051 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-swift-storage-0\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.016258 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-sb\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.016462 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-nb\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.017607 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-sb\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc 
kubenswrapper[4917]: I0318 07:06:16.018726 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-nb\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.018839 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55d4f4e3-0908-410e-a4ec-e40c5550d370-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.019051 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edbd50ea-b9fc-479c-8702-82df627467bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.019119 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edbd50ea-b9fc-479c-8702-82df627467bf-kube-api-access-cf2z9" (OuterVolumeSpecName: "kube-api-access-cf2z9") pod "edbd50ea-b9fc-479c-8702-82df627467bf" (UID: "edbd50ea-b9fc-479c-8702-82df627467bf"). InnerVolumeSpecName "kube-api-access-cf2z9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.020028 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-svc\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.029066 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d4f4e3-0908-410e-a4ec-e40c5550d370-kube-api-access-fqg7c" (OuterVolumeSpecName: "kube-api-access-fqg7c") pod "55d4f4e3-0908-410e-a4ec-e40c5550d370" (UID: "55d4f4e3-0908-410e-a4ec-e40c5550d370"). InnerVolumeSpecName "kube-api-access-fqg7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.030227 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rngzd\" (UniqueName: \"kubernetes.io/projected/ade4fee9-4c58-4961-9ffc-ff0153ca2837-kube-api-access-rngzd\") pod \"dnsmasq-dns-86cbdd8bfc-2xxch\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.120777 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf2z9\" (UniqueName: \"kubernetes.io/projected/edbd50ea-b9fc-479c-8702-82df627467bf-kube-api-access-cf2z9\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.121044 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqg7c\" (UniqueName: \"kubernetes.io/projected/55d4f4e3-0908-410e-a4ec-e40c5550d370-kube-api-access-fqg7c\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.225447 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.515534 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-nb2kk" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.515668 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-nb2kk" event={"ID":"edbd50ea-b9fc-479c-8702-82df627467bf","Type":"ContainerDied","Data":"06dab392241cde937f8a1a3ccad1e2a9d5d9740fd1ade6190caa570f60d13838"} Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.516659 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06dab392241cde937f8a1a3ccad1e2a9d5d9740fd1ade6190caa570f60d13838" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.530456 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db3f-account-create-update-9dqtt" event={"ID":"55d4f4e3-0908-410e-a4ec-e40c5550d370","Type":"ContainerDied","Data":"e19fa5cb692026450c89b6a2386e10a62380d4b1d2b748b0440a174fd8d4eb1c"} Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.530500 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e19fa5cb692026450c89b6a2386e10a62380d4b1d2b748b0440a174fd8d4eb1c" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.530532 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db3f-account-create-update-9dqtt" Mar 18 07:06:16 crc kubenswrapper[4917]: I0318 07:06:16.690981 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-2xxch"] Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.518425 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.544243 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.549191 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.562952 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-801b-account-create-update-kx4dz" event={"ID":"9abcd564-199b-4481-bb54-c1904431bce2","Type":"ContainerDied","Data":"dc0f091e8e727ce9febc0a33a472ab268d1b5123027486b1a916d0873c875c98"} Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.563000 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0f091e8e727ce9febc0a33a472ab268d1b5123027486b1a916d0873c875c98" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.576709 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.579699 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xqwbz" event={"ID":"cd14ce63-7434-4646-9995-5cc41d2a4c6c","Type":"ContainerDied","Data":"041ca039bbadb98cfa9331d8184c76abf835c6ed1fca960800be969281f8d5cf"} Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.579670 4917 generic.go:334] "Generic (PLEG): container finished" podID="cd14ce63-7434-4646-9995-5cc41d2a4c6c" containerID="041ca039bbadb98cfa9331d8184c76abf835c6ed1fca960800be969281f8d5cf" exitCode=0 Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.591178 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73ea0479-f6bc-4136-af63-8e92f46e3891-operator-scripts\") pod \"73ea0479-f6bc-4136-af63-8e92f46e3891\" (UID: \"73ea0479-f6bc-4136-af63-8e92f46e3891\") " Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.591357 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtf9s\" (UniqueName: \"kubernetes.io/projected/73ea0479-f6bc-4136-af63-8e92f46e3891-kube-api-access-rtf9s\") pod \"73ea0479-f6bc-4136-af63-8e92f46e3891\" (UID: \"73ea0479-f6bc-4136-af63-8e92f46e3891\") " Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.592164 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73ea0479-f6bc-4136-af63-8e92f46e3891-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73ea0479-f6bc-4136-af63-8e92f46e3891" (UID: "73ea0479-f6bc-4136-af63-8e92f46e3891"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.597033 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-0f82-account-create-update-95z7q" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.597275 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-0f82-account-create-update-95z7q" event={"ID":"3861ad88-ced8-438a-8456-9a432a8bd828","Type":"ContainerDied","Data":"c79c25a7b1ab434e0e2b228ea1065ba159ae133e58f734cf2b000c20fcdc13f2"} Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.597299 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c79c25a7b1ab434e0e2b228ea1065ba159ae133e58f734cf2b000c20fcdc13f2" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.601755 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73ea0479-f6bc-4136-af63-8e92f46e3891-kube-api-access-rtf9s" (OuterVolumeSpecName: "kube-api-access-rtf9s") pod "73ea0479-f6bc-4136-af63-8e92f46e3891" (UID: "73ea0479-f6bc-4136-af63-8e92f46e3891"). InnerVolumeSpecName "kube-api-access-rtf9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.609783 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" event={"ID":"ade4fee9-4c58-4961-9ffc-ff0153ca2837","Type":"ContainerStarted","Data":"2134c875270f753566bf2a3a6277f2100a2011b8c2dfd86b860f2fe9d9be2812"} Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.620604 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jkt4f" event={"ID":"18360b12-be17-4e63-ba90-8afb66e879e4","Type":"ContainerDied","Data":"a288766e3a1305ad024592342ddae92158dd0132529036ba3ba6737afab5c433"} Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.620756 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jkt4f" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.620760 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a288766e3a1305ad024592342ddae92158dd0132529036ba3ba6737afab5c433" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.624494 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-frlcc" event={"ID":"73ea0479-f6bc-4136-af63-8e92f46e3891","Type":"ContainerDied","Data":"277ed026e4836822b35eb0aec5a451a317312fdacbfa38cd72b257dd33923d27"} Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.624530 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="277ed026e4836822b35eb0aec5a451a317312fdacbfa38cd72b257dd33923d27" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.624594 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-frlcc" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.692530 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlk6h\" (UniqueName: \"kubernetes.io/projected/18360b12-be17-4e63-ba90-8afb66e879e4-kube-api-access-rlk6h\") pod \"18360b12-be17-4e63-ba90-8afb66e879e4\" (UID: \"18360b12-be17-4e63-ba90-8afb66e879e4\") " Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.692789 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9abcd564-199b-4481-bb54-c1904431bce2-operator-scripts\") pod \"9abcd564-199b-4481-bb54-c1904431bce2\" (UID: \"9abcd564-199b-4481-bb54-c1904431bce2\") " Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.692995 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18360b12-be17-4e63-ba90-8afb66e879e4-operator-scripts\") pod \"18360b12-be17-4e63-ba90-8afb66e879e4\" (UID: \"18360b12-be17-4e63-ba90-8afb66e879e4\") " Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.693134 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khnp7\" (UniqueName: \"kubernetes.io/projected/3861ad88-ced8-438a-8456-9a432a8bd828-kube-api-access-khnp7\") pod \"3861ad88-ced8-438a-8456-9a432a8bd828\" (UID: \"3861ad88-ced8-438a-8456-9a432a8bd828\") " Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.693288 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwnrb\" (UniqueName: \"kubernetes.io/projected/9abcd564-199b-4481-bb54-c1904431bce2-kube-api-access-cwnrb\") pod \"9abcd564-199b-4481-bb54-c1904431bce2\" (UID: \"9abcd564-199b-4481-bb54-c1904431bce2\") " Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.693301 4917 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9abcd564-199b-4481-bb54-c1904431bce2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9abcd564-199b-4481-bb54-c1904431bce2" (UID: "9abcd564-199b-4481-bb54-c1904431bce2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.693497 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3861ad88-ced8-438a-8456-9a432a8bd828-operator-scripts\") pod \"3861ad88-ced8-438a-8456-9a432a8bd828\" (UID: \"3861ad88-ced8-438a-8456-9a432a8bd828\") " Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.693456 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18360b12-be17-4e63-ba90-8afb66e879e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18360b12-be17-4e63-ba90-8afb66e879e4" (UID: "18360b12-be17-4e63-ba90-8afb66e879e4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.693911 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3861ad88-ced8-438a-8456-9a432a8bd828-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3861ad88-ced8-438a-8456-9a432a8bd828" (UID: "3861ad88-ced8-438a-8456-9a432a8bd828"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.694375 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtf9s\" (UniqueName: \"kubernetes.io/projected/73ea0479-f6bc-4136-af63-8e92f46e3891-kube-api-access-rtf9s\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.694398 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9abcd564-199b-4481-bb54-c1904431bce2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.694408 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73ea0479-f6bc-4136-af63-8e92f46e3891-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.694417 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18360b12-be17-4e63-ba90-8afb66e879e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.694425 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3861ad88-ced8-438a-8456-9a432a8bd828-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.696335 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18360b12-be17-4e63-ba90-8afb66e879e4-kube-api-access-rlk6h" (OuterVolumeSpecName: "kube-api-access-rlk6h") pod "18360b12-be17-4e63-ba90-8afb66e879e4" (UID: "18360b12-be17-4e63-ba90-8afb66e879e4"). InnerVolumeSpecName "kube-api-access-rlk6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.696454 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3861ad88-ced8-438a-8456-9a432a8bd828-kube-api-access-khnp7" (OuterVolumeSpecName: "kube-api-access-khnp7") pod "3861ad88-ced8-438a-8456-9a432a8bd828" (UID: "3861ad88-ced8-438a-8456-9a432a8bd828"). InnerVolumeSpecName "kube-api-access-khnp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.696812 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9abcd564-199b-4481-bb54-c1904431bce2-kube-api-access-cwnrb" (OuterVolumeSpecName: "kube-api-access-cwnrb") pod "9abcd564-199b-4481-bb54-c1904431bce2" (UID: "9abcd564-199b-4481-bb54-c1904431bce2"). InnerVolumeSpecName "kube-api-access-cwnrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.795417 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khnp7\" (UniqueName: \"kubernetes.io/projected/3861ad88-ced8-438a-8456-9a432a8bd828-kube-api-access-khnp7\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.795453 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwnrb\" (UniqueName: \"kubernetes.io/projected/9abcd564-199b-4481-bb54-c1904431bce2-kube-api-access-cwnrb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:19 crc kubenswrapper[4917]: I0318 07:06:19.795462 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlk6h\" (UniqueName: \"kubernetes.io/projected/18360b12-be17-4e63-ba90-8afb66e879e4-kube-api-access-rlk6h\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:20 crc kubenswrapper[4917]: I0318 07:06:20.638827 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qnf9f" 
event={"ID":"9cbf9993-0d83-499a-8cc5-11662e0641e1","Type":"ContainerStarted","Data":"2658026f893c6e07fb2028dc78145eb95678a1a39faeea5221b841ebdc338c12"} Mar 18 07:06:20 crc kubenswrapper[4917]: I0318 07:06:20.640956 4917 generic.go:334] "Generic (PLEG): container finished" podID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" containerID="0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393" exitCode=0 Mar 18 07:06:20 crc kubenswrapper[4917]: I0318 07:06:20.641074 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-801b-account-create-update-kx4dz" Mar 18 07:06:20 crc kubenswrapper[4917]: I0318 07:06:20.641220 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" event={"ID":"ade4fee9-4c58-4961-9ffc-ff0153ca2837","Type":"ContainerDied","Data":"0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393"} Mar 18 07:06:20 crc kubenswrapper[4917]: I0318 07:06:20.699501 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-qnf9f" podStartSLOduration=3.477888183 podStartE2EDuration="8.699479528s" podCreationTimestamp="2026-03-18 07:06:12 +0000 UTC" firstStartedPulling="2026-03-18 07:06:14.079291438 +0000 UTC m=+1159.020446152" lastFinishedPulling="2026-03-18 07:06:19.300882783 +0000 UTC m=+1164.242037497" observedRunningTime="2026-03-18 07:06:20.676835948 +0000 UTC m=+1165.617990682" watchObservedRunningTime="2026-03-18 07:06:20.699479528 +0000 UTC m=+1165.640634252" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.020530 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-xqwbz" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.121658 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jw8x\" (UniqueName: \"kubernetes.io/projected/cd14ce63-7434-4646-9995-5cc41d2a4c6c-kube-api-access-9jw8x\") pod \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.121844 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-combined-ca-bundle\") pod \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.121891 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-db-sync-config-data\") pod \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.122069 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-config-data\") pod \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\" (UID: \"cd14ce63-7434-4646-9995-5cc41d2a4c6c\") " Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.127192 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "cd14ce63-7434-4646-9995-5cc41d2a4c6c" (UID: "cd14ce63-7434-4646-9995-5cc41d2a4c6c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.127977 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd14ce63-7434-4646-9995-5cc41d2a4c6c-kube-api-access-9jw8x" (OuterVolumeSpecName: "kube-api-access-9jw8x") pod "cd14ce63-7434-4646-9995-5cc41d2a4c6c" (UID: "cd14ce63-7434-4646-9995-5cc41d2a4c6c"). InnerVolumeSpecName "kube-api-access-9jw8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.144175 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd14ce63-7434-4646-9995-5cc41d2a4c6c" (UID: "cd14ce63-7434-4646-9995-5cc41d2a4c6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.174216 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-config-data" (OuterVolumeSpecName: "config-data") pod "cd14ce63-7434-4646-9995-5cc41d2a4c6c" (UID: "cd14ce63-7434-4646-9995-5cc41d2a4c6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.224784 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.224822 4917 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.224836 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd14ce63-7434-4646-9995-5cc41d2a4c6c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.224849 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jw8x\" (UniqueName: \"kubernetes.io/projected/cd14ce63-7434-4646-9995-5cc41d2a4c6c-kube-api-access-9jw8x\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.655652 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" event={"ID":"ade4fee9-4c58-4961-9ffc-ff0153ca2837","Type":"ContainerStarted","Data":"3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495"} Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.656194 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.665641 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-xqwbz" event={"ID":"cd14ce63-7434-4646-9995-5cc41d2a4c6c","Type":"ContainerDied","Data":"186baace6668c11f1b7322cb1eb0d8efcfdf18b9c228ec5dfa191f6445e86bea"} Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 
07:06:21.665722 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="186baace6668c11f1b7322cb1eb0d8efcfdf18b9c228ec5dfa191f6445e86bea" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.665852 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-xqwbz" Mar 18 07:06:21 crc kubenswrapper[4917]: I0318 07:06:21.697050 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" podStartSLOduration=6.697022819 podStartE2EDuration="6.697022819s" podCreationTimestamp="2026-03-18 07:06:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:21.6808844 +0000 UTC m=+1166.622039154" watchObservedRunningTime="2026-03-18 07:06:21.697022819 +0000 UTC m=+1166.638177573" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.091038 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-2xxch"] Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.125500 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-jt9p6"] Mar 18 07:06:22 crc kubenswrapper[4917]: E0318 07:06:22.125802 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edbd50ea-b9fc-479c-8702-82df627467bf" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.125818 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="edbd50ea-b9fc-479c-8702-82df627467bf" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: E0318 07:06:22.125830 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73ea0479-f6bc-4136-af63-8e92f46e3891" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.125838 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="73ea0479-f6bc-4136-af63-8e92f46e3891" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: E0318 07:06:22.125850 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18360b12-be17-4e63-ba90-8afb66e879e4" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.125856 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="18360b12-be17-4e63-ba90-8afb66e879e4" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: E0318 07:06:22.125869 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d4f4e3-0908-410e-a4ec-e40c5550d370" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.125874 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d4f4e3-0908-410e-a4ec-e40c5550d370" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc kubenswrapper[4917]: E0318 07:06:22.125882 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3861ad88-ced8-438a-8456-9a432a8bd828" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.125888 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3861ad88-ced8-438a-8456-9a432a8bd828" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc kubenswrapper[4917]: E0318 07:06:22.125907 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd14ce63-7434-4646-9995-5cc41d2a4c6c" containerName="glance-db-sync" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.125913 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd14ce63-7434-4646-9995-5cc41d2a4c6c" containerName="glance-db-sync" Mar 18 07:06:22 crc kubenswrapper[4917]: E0318 07:06:22.125928 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9abcd564-199b-4481-bb54-c1904431bce2" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc 
kubenswrapper[4917]: I0318 07:06:22.125934 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9abcd564-199b-4481-bb54-c1904431bce2" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.126065 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3861ad88-ced8-438a-8456-9a432a8bd828" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.126076 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="edbd50ea-b9fc-479c-8702-82df627467bf" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.126086 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="73ea0479-f6bc-4136-af63-8e92f46e3891" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.126093 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd14ce63-7434-4646-9995-5cc41d2a4c6c" containerName="glance-db-sync" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.126101 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d4f4e3-0908-410e-a4ec-e40c5550d370" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.126109 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9abcd564-199b-4481-bb54-c1904431bce2" containerName="mariadb-account-create-update" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.126120 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="18360b12-be17-4e63-ba90-8afb66e879e4" containerName="mariadb-database-create" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.126921 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.147668 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-jt9p6"] Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.240597 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-svc\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.240686 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.240765 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-config\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.240798 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.240840 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-pk2s6\" (UniqueName: \"kubernetes.io/projected/8e393792-6f85-4c4b-adb9-0c889d9fcc51-kube-api-access-pk2s6\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.240896 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-swift-storage-0\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.342673 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-swift-storage-0\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.342984 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-svc\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.343070 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.343155 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-config\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.343234 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.343311 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk2s6\" (UniqueName: \"kubernetes.io/projected/8e393792-6f85-4c4b-adb9-0c889d9fcc51-kube-api-access-pk2s6\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.343573 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-swift-storage-0\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.344250 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-nb\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.344256 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-sb\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.344424 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-config\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.344861 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-svc\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.359335 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk2s6\" (UniqueName: \"kubernetes.io/projected/8e393792-6f85-4c4b-adb9-0c889d9fcc51-kube-api-access-pk2s6\") pod \"dnsmasq-dns-6d88577c8c-jt9p6\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.441187 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.683678 4917 generic.go:334] "Generic (PLEG): container finished" podID="9cbf9993-0d83-499a-8cc5-11662e0641e1" containerID="2658026f893c6e07fb2028dc78145eb95678a1a39faeea5221b841ebdc338c12" exitCode=0 Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.683769 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-qnf9f" event={"ID":"9cbf9993-0d83-499a-8cc5-11662e0641e1","Type":"ContainerDied","Data":"2658026f893c6e07fb2028dc78145eb95678a1a39faeea5221b841ebdc338c12"} Mar 18 07:06:22 crc kubenswrapper[4917]: I0318 07:06:22.874098 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-jt9p6"] Mar 18 07:06:22 crc kubenswrapper[4917]: W0318 07:06:22.875794 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e393792_6f85_4c4b_adb9_0c889d9fcc51.slice/crio-473c54f07ca5dac3b96059ac57d7be8d346035bf3bbe13acc69eb281483d75f9 WatchSource:0}: Error finding container 473c54f07ca5dac3b96059ac57d7be8d346035bf3bbe13acc69eb281483d75f9: Status 404 returned error can't find the container with id 473c54f07ca5dac3b96059ac57d7be8d346035bf3bbe13acc69eb281483d75f9 Mar 18 07:06:23 crc kubenswrapper[4917]: I0318 07:06:23.694218 4917 generic.go:334] "Generic (PLEG): container finished" podID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" containerID="7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37" exitCode=0 Mar 18 07:06:23 crc kubenswrapper[4917]: I0318 07:06:23.694306 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" event={"ID":"8e393792-6f85-4c4b-adb9-0c889d9fcc51","Type":"ContainerDied","Data":"7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37"} Mar 18 07:06:23 crc kubenswrapper[4917]: I0318 07:06:23.694620 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" event={"ID":"8e393792-6f85-4c4b-adb9-0c889d9fcc51","Type":"ContainerStarted","Data":"473c54f07ca5dac3b96059ac57d7be8d346035bf3bbe13acc69eb281483d75f9"} Mar 18 07:06:23 crc kubenswrapper[4917]: I0318 07:06:23.694728 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" podUID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" containerName="dnsmasq-dns" containerID="cri-o://3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495" gracePeriod=10 Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.025553 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.103071 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-config-data\") pod \"9cbf9993-0d83-499a-8cc5-11662e0641e1\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.103162 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2c6w\" (UniqueName: \"kubernetes.io/projected/9cbf9993-0d83-499a-8cc5-11662e0641e1-kube-api-access-n2c6w\") pod \"9cbf9993-0d83-499a-8cc5-11662e0641e1\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.103199 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-combined-ca-bundle\") pod \"9cbf9993-0d83-499a-8cc5-11662e0641e1\" (UID: \"9cbf9993-0d83-499a-8cc5-11662e0641e1\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.109722 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9cbf9993-0d83-499a-8cc5-11662e0641e1-kube-api-access-n2c6w" (OuterVolumeSpecName: "kube-api-access-n2c6w") pod "9cbf9993-0d83-499a-8cc5-11662e0641e1" (UID: "9cbf9993-0d83-499a-8cc5-11662e0641e1"). InnerVolumeSpecName "kube-api-access-n2c6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.131291 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cbf9993-0d83-499a-8cc5-11662e0641e1" (UID: "9cbf9993-0d83-499a-8cc5-11662e0641e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.174771 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-config-data" (OuterVolumeSpecName: "config-data") pod "9cbf9993-0d83-499a-8cc5-11662e0641e1" (UID: "9cbf9993-0d83-499a-8cc5-11662e0641e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.207684 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.207720 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cbf9993-0d83-499a-8cc5-11662e0641e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.207732 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2c6w\" (UniqueName: \"kubernetes.io/projected/9cbf9993-0d83-499a-8cc5-11662e0641e1-kube-api-access-n2c6w\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.213299 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.309103 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-config\") pod \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.309152 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-swift-storage-0\") pod \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.309221 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-svc\") pod \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.309267 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-nb\") pod \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.309361 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rngzd\" (UniqueName: \"kubernetes.io/projected/ade4fee9-4c58-4961-9ffc-ff0153ca2837-kube-api-access-rngzd\") pod \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.309398 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-sb\") pod \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\" (UID: \"ade4fee9-4c58-4961-9ffc-ff0153ca2837\") " Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.313571 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ade4fee9-4c58-4961-9ffc-ff0153ca2837-kube-api-access-rngzd" (OuterVolumeSpecName: "kube-api-access-rngzd") pod "ade4fee9-4c58-4961-9ffc-ff0153ca2837" (UID: "ade4fee9-4c58-4961-9ffc-ff0153ca2837"). InnerVolumeSpecName "kube-api-access-rngzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.352253 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ade4fee9-4c58-4961-9ffc-ff0153ca2837" (UID: "ade4fee9-4c58-4961-9ffc-ff0153ca2837"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.358137 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ade4fee9-4c58-4961-9ffc-ff0153ca2837" (UID: "ade4fee9-4c58-4961-9ffc-ff0153ca2837"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.366255 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-config" (OuterVolumeSpecName: "config") pod "ade4fee9-4c58-4961-9ffc-ff0153ca2837" (UID: "ade4fee9-4c58-4961-9ffc-ff0153ca2837"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.370254 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ade4fee9-4c58-4961-9ffc-ff0153ca2837" (UID: "ade4fee9-4c58-4961-9ffc-ff0153ca2837"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.373571 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ade4fee9-4c58-4961-9ffc-ff0153ca2837" (UID: "ade4fee9-4c58-4961-9ffc-ff0153ca2837"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.412065 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.412149 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.412165 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.412176 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.412213 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ade4fee9-4c58-4961-9ffc-ff0153ca2837-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.412225 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rngzd\" (UniqueName: 
\"kubernetes.io/projected/ade4fee9-4c58-4961-9ffc-ff0153ca2837-kube-api-access-rngzd\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.703900 4917 generic.go:334] "Generic (PLEG): container finished" podID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" containerID="3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495" exitCode=0 Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.703954 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.703965 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" event={"ID":"ade4fee9-4c58-4961-9ffc-ff0153ca2837","Type":"ContainerDied","Data":"3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495"} Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.704000 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86cbdd8bfc-2xxch" event={"ID":"ade4fee9-4c58-4961-9ffc-ff0153ca2837","Type":"ContainerDied","Data":"2134c875270f753566bf2a3a6277f2100a2011b8c2dfd86b860f2fe9d9be2812"} Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.704016 4917 scope.go:117] "RemoveContainer" containerID="3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.706890 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" event={"ID":"8e393792-6f85-4c4b-adb9-0c889d9fcc51","Type":"ContainerStarted","Data":"195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88"} Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.707066 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.710846 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-qnf9f" event={"ID":"9cbf9993-0d83-499a-8cc5-11662e0641e1","Type":"ContainerDied","Data":"81710eceb56c622aba04229100eb1c9b3e6cfb02dc83af6db5eae5029857a040"} Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.710877 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81710eceb56c622aba04229100eb1c9b3e6cfb02dc83af6db5eae5029857a040" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.710934 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-qnf9f" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.730951 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" podStartSLOduration=2.730931163 podStartE2EDuration="2.730931163s" podCreationTimestamp="2026-03-18 07:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:24.730535324 +0000 UTC m=+1169.671690058" watchObservedRunningTime="2026-03-18 07:06:24.730931163 +0000 UTC m=+1169.672085897" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.733898 4917 scope.go:117] "RemoveContainer" containerID="0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.758125 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-2xxch"] Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.762800 4917 scope.go:117] "RemoveContainer" containerID="3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495" Mar 18 07:06:24 crc kubenswrapper[4917]: E0318 07:06:24.763434 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495\": container with ID starting with 
3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495 not found: ID does not exist" containerID="3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.763467 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495"} err="failed to get container status \"3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495\": rpc error: code = NotFound desc = could not find container \"3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495\": container with ID starting with 3b340485cc76f3b6b616e9fe176a722f1815fef0c29a35f56cdebcad3dd08495 not found: ID does not exist" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.763492 4917 scope.go:117] "RemoveContainer" containerID="0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393" Mar 18 07:06:24 crc kubenswrapper[4917]: E0318 07:06:24.763830 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393\": container with ID starting with 0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393 not found: ID does not exist" containerID="0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.763855 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393"} err="failed to get container status \"0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393\": rpc error: code = NotFound desc = could not find container \"0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393\": container with ID starting with 0ed1ed804ce5a07fb3628ffce42a1ddf2d1f0667395ce6a5d42756ad9da11393 not found: ID does not 
exist" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.764815 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86cbdd8bfc-2xxch"] Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.928478 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-jt9p6"] Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.974302 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-w6brj"] Mar 18 07:06:24 crc kubenswrapper[4917]: E0318 07:06:24.981165 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cbf9993-0d83-499a-8cc5-11662e0641e1" containerName="keystone-db-sync" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.981287 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cbf9993-0d83-499a-8cc5-11662e0641e1" containerName="keystone-db-sync" Mar 18 07:06:24 crc kubenswrapper[4917]: E0318 07:06:24.981375 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" containerName="init" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.981445 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" containerName="init" Mar 18 07:06:24 crc kubenswrapper[4917]: E0318 07:06:24.981521 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" containerName="dnsmasq-dns" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.981605 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" containerName="dnsmasq-dns" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.981865 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cbf9993-0d83-499a-8cc5-11662e0641e1" containerName="keystone-db-sync" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.981970 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" containerName="dnsmasq-dns" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.982698 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.986517 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.987494 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.987553 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5j76w" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.987719 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.988825 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84c7999457-4hmtp"] Mar 18 07:06:24 crc kubenswrapper[4917]: I0318 07:06:24.990131 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.002671 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.025043 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-4hmtp"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.108255 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w6brj"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124465 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-credential-keys\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124558 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-config\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124600 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-scripts\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124627 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-combined-ca-bundle\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124658 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-fernet-keys\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124687 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnxvh\" (UniqueName: \"kubernetes.io/projected/f034063d-84c1-430b-bdc3-ee3680f4ddf2-kube-api-access-bnxvh\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124712 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-nb\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124735 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-swift-storage-0\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124765 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-zssh9\" (UniqueName: \"kubernetes.io/projected/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-kube-api-access-zssh9\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124790 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-svc\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124819 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-sb\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.124844 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-config-data\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229408 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-combined-ca-bundle\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229465 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-fernet-keys\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229492 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnxvh\" (UniqueName: \"kubernetes.io/projected/f034063d-84c1-430b-bdc3-ee3680f4ddf2-kube-api-access-bnxvh\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229509 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-nb\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229533 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-swift-storage-0\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229556 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zssh9\" (UniqueName: \"kubernetes.io/projected/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-kube-api-access-zssh9\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229578 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-svc\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229616 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-sb\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229640 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-config-data\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229673 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-credential-keys\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229719 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-config\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.229740 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-scripts\") pod \"keystone-bootstrap-w6brj\" (UID: 
\"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.234013 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-svc\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.234330 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-sb\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.235004 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-config\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.240156 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-nb\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.240470 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-swift-storage-0\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc 
kubenswrapper[4917]: I0318 07:06:25.245008 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-combined-ca-bundle\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.245055 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-config-data\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.245839 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-scripts\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.246471 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-fernet-keys\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.247189 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-credential-keys\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.268161 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zssh9\" (UniqueName: 
\"kubernetes.io/projected/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-kube-api-access-zssh9\") pod \"keystone-bootstrap-w6brj\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.290369 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-9ntb4"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.291343 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.303866 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xqnr5" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.304298 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.309316 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnxvh\" (UniqueName: \"kubernetes.io/projected/f034063d-84c1-430b-bdc3-ee3680f4ddf2-kube-api-access-bnxvh\") pod \"dnsmasq-dns-84c7999457-4hmtp\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.315080 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.315539 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.339657 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9ntb4"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.341139 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.344025 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kqhm8"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.344979 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.353058 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.353142 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.353393 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5n4cs" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.398937 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kqhm8"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449074 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-combined-ca-bundle\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449355 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-scripts\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449381 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a06e756-cae4-44b6-8bed-4951e159e223-etc-machine-id\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449420 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thxpw\" (UniqueName: \"kubernetes.io/projected/0b1534f9-3185-43be-afb2-e37c921ff0e1-kube-api-access-thxpw\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449506 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zskjd\" (UniqueName: \"kubernetes.io/projected/8a06e756-cae4-44b6-8bed-4951e159e223-kube-api-access-zskjd\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449530 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-db-sync-config-data\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449602 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-config\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449759 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-combined-ca-bundle\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.449809 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-config-data\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.458435 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.477273 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.482036 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.498983 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.511214 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552566 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zskjd\" (UniqueName: \"kubernetes.io/projected/8a06e756-cae4-44b6-8bed-4951e159e223-kube-api-access-zskjd\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552624 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-db-sync-config-data\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552668 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-config\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552694 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-combined-ca-bundle\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " 
pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-config-data\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552799 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-combined-ca-bundle\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552816 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-scripts\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552833 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a06e756-cae4-44b6-8bed-4951e159e223-etc-machine-id\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.552853 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thxpw\" (UniqueName: \"kubernetes.io/projected/0b1534f9-3185-43be-afb2-e37c921ff0e1-kube-api-access-thxpw\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.553866 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a06e756-cae4-44b6-8bed-4951e159e223-etc-machine-id\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.554157 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-7vl5b"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.555163 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.557837 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-db-sync-config-data\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.559805 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-combined-ca-bundle\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.565807 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6v5mv" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.566114 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.572002 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-scripts\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " 
pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.575878 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-config-data\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.589390 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-config\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.589806 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-combined-ca-bundle\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.593426 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thxpw\" (UniqueName: \"kubernetes.io/projected/0b1534f9-3185-43be-afb2-e37c921ff0e1-kube-api-access-thxpw\") pod \"neutron-db-sync-kqhm8\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.595464 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zskjd\" (UniqueName: \"kubernetes.io/projected/8a06e756-cae4-44b6-8bed-4951e159e223-kube-api-access-zskjd\") pod \"cinder-db-sync-9ntb4\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.596594 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-84c7999457-4hmtp"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.652112 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7vl5b"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654421 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z427\" (UniqueName: \"kubernetes.io/projected/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-kube-api-access-8z427\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654473 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654503 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-combined-ca-bundle\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654534 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc9n6\" (UniqueName: \"kubernetes.io/projected/e4fc471f-181e-4ace-b02d-beb73c8ea737-kube-api-access-kc9n6\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654557 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-config-data\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654619 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-db-sync-config-data\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654657 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-scripts\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654670 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-run-httpd\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654715 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-log-httpd\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.654730 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.672058 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.681711 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rw57h"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.682826 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.684624 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.685480 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zkqh7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.694745 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.716224 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rw57h"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.740432 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-v7jt7"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.741844 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.761870 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-scripts\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.761911 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-run-httpd\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.762040 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-log-httpd\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.762059 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.762119 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z427\" (UniqueName: \"kubernetes.io/projected/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-kube-api-access-8z427\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.762173 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.762227 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-combined-ca-bundle\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.762287 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc9n6\" (UniqueName: \"kubernetes.io/projected/e4fc471f-181e-4ace-b02d-beb73c8ea737-kube-api-access-kc9n6\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.762320 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-config-data\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.762385 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-db-sync-config-data\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.765617 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-run-httpd\") pod \"ceilometer-0\" (UID: 
\"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.766514 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-log-httpd\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.768272 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-config-data\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.768413 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-v7jt7"] Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.781333 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.781878 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-db-sync-config-data\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.781934 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" 
Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.784703 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc9n6\" (UniqueName: \"kubernetes.io/projected/e4fc471f-181e-4ace-b02d-beb73c8ea737-kube-api-access-kc9n6\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.790371 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-scripts\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.790816 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z427\" (UniqueName: \"kubernetes.io/projected/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-kube-api-access-8z427\") pod \"ceilometer-0\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " pod="openstack/ceilometer-0" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.791239 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-combined-ca-bundle\") pod \"barbican-db-sync-7vl5b\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.796774 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.799317 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ade4fee9-4c58-4961-9ffc-ff0153ca2837" path="/var/lib/kubelet/pods/ade4fee9-4c58-4961-9ffc-ff0153ca2837/volumes" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.865939 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21785c0a-645b-4a17-bfdf-cbb1167f5361-logs\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.865988 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njvbw\" (UniqueName: \"kubernetes.io/projected/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-kube-api-access-njvbw\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.866019 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.866055 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-svc\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.866100 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.866130 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.866149 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r2rg\" (UniqueName: \"kubernetes.io/projected/21785c0a-645b-4a17-bfdf-cbb1167f5361-kube-api-access-2r2rg\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.866197 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-config\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.866213 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-config-data\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 
07:06:25.866228 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-scripts\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.866261 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-combined-ca-bundle\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.889711 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968071 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21785c0a-645b-4a17-bfdf-cbb1167f5361-logs\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968124 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njvbw\" (UniqueName: \"kubernetes.io/projected/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-kube-api-access-njvbw\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968143 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: 
\"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968193 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-svc\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968217 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968245 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968270 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r2rg\" (UniqueName: \"kubernetes.io/projected/21785c0a-645b-4a17-bfdf-cbb1167f5361-kube-api-access-2r2rg\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968310 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-config\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " 
pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968330 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-config-data\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968348 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-scripts\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968372 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-combined-ca-bundle\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.968690 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21785c0a-645b-4a17-bfdf-cbb1167f5361-logs\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.969098 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-swift-storage-0\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.969286 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-sb\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.969831 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-svc\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.970057 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-nb\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.970466 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-config\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.975441 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-scripts\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.977627 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-config-data\") pod 
\"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.977906 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-combined-ca-bundle\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.988363 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njvbw\" (UniqueName: \"kubernetes.io/projected/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-kube-api-access-njvbw\") pod \"dnsmasq-dns-5d5dc7cf69-v7jt7\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:25 crc kubenswrapper[4917]: I0318 07:06:25.992949 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r2rg\" (UniqueName: \"kubernetes.io/projected/21785c0a-645b-4a17-bfdf-cbb1167f5361-kube-api-access-2r2rg\") pod \"placement-db-sync-rw57h\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.004978 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.049796 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-w6brj"] Mar 18 07:06:26 crc kubenswrapper[4917]: W0318 07:06:26.059156 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ce61a38_b1f2_4f48_a0ad_a5e4a9df038e.slice/crio-efe403ce2c9a732a6927b34a9c9bc834fbb07169a7eb079b9acb59b62952b471 WatchSource:0}: Error finding container efe403ce2c9a732a6927b34a9c9bc834fbb07169a7eb079b9acb59b62952b471: Status 404 returned error can't find the container with id efe403ce2c9a732a6927b34a9c9bc834fbb07169a7eb079b9acb59b62952b471 Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.110838 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.131981 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-4hmtp"] Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.136410 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:26 crc kubenswrapper[4917]: W0318 07:06:26.146453 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf034063d_84c1_430b_bdc3_ee3680f4ddf2.slice/crio-ff2bad8a61a827d823a42e2d54720393ece39847bbdbb031fbb93c61d2b7a7e6 WatchSource:0}: Error finding container ff2bad8a61a827d823a42e2d54720393ece39847bbdbb031fbb93c61d2b7a7e6: Status 404 returned error can't find the container with id ff2bad8a61a827d823a42e2d54720393ece39847bbdbb031fbb93c61d2b7a7e6 Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.173129 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.178538 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.180725 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.180899 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.181191 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.181518 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bstgs" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.189348 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.255047 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-9ntb4"] Mar 18 07:06:26 crc 
kubenswrapper[4917]: I0318 07:06:26.273784 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-scripts\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.273827 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.273858 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.273875 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-logs\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.273890 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc 
kubenswrapper[4917]: I0318 07:06:26.275049 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.275102 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b28lf\" (UniqueName: \"kubernetes.io/projected/2dff3ce0-1e17-49e6-b522-60e313715a90-kube-api-access-b28lf\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.275257 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.375679 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7vl5b"] Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376368 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376404 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-logs\") pod 
\"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376422 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376459 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376484 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b28lf\" (UniqueName: \"kubernetes.io/projected/2dff3ce0-1e17-49e6-b522-60e313715a90-kube-api-access-b28lf\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376541 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376674 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376936 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.377079 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.377999 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.376848 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-logs\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.383469 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-scripts\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 
18 07:06:26 crc kubenswrapper[4917]: W0318 07:06:26.384618 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4fc471f_181e_4ace_b02d_beb73c8ea737.slice/crio-b5527babdf4b5b25487bd108df9daf658f0f90514ba87beba0ae3f1712df9645 WatchSource:0}: Error finding container b5527babdf4b5b25487bd108df9daf658f0f90514ba87beba0ae3f1712df9645: Status 404 returned error can't find the container with id b5527babdf4b5b25487bd108df9daf658f0f90514ba87beba0ae3f1712df9645 Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.384758 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.386356 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-config-data\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.400495 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.402859 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b28lf\" (UniqueName: \"kubernetes.io/projected/2dff3ce0-1e17-49e6-b522-60e313715a90-kube-api-access-b28lf\") pod \"glance-default-external-api-0\" (UID: 
\"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.407621 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.449322 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kqhm8"] Mar 18 07:06:26 crc kubenswrapper[4917]: W0318 07:06:26.450520 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b1534f9_3185_43be_afb2_e37c921ff0e1.slice/crio-3b185c811e94db24d592ad475cc231f2233faa6a6ca00ecc912f2cf037ea027e WatchSource:0}: Error finding container 3b185c811e94db24d592ad475cc231f2233faa6a6ca00ecc912f2cf037ea027e: Status 404 returned error can't find the container with id 3b185c811e94db24d592ad475cc231f2233faa6a6ca00ecc912f2cf037ea027e Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.519805 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.530618 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.532227 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.535437 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.535928 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.577674 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:26 crc kubenswrapper[4917]: W0318 07:06:26.596904 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeac5da00_c2bc_4ab2_ace7_f7a29f9af50c.slice/crio-fe3b5e7ad7880b1fbfd76a5b0209043417f76c25d32926dfe7c63093ef51afd5 WatchSource:0}: Error finding container fe3b5e7ad7880b1fbfd76a5b0209043417f76c25d32926dfe7c63093ef51afd5: Status 404 returned error can't find the container with id fe3b5e7ad7880b1fbfd76a5b0209043417f76c25d32926dfe7c63093ef51afd5 Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.628160 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.682875 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.682925 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.682958 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m55hf\" (UniqueName: \"kubernetes.io/projected/6a57ac88-b164-4262-96a3-831a1edd3300-kube-api-access-m55hf\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.682981 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.683011 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.683057 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.683090 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.683132 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.725992 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rw57h"] Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.749504 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rw57h" event={"ID":"21785c0a-645b-4a17-bfdf-cbb1167f5361","Type":"ContainerStarted","Data":"f63a1bd9f2129c3ba2f24be59426ea6c73e805ec6363febf88c2e7cdcd4b2adc"} Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.752775 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w6brj" event={"ID":"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e","Type":"ContainerStarted","Data":"95a4a889f876df802ea135e6c177b8ee43bb49af49b9ca4d3a6a07e00df978be"} Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.753163 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w6brj" event={"ID":"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e","Type":"ContainerStarted","Data":"efe403ce2c9a732a6927b34a9c9bc834fbb07169a7eb079b9acb59b62952b471"} Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.754049 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7vl5b" event={"ID":"e4fc471f-181e-4ace-b02d-beb73c8ea737","Type":"ContainerStarted","Data":"b5527babdf4b5b25487bd108df9daf658f0f90514ba87beba0ae3f1712df9645"} 
Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.755346 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9ntb4" event={"ID":"8a06e756-cae4-44b6-8bed-4951e159e223","Type":"ContainerStarted","Data":"3672006b6f9c924ec71ed496e2107e4e7ea0bcf489683d5964f4de7cd6051e8f"} Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.756890 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerStarted","Data":"fe3b5e7ad7880b1fbfd76a5b0209043417f76c25d32926dfe7c63093ef51afd5"} Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.759171 4917 generic.go:334] "Generic (PLEG): container finished" podID="f034063d-84c1-430b-bdc3-ee3680f4ddf2" containerID="a8147eb9a24d9e689ceba3da492ee15db69a27f3cb23b5730252950ab9bea700" exitCode=0 Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.759230 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c7999457-4hmtp" event={"ID":"f034063d-84c1-430b-bdc3-ee3680f4ddf2","Type":"ContainerDied","Data":"a8147eb9a24d9e689ceba3da492ee15db69a27f3cb23b5730252950ab9bea700"} Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.759254 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c7999457-4hmtp" event={"ID":"f034063d-84c1-430b-bdc3-ee3680f4ddf2","Type":"ContainerStarted","Data":"ff2bad8a61a827d823a42e2d54720393ece39847bbdbb031fbb93c61d2b7a7e6"} Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.770183 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" podUID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" containerName="dnsmasq-dns" containerID="cri-o://195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88" gracePeriod=10 Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.770400 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-kqhm8" event={"ID":"0b1534f9-3185-43be-afb2-e37c921ff0e1","Type":"ContainerStarted","Data":"3b185c811e94db24d592ad475cc231f2233faa6a6ca00ecc912f2cf037ea027e"} Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.770861 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-w6brj" podStartSLOduration=2.770842774 podStartE2EDuration="2.770842774s" podCreationTimestamp="2026-03-18 07:06:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:26.764965584 +0000 UTC m=+1171.706120308" watchObservedRunningTime="2026-03-18 07:06:26.770842774 +0000 UTC m=+1171.711997508" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786210 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786291 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786363 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786392 4917 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786449 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786477 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m55hf\" (UniqueName: \"kubernetes.io/projected/6a57ac88-b164-4262-96a3-831a1edd3300-kube-api-access-m55hf\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786526 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786552 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786627 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.786862 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.789009 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.805495 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m55hf\" (UniqueName: \"kubernetes.io/projected/6a57ac88-b164-4262-96a3-831a1edd3300-kube-api-access-m55hf\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.806813 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.808479 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-config-data\") 
pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.809638 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.809699 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.819922 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:26 crc kubenswrapper[4917]: I0318 07:06:26.874267 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-v7jt7"] Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.001918 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.114743 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.203606 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-config\") pod \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.203680 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-svc\") pod \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.203732 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-nb\") pod \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.203833 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-sb\") pod \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.203873 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnxvh\" (UniqueName: \"kubernetes.io/projected/f034063d-84c1-430b-bdc3-ee3680f4ddf2-kube-api-access-bnxvh\") pod \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.203969 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-swift-storage-0\") pod \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\" (UID: \"f034063d-84c1-430b-bdc3-ee3680f4ddf2\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.219103 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.241556 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f034063d-84c1-430b-bdc3-ee3680f4ddf2" (UID: "f034063d-84c1-430b-bdc3-ee3680f4ddf2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.244860 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f034063d-84c1-430b-bdc3-ee3680f4ddf2" (UID: "f034063d-84c1-430b-bdc3-ee3680f4ddf2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.255011 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f034063d-84c1-430b-bdc3-ee3680f4ddf2-kube-api-access-bnxvh" (OuterVolumeSpecName: "kube-api-access-bnxvh") pod "f034063d-84c1-430b-bdc3-ee3680f4ddf2" (UID: "f034063d-84c1-430b-bdc3-ee3680f4ddf2"). InnerVolumeSpecName "kube-api-access-bnxvh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.282540 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f034063d-84c1-430b-bdc3-ee3680f4ddf2" (UID: "f034063d-84c1-430b-bdc3-ee3680f4ddf2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.298067 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f034063d-84c1-430b-bdc3-ee3680f4ddf2" (UID: "f034063d-84c1-430b-bdc3-ee3680f4ddf2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.305378 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.305402 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.305412 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.305421 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnxvh\" (UniqueName: \"kubernetes.io/projected/f034063d-84c1-430b-bdc3-ee3680f4ddf2-kube-api-access-bnxvh\") on 
node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.305431 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.316850 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-config" (OuterVolumeSpecName: "config") pod "f034063d-84c1-430b-bdc3-ee3680f4ddf2" (UID: "f034063d-84c1-430b-bdc3-ee3680f4ddf2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.379492 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.408973 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f034063d-84c1-430b-bdc3-ee3680f4ddf2-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.510030 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-swift-storage-0\") pod \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.510098 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk2s6\" (UniqueName: \"kubernetes.io/projected/8e393792-6f85-4c4b-adb9-0c889d9fcc51-kube-api-access-pk2s6\") pod \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.510198 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-svc\") pod \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.510282 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-nb\") pod \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.510312 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-sb\") pod \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.510404 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-config\") pod \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\" (UID: \"8e393792-6f85-4c4b-adb9-0c889d9fcc51\") " Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.517752 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e393792-6f85-4c4b-adb9-0c889d9fcc51-kube-api-access-pk2s6" (OuterVolumeSpecName: "kube-api-access-pk2s6") pod "8e393792-6f85-4c4b-adb9-0c889d9fcc51" (UID: "8e393792-6f85-4c4b-adb9-0c889d9fcc51"). InnerVolumeSpecName "kube-api-access-pk2s6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.590991 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8e393792-6f85-4c4b-adb9-0c889d9fcc51" (UID: "8e393792-6f85-4c4b-adb9-0c889d9fcc51"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.608852 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8e393792-6f85-4c4b-adb9-0c889d9fcc51" (UID: "8e393792-6f85-4c4b-adb9-0c889d9fcc51"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.611626 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-config" (OuterVolumeSpecName: "config") pod "8e393792-6f85-4c4b-adb9-0c889d9fcc51" (UID: "8e393792-6f85-4c4b-adb9-0c889d9fcc51"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.613292 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.613318 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.613328 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk2s6\" (UniqueName: \"kubernetes.io/projected/8e393792-6f85-4c4b-adb9-0c889d9fcc51-kube-api-access-pk2s6\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.613362 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.643429 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e393792-6f85-4c4b-adb9-0c889d9fcc51" (UID: "8e393792-6f85-4c4b-adb9-0c889d9fcc51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.655430 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8e393792-6f85-4c4b-adb9-0c889d9fcc51" (UID: "8e393792-6f85-4c4b-adb9-0c889d9fcc51"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.672181 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.715029 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.715053 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8e393792-6f85-4c4b-adb9-0c889d9fcc51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.789196 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a57ac88-b164-4262-96a3-831a1edd3300","Type":"ContainerStarted","Data":"e6bd0ed01775040f6d5a16423eac3ada5f986db1226b4a4916331d4b17068b87"} Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.798360 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dff3ce0-1e17-49e6-b522-60e313715a90","Type":"ContainerStarted","Data":"6b355e18d665d6e09e1517ec7a3cbfb8e655d57864e3baf015ee5383214acb97"} Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.815323 4917 generic.go:334] "Generic (PLEG): container finished" podID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" containerID="195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88" exitCode=0 Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.815447 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.816169 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" event={"ID":"8e393792-6f85-4c4b-adb9-0c889d9fcc51","Type":"ContainerDied","Data":"195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88"} Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.816194 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d88577c8c-jt9p6" event={"ID":"8e393792-6f85-4c4b-adb9-0c889d9fcc51","Type":"ContainerDied","Data":"473c54f07ca5dac3b96059ac57d7be8d346035bf3bbe13acc69eb281483d75f9"} Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.816209 4917 scope.go:117] "RemoveContainer" containerID="195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.840648 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-jt9p6"] Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.846921 4917 generic.go:334] "Generic (PLEG): container finished" podID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" containerID="cf5781c9d765e5bfe3262ba2904bdfdaf37927348ea89c5f9645b88ff56261f9" exitCode=0 Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.847033 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" event={"ID":"217c88fe-a91f-4cc2-886e-1489ffa9d5b6","Type":"ContainerDied","Data":"cf5781c9d765e5bfe3262ba2904bdfdaf37927348ea89c5f9645b88ff56261f9"} Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.847105 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" event={"ID":"217c88fe-a91f-4cc2-886e-1489ffa9d5b6","Type":"ContainerStarted","Data":"ddb58da5c118656446ea670d64be4f15a6d614e8507b698025ab4c8c689252cf"} Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.848196 4917 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d88577c8c-jt9p6"] Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.867443 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84c7999457-4hmtp" event={"ID":"f034063d-84c1-430b-bdc3-ee3680f4ddf2","Type":"ContainerDied","Data":"ff2bad8a61a827d823a42e2d54720393ece39847bbdbb031fbb93c61d2b7a7e6"} Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.867567 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84c7999457-4hmtp" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.874138 4917 scope.go:117] "RemoveContainer" containerID="7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.874253 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kqhm8" event={"ID":"0b1534f9-3185-43be-afb2-e37c921ff0e1","Type":"ContainerStarted","Data":"d3940946653763a23bb02d86f1beb1ca5e1898736d5352253b4ad9d1636ec614"} Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.893824 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kqhm8" podStartSLOduration=2.893801814 podStartE2EDuration="2.893801814s" podCreationTimestamp="2026-03-18 07:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:27.892699668 +0000 UTC m=+1172.833854382" watchObservedRunningTime="2026-03-18 07:06:27.893801814 +0000 UTC m=+1172.834956538" Mar 18 07:06:27 crc kubenswrapper[4917]: I0318 07:06:27.983793 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-4hmtp"] Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.013133 4917 scope.go:117] "RemoveContainer" containerID="195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88" Mar 18 
07:06:28 crc kubenswrapper[4917]: E0318 07:06:28.014049 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88\": container with ID starting with 195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88 not found: ID does not exist" containerID="195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88" Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.014111 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88"} err="failed to get container status \"195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88\": rpc error: code = NotFound desc = could not find container \"195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88\": container with ID starting with 195033ae5d7bf45a89af4e681b06191032c9db2495da04bce4e545a20d64df88 not found: ID does not exist" Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.014140 4917 scope.go:117] "RemoveContainer" containerID="7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37" Mar 18 07:06:28 crc kubenswrapper[4917]: E0318 07:06:28.016019 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37\": container with ID starting with 7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37 not found: ID does not exist" containerID="7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37" Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.016070 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37"} err="failed to get container status 
\"7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37\": rpc error: code = NotFound desc = could not find container \"7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37\": container with ID starting with 7cbaf33f7db4cb6b571d8856746b898095219f7818e4c67a716ce4831efccc37 not found: ID does not exist" Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.016100 4917 scope.go:117] "RemoveContainer" containerID="a8147eb9a24d9e689ceba3da492ee15db69a27f3cb23b5730252950ab9bea700" Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.029435 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84c7999457-4hmtp"] Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.230486 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.315995 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.322652 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.890843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a57ac88-b164-4262-96a3-831a1edd3300","Type":"ContainerStarted","Data":"d7fda4c624d0f12380592311bd79f35b4ef10cd75a9ecd31c39446814dc858e8"} Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.894273 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dff3ce0-1e17-49e6-b522-60e313715a90","Type":"ContainerStarted","Data":"8947646032ac130d6d82e9729d624da2decdbafb3e1b3fd1c8c5c5a8765ee433"} Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.905685 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" 
event={"ID":"217c88fe-a91f-4cc2-886e-1489ffa9d5b6","Type":"ContainerStarted","Data":"7a7a6c6de4ff0537660534db476062b80782d735d2278015f3c6c3958c54f567"} Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.905820 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:28 crc kubenswrapper[4917]: I0318 07:06:28.926545 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" podStartSLOduration=3.92652976 podStartE2EDuration="3.92652976s" podCreationTimestamp="2026-03-18 07:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:28.922179197 +0000 UTC m=+1173.863333911" watchObservedRunningTime="2026-03-18 07:06:28.92652976 +0000 UTC m=+1173.867684474" Mar 18 07:06:29 crc kubenswrapper[4917]: I0318 07:06:29.787776 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" path="/var/lib/kubelet/pods/8e393792-6f85-4c4b-adb9-0c889d9fcc51/volumes" Mar 18 07:06:29 crc kubenswrapper[4917]: I0318 07:06:29.789202 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f034063d-84c1-430b-bdc3-ee3680f4ddf2" path="/var/lib/kubelet/pods/f034063d-84c1-430b-bdc3-ee3680f4ddf2/volumes" Mar 18 07:06:29 crc kubenswrapper[4917]: I0318 07:06:29.916546 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerName="glance-log" containerID="cri-o://8947646032ac130d6d82e9729d624da2decdbafb3e1b3fd1c8c5c5a8765ee433" gracePeriod=30 Mar 18 07:06:29 crc kubenswrapper[4917]: I0318 07:06:29.916787 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"2dff3ce0-1e17-49e6-b522-60e313715a90","Type":"ContainerStarted","Data":"2af8f84f7bc1cd40db3830a731de6e6f43b9a25275330e49dd3728c5fe16ab8d"} Mar 18 07:06:29 crc kubenswrapper[4917]: I0318 07:06:29.917012 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerName="glance-httpd" containerID="cri-o://2af8f84f7bc1cd40db3830a731de6e6f43b9a25275330e49dd3728c5fe16ab8d" gracePeriod=30 Mar 18 07:06:29 crc kubenswrapper[4917]: I0318 07:06:29.936436 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.93641839 podStartE2EDuration="4.93641839s" podCreationTimestamp="2026-03-18 07:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:29.935544689 +0000 UTC m=+1174.876699403" watchObservedRunningTime="2026-03-18 07:06:29.93641839 +0000 UTC m=+1174.877573094" Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.932980 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a57ac88-b164-4262-96a3-831a1edd3300","Type":"ContainerStarted","Data":"4280db20c7778638489dfbb570028af9c6927d2527ba2d3699e7adc4d967c788"} Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.933524 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" containerName="glance-log" containerID="cri-o://d7fda4c624d0f12380592311bd79f35b4ef10cd75a9ecd31c39446814dc858e8" gracePeriod=30 Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.935101 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" 
containerName="glance-httpd" containerID="cri-o://4280db20c7778638489dfbb570028af9c6927d2527ba2d3699e7adc4d967c788" gracePeriod=30 Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.939144 4917 generic.go:334] "Generic (PLEG): container finished" podID="3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" containerID="95a4a889f876df802ea135e6c177b8ee43bb49af49b9ca4d3a6a07e00df978be" exitCode=0 Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.939230 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w6brj" event={"ID":"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e","Type":"ContainerDied","Data":"95a4a889f876df802ea135e6c177b8ee43bb49af49b9ca4d3a6a07e00df978be"} Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.951250 4917 generic.go:334] "Generic (PLEG): container finished" podID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerID="2af8f84f7bc1cd40db3830a731de6e6f43b9a25275330e49dd3728c5fe16ab8d" exitCode=0 Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.951277 4917 generic.go:334] "Generic (PLEG): container finished" podID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerID="8947646032ac130d6d82e9729d624da2decdbafb3e1b3fd1c8c5c5a8765ee433" exitCode=143 Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.951299 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dff3ce0-1e17-49e6-b522-60e313715a90","Type":"ContainerDied","Data":"2af8f84f7bc1cd40db3830a731de6e6f43b9a25275330e49dd3728c5fe16ab8d"} Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.951322 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dff3ce0-1e17-49e6-b522-60e313715a90","Type":"ContainerDied","Data":"8947646032ac130d6d82e9729d624da2decdbafb3e1b3fd1c8c5c5a8765ee433"} Mar 18 07:06:30 crc kubenswrapper[4917]: I0318 07:06:30.979165 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.979142474 podStartE2EDuration="5.979142474s" podCreationTimestamp="2026-03-18 07:06:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:30.975030526 +0000 UTC m=+1175.916185250" watchObservedRunningTime="2026-03-18 07:06:30.979142474 +0000 UTC m=+1175.920297198" Mar 18 07:06:31 crc kubenswrapper[4917]: I0318 07:06:31.985778 4917 generic.go:334] "Generic (PLEG): container finished" podID="6a57ac88-b164-4262-96a3-831a1edd3300" containerID="4280db20c7778638489dfbb570028af9c6927d2527ba2d3699e7adc4d967c788" exitCode=0 Mar 18 07:06:31 crc kubenswrapper[4917]: I0318 07:06:31.986128 4917 generic.go:334] "Generic (PLEG): container finished" podID="6a57ac88-b164-4262-96a3-831a1edd3300" containerID="d7fda4c624d0f12380592311bd79f35b4ef10cd75a9ecd31c39446814dc858e8" exitCode=143 Mar 18 07:06:31 crc kubenswrapper[4917]: I0318 07:06:31.985874 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a57ac88-b164-4262-96a3-831a1edd3300","Type":"ContainerDied","Data":"4280db20c7778638489dfbb570028af9c6927d2527ba2d3699e7adc4d967c788"} Mar 18 07:06:31 crc kubenswrapper[4917]: I0318 07:06:31.986244 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a57ac88-b164-4262-96a3-831a1edd3300","Type":"ContainerDied","Data":"d7fda4c624d0f12380592311bd79f35b4ef10cd75a9ecd31c39446814dc858e8"} Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.371637 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.383289 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.500645 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-logs\") pod \"2dff3ce0-1e17-49e6-b522-60e313715a90\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.500955 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-combined-ca-bundle\") pod \"2dff3ce0-1e17-49e6-b522-60e313715a90\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.500987 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-scripts\") pod \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501013 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"2dff3ce0-1e17-49e6-b522-60e313715a90\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501046 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-credential-keys\") pod \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501065 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zssh9\" (UniqueName: 
\"kubernetes.io/projected/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-kube-api-access-zssh9\") pod \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501088 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-combined-ca-bundle\") pod \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501115 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-config-data\") pod \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501132 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-scripts\") pod \"2dff3ce0-1e17-49e6-b522-60e313715a90\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501172 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-public-tls-certs\") pod \"2dff3ce0-1e17-49e6-b522-60e313715a90\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501197 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b28lf\" (UniqueName: \"kubernetes.io/projected/2dff3ce0-1e17-49e6-b522-60e313715a90-kube-api-access-b28lf\") pod \"2dff3ce0-1e17-49e6-b522-60e313715a90\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 
07:06:32.501225 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-config-data\") pod \"2dff3ce0-1e17-49e6-b522-60e313715a90\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501256 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-httpd-run\") pod \"2dff3ce0-1e17-49e6-b522-60e313715a90\" (UID: \"2dff3ce0-1e17-49e6-b522-60e313715a90\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501312 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-fernet-keys\") pod \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\" (UID: \"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e\") " Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501401 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-logs" (OuterVolumeSpecName: "logs") pod "2dff3ce0-1e17-49e6-b522-60e313715a90" (UID: "2dff3ce0-1e17-49e6-b522-60e313715a90"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.501682 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.505955 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2dff3ce0-1e17-49e6-b522-60e313715a90" (UID: "2dff3ce0-1e17-49e6-b522-60e313715a90"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.507842 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" (UID: "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.508554 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-scripts" (OuterVolumeSpecName: "scripts") pod "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" (UID: "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.508877 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "2dff3ce0-1e17-49e6-b522-60e313715a90" (UID: "2dff3ce0-1e17-49e6-b522-60e313715a90"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.513911 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dff3ce0-1e17-49e6-b522-60e313715a90-kube-api-access-b28lf" (OuterVolumeSpecName: "kube-api-access-b28lf") pod "2dff3ce0-1e17-49e6-b522-60e313715a90" (UID: "2dff3ce0-1e17-49e6-b522-60e313715a90"). InnerVolumeSpecName "kube-api-access-b28lf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.517578 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-kube-api-access-zssh9" (OuterVolumeSpecName: "kube-api-access-zssh9") pod "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" (UID: "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e"). InnerVolumeSpecName "kube-api-access-zssh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.518636 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-scripts" (OuterVolumeSpecName: "scripts") pod "2dff3ce0-1e17-49e6-b522-60e313715a90" (UID: "2dff3ce0-1e17-49e6-b522-60e313715a90"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.524809 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" (UID: "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.580352 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-config-data" (OuterVolumeSpecName: "config-data") pod "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" (UID: "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.582557 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-config-data" (OuterVolumeSpecName: "config-data") pod "2dff3ce0-1e17-49e6-b522-60e313715a90" (UID: "2dff3ce0-1e17-49e6-b522-60e313715a90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.590535 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dff3ce0-1e17-49e6-b522-60e313715a90" (UID: "2dff3ce0-1e17-49e6-b522-60e313715a90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603658 4917 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603688 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603702 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603735 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 07:06:32 crc 
kubenswrapper[4917]: I0318 07:06:32.603749 4917 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603760 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zssh9\" (UniqueName: \"kubernetes.io/projected/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-kube-api-access-zssh9\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603772 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603783 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603793 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b28lf\" (UniqueName: \"kubernetes.io/projected/2dff3ce0-1e17-49e6-b522-60e313715a90-kube-api-access-b28lf\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603803 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.603813 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2dff3ce0-1e17-49e6-b522-60e313715a90-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.620675 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" (UID: "3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.639928 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.647813 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2dff3ce0-1e17-49e6-b522-60e313715a90" (UID: "2dff3ce0-1e17-49e6-b522-60e313715a90"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.705710 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.705748 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.705763 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dff3ce0-1e17-49e6-b522-60e313715a90-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.998037 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-w6brj" 
event={"ID":"3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e","Type":"ContainerDied","Data":"efe403ce2c9a732a6927b34a9c9bc834fbb07169a7eb079b9acb59b62952b471"} Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.998080 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efe403ce2c9a732a6927b34a9c9bc834fbb07169a7eb079b9acb59b62952b471" Mar 18 07:06:32 crc kubenswrapper[4917]: I0318 07:06:32.998146 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-w6brj" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.003000 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.002930 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2dff3ce0-1e17-49e6-b522-60e313715a90","Type":"ContainerDied","Data":"6b355e18d665d6e09e1517ec7a3cbfb8e655d57864e3baf015ee5383214acb97"} Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.003241 4917 scope.go:117] "RemoveContainer" containerID="2af8f84f7bc1cd40db3830a731de6e6f43b9a25275330e49dd3728c5fe16ab8d" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.052031 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.075674 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103085 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:33 crc kubenswrapper[4917]: E0318 07:06:33.103461 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerName="glance-log" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103479 4917 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerName="glance-log" Mar 18 07:06:33 crc kubenswrapper[4917]: E0318 07:06:33.103492 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" containerName="keystone-bootstrap" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103499 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" containerName="keystone-bootstrap" Mar 18 07:06:33 crc kubenswrapper[4917]: E0318 07:06:33.103509 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerName="glance-httpd" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103515 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerName="glance-httpd" Mar 18 07:06:33 crc kubenswrapper[4917]: E0318 07:06:33.103534 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" containerName="init" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103540 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" containerName="init" Mar 18 07:06:33 crc kubenswrapper[4917]: E0318 07:06:33.103550 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" containerName="dnsmasq-dns" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103556 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" containerName="dnsmasq-dns" Mar 18 07:06:33 crc kubenswrapper[4917]: E0318 07:06:33.103565 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f034063d-84c1-430b-bdc3-ee3680f4ddf2" containerName="init" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103571 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f034063d-84c1-430b-bdc3-ee3680f4ddf2" containerName="init" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103725 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e393792-6f85-4c4b-adb9-0c889d9fcc51" containerName="dnsmasq-dns" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103738 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerName="glance-log" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103750 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" containerName="keystone-bootstrap" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103762 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" containerName="glance-httpd" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.103769 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f034063d-84c1-430b-bdc3-ee3680f4ddf2" containerName="init" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.104607 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.106798 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.107088 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.113998 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.123505 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-w6brj"] Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.137886 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-w6brj"] Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.180376 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wpl6r"] Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.181399 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.185907 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.187547 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5j76w" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.187570 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.187875 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.187954 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.198826 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wpl6r"] Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.213951 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.213990 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-config-data\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.214012 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-8fw5j\" (UniqueName: \"kubernetes.io/projected/4afa7651-3ebd-4549-a15e-1b3d9d5537db-kube-api-access-8fw5j\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.214031 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.214065 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.214091 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-logs\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.214343 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-scripts\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.214396 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.315738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.315790 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-config-data\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.315816 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-logs\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.315836 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-scripts\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.315880 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-scripts\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.315919 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-fernet-keys\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.315977 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316014 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316035 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-credential-keys\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316060 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-config-data\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316093 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fw5j\" (UniqueName: \"kubernetes.io/projected/4afa7651-3ebd-4549-a15e-1b3d9d5537db-kube-api-access-8fw5j\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316116 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-combined-ca-bundle\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316140 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzgls\" (UniqueName: \"kubernetes.io/projected/16611612-0229-4cd5-9877-ccccc8bf60de-kube-api-access-jzgls\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316161 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316302 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316425 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-logs\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.316751 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.323766 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.323800 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-scripts\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.324259 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.336606 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.338776 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fw5j\" (UniqueName: \"kubernetes.io/projected/4afa7651-3ebd-4549-a15e-1b3d9d5537db-kube-api-access-8fw5j\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.349374 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") " pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.417952 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-config-data\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.418013 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-scripts\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc 
kubenswrapper[4917]: I0318 07:06:33.418081 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-fernet-keys\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.418127 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-credential-keys\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.418151 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-combined-ca-bundle\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.418169 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzgls\" (UniqueName: \"kubernetes.io/projected/16611612-0229-4cd5-9877-ccccc8bf60de-kube-api-access-jzgls\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.419155 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.421715 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-credential-keys\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.422523 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-config-data\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.422735 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-combined-ca-bundle\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.423369 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-fernet-keys\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.423938 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-scripts\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.433215 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzgls\" (UniqueName: \"kubernetes.io/projected/16611612-0229-4cd5-9877-ccccc8bf60de-kube-api-access-jzgls\") pod \"keystone-bootstrap-wpl6r\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.500537 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.789980 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dff3ce0-1e17-49e6-b522-60e313715a90" path="/var/lib/kubelet/pods/2dff3ce0-1e17-49e6-b522-60e313715a90/volumes" Mar 18 07:06:33 crc kubenswrapper[4917]: I0318 07:06:33.791579 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e" path="/var/lib/kubelet/pods/3ce61a38-b1f2-4f48-a0ad-a5e4a9df038e/volumes" Mar 18 07:06:36 crc kubenswrapper[4917]: I0318 07:06:36.138837 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:36 crc kubenswrapper[4917]: I0318 07:06:36.236759 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-qhw5r"] Mar 18 07:06:36 crc kubenswrapper[4917]: I0318 07:06:36.237018 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="dnsmasq-dns" containerID="cri-o://977fe887b50dce97fe001d1efee118fc531726fa077498324d1f7a6b367dce07" gracePeriod=10 Mar 18 07:06:37 crc kubenswrapper[4917]: I0318 07:06:37.012802 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: 
connect: connection refused" Mar 18 07:06:37 crc kubenswrapper[4917]: I0318 07:06:37.041633 4917 generic.go:334] "Generic (PLEG): container finished" podID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerID="977fe887b50dce97fe001d1efee118fc531726fa077498324d1f7a6b367dce07" exitCode=0 Mar 18 07:06:37 crc kubenswrapper[4917]: I0318 07:06:37.041677 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" event={"ID":"939755c5-26d3-460b-af09-4b24f10fc9d0","Type":"ContainerDied","Data":"977fe887b50dce97fe001d1efee118fc531726fa077498324d1f7a6b367dce07"} Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.325934 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.413004 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-combined-ca-bundle\") pod \"6a57ac88-b164-4262-96a3-831a1edd3300\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.413066 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"6a57ac88-b164-4262-96a3-831a1edd3300\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.413148 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-config-data\") pod \"6a57ac88-b164-4262-96a3-831a1edd3300\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.413247 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-internal-tls-certs\") pod \"6a57ac88-b164-4262-96a3-831a1edd3300\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.413342 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-scripts\") pod \"6a57ac88-b164-4262-96a3-831a1edd3300\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.413442 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m55hf\" (UniqueName: \"kubernetes.io/projected/6a57ac88-b164-4262-96a3-831a1edd3300-kube-api-access-m55hf\") pod \"6a57ac88-b164-4262-96a3-831a1edd3300\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.413493 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-logs\") pod \"6a57ac88-b164-4262-96a3-831a1edd3300\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.413531 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-httpd-run\") pod \"6a57ac88-b164-4262-96a3-831a1edd3300\" (UID: \"6a57ac88-b164-4262-96a3-831a1edd3300\") " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.414253 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-logs" (OuterVolumeSpecName: "logs") pod "6a57ac88-b164-4262-96a3-831a1edd3300" (UID: "6a57ac88-b164-4262-96a3-831a1edd3300"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.414384 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a57ac88-b164-4262-96a3-831a1edd3300" (UID: "6a57ac88-b164-4262-96a3-831a1edd3300"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.414815 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.414843 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a57ac88-b164-4262-96a3-831a1edd3300-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.419890 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "6a57ac88-b164-4262-96a3-831a1edd3300" (UID: "6a57ac88-b164-4262-96a3-831a1edd3300"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.421697 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a57ac88-b164-4262-96a3-831a1edd3300-kube-api-access-m55hf" (OuterVolumeSpecName: "kube-api-access-m55hf") pod "6a57ac88-b164-4262-96a3-831a1edd3300" (UID: "6a57ac88-b164-4262-96a3-831a1edd3300"). InnerVolumeSpecName "kube-api-access-m55hf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.423107 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-scripts" (OuterVolumeSpecName: "scripts") pod "6a57ac88-b164-4262-96a3-831a1edd3300" (UID: "6a57ac88-b164-4262-96a3-831a1edd3300"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.453895 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a57ac88-b164-4262-96a3-831a1edd3300" (UID: "6a57ac88-b164-4262-96a3-831a1edd3300"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.482096 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-config-data" (OuterVolumeSpecName: "config-data") pod "6a57ac88-b164-4262-96a3-831a1edd3300" (UID: "6a57ac88-b164-4262-96a3-831a1edd3300"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.484847 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6a57ac88-b164-4262-96a3-831a1edd3300" (UID: "6a57ac88-b164-4262-96a3-831a1edd3300"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.516861 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m55hf\" (UniqueName: \"kubernetes.io/projected/6a57ac88-b164-4262-96a3-831a1edd3300-kube-api-access-m55hf\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.516900 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.516947 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.516961 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.516976 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.516987 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a57ac88-b164-4262-96a3-831a1edd3300-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.533895 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 18 07:06:38 crc kubenswrapper[4917]: I0318 07:06:38.619052 4917 reconciler_common.go:293] "Volume detached for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.058578 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a57ac88-b164-4262-96a3-831a1edd3300","Type":"ContainerDied","Data":"e6bd0ed01775040f6d5a16423eac3ada5f986db1226b4a4916331d4b17068b87"} Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.058671 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.100051 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.118756 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.139528 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:39 crc kubenswrapper[4917]: E0318 07:06:39.139979 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" containerName="glance-httpd" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.139997 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" containerName="glance-httpd" Mar 18 07:06:39 crc kubenswrapper[4917]: E0318 07:06:39.140010 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" containerName="glance-log" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.140016 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" containerName="glance-log" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.140182 4917 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" containerName="glance-httpd" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.140205 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" containerName="glance-log" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.141192 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.143578 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.143665 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.150971 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.330542 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.330631 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.330718 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.330770 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.330787 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.330816 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-logs\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.330838 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.330890 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fstwg\" (UniqueName: \"kubernetes.io/projected/3519edfd-1fd3-415a-913a-71cd289d524a-kube-api-access-fstwg\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.432624 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.433046 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.433084 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.433102 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.433137 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-logs\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.433162 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.433179 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fstwg\" (UniqueName: \"kubernetes.io/projected/3519edfd-1fd3-415a-913a-71cd289d524a-kube-api-access-fstwg\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.433238 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.434081 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.434419 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.434739 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.438352 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.446633 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.450341 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.450361 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.454165 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fstwg\" (UniqueName: \"kubernetes.io/projected/3519edfd-1fd3-415a-913a-71cd289d524a-kube-api-access-fstwg\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.459550 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.464280 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:39 crc kubenswrapper[4917]: I0318 07:06:39.786877 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a57ac88-b164-4262-96a3-831a1edd3300" path="/var/lib/kubelet/pods/6a57ac88-b164-4262-96a3-831a1edd3300/volumes" Mar 18 07:06:42 crc kubenswrapper[4917]: I0318 07:06:42.013542 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: connect: connection refused" Mar 18 07:06:44 crc kubenswrapper[4917]: I0318 07:06:44.120649 4917 generic.go:334] "Generic (PLEG): container finished" podID="0b1534f9-3185-43be-afb2-e37c921ff0e1" containerID="d3940946653763a23bb02d86f1beb1ca5e1898736d5352253b4ad9d1636ec614" exitCode=0 Mar 18 07:06:44 crc kubenswrapper[4917]: I0318 07:06:44.120736 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-kqhm8" event={"ID":"0b1534f9-3185-43be-afb2-e37c921ff0e1","Type":"ContainerDied","Data":"d3940946653763a23bb02d86f1beb1ca5e1898736d5352253b4ad9d1636ec614"} Mar 18 07:06:45 crc kubenswrapper[4917]: E0318 07:06:45.994617 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a" Mar 18 07:06:45 crc kubenswrapper[4917]: E0318 07:06:45.995011 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kc9n6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesyst
em:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-7vl5b_openstack(e4fc471f-181e-4ace-b02d-beb73c8ea737): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:06:45 crc kubenswrapper[4917]: E0318 07:06:45.996189 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-7vl5b" podUID="e4fc471f-181e-4ace-b02d-beb73c8ea737" Mar 18 07:06:46 crc kubenswrapper[4917]: I0318 07:06:46.019171 4917 scope.go:117] "RemoveContainer" containerID="8947646032ac130d6d82e9729d624da2decdbafb3e1b3fd1c8c5c5a8765ee433" Mar 18 07:06:46 crc kubenswrapper[4917]: E0318 07:06:46.145904 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:1240a45aec9c3e1599be762c5565556560849b49fd39c7283b8e5519dcaa501a\\\"\"" pod="openstack/barbican-db-sync-7vl5b" podUID="e4fc471f-181e-4ace-b02d-beb73c8ea737" Mar 18 07:06:47 crc kubenswrapper[4917]: E0318 07:06:47.266969 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b" Mar 18 07:06:47 crc kubenswrapper[4917]: E0318 07:06:47.267374 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zskjd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-9ntb4_openstack(8a06e756-cae4-44b6-8bed-4951e159e223): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 07:06:47 crc kubenswrapper[4917]: E0318 07:06:47.268718 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-9ntb4" podUID="8a06e756-cae4-44b6-8bed-4951e159e223" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.270307 4917 scope.go:117] "RemoveContainer" containerID="4280db20c7778638489dfbb570028af9c6927d2527ba2d3699e7adc4d967c788" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.447573 4917 scope.go:117] "RemoveContainer" containerID="d7fda4c624d0f12380592311bd79f35b4ef10cd75a9ecd31c39446814dc858e8" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.473182 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.523723 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.580751 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thxpw\" (UniqueName: \"kubernetes.io/projected/0b1534f9-3185-43be-afb2-e37c921ff0e1-kube-api-access-thxpw\") pod \"0b1534f9-3185-43be-afb2-e37c921ff0e1\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.580799 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-combined-ca-bundle\") pod \"0b1534f9-3185-43be-afb2-e37c921ff0e1\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.580898 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-config\") pod \"0b1534f9-3185-43be-afb2-e37c921ff0e1\" (UID: \"0b1534f9-3185-43be-afb2-e37c921ff0e1\") " Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.646966 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1534f9-3185-43be-afb2-e37c921ff0e1-kube-api-access-thxpw" (OuterVolumeSpecName: "kube-api-access-thxpw") pod "0b1534f9-3185-43be-afb2-e37c921ff0e1" (UID: "0b1534f9-3185-43be-afb2-e37c921ff0e1"). InnerVolumeSpecName "kube-api-access-thxpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.665446 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b1534f9-3185-43be-afb2-e37c921ff0e1" (UID: "0b1534f9-3185-43be-afb2-e37c921ff0e1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.689194 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-sb\") pod \"939755c5-26d3-460b-af09-4b24f10fc9d0\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.689269 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-nb\") pod \"939755c5-26d3-460b-af09-4b24f10fc9d0\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.689380 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-config\") pod \"939755c5-26d3-460b-af09-4b24f10fc9d0\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.689408 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km4g7\" (UniqueName: \"kubernetes.io/projected/939755c5-26d3-460b-af09-4b24f10fc9d0-kube-api-access-km4g7\") pod \"939755c5-26d3-460b-af09-4b24f10fc9d0\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.689436 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-dns-svc\") pod \"939755c5-26d3-460b-af09-4b24f10fc9d0\" (UID: \"939755c5-26d3-460b-af09-4b24f10fc9d0\") " Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.689766 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.689781 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thxpw\" (UniqueName: \"kubernetes.io/projected/0b1534f9-3185-43be-afb2-e37c921ff0e1-kube-api-access-thxpw\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.715623 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/939755c5-26d3-460b-af09-4b24f10fc9d0-kube-api-access-km4g7" (OuterVolumeSpecName: "kube-api-access-km4g7") pod "939755c5-26d3-460b-af09-4b24f10fc9d0" (UID: "939755c5-26d3-460b-af09-4b24f10fc9d0"). InnerVolumeSpecName "kube-api-access-km4g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.791464 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-config" (OuterVolumeSpecName: "config") pod "0b1534f9-3185-43be-afb2-e37c921ff0e1" (UID: "0b1534f9-3185-43be-afb2-e37c921ff0e1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.792556 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0b1534f9-3185-43be-afb2-e37c921ff0e1-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.792597 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km4g7\" (UniqueName: \"kubernetes.io/projected/939755c5-26d3-460b-af09-4b24f10fc9d0-kube-api-access-km4g7\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.803519 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-config" (OuterVolumeSpecName: "config") pod "939755c5-26d3-460b-af09-4b24f10fc9d0" (UID: "939755c5-26d3-460b-af09-4b24f10fc9d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.807301 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "939755c5-26d3-460b-af09-4b24f10fc9d0" (UID: "939755c5-26d3-460b-af09-4b24f10fc9d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.819063 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "939755c5-26d3-460b-af09-4b24f10fc9d0" (UID: "939755c5-26d3-460b-af09-4b24f10fc9d0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.821028 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "939755c5-26d3-460b-af09-4b24f10fc9d0" (UID: "939755c5-26d3-460b-af09-4b24f10fc9d0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.877656 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:06:47 crc kubenswrapper[4917]: W0318 07:06:47.881729 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4afa7651_3ebd_4549_a15e_1b3d9d5537db.slice/crio-4c17dbb75a5b09cb0bbe0bc37363e26fcfcca0ca4a39d969fbeeefedd2407228 WatchSource:0}: Error finding container 4c17dbb75a5b09cb0bbe0bc37363e26fcfcca0ca4a39d969fbeeefedd2407228: Status 404 returned error can't find the container with id 4c17dbb75a5b09cb0bbe0bc37363e26fcfcca0ca4a39d969fbeeefedd2407228 Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.894644 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.894680 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.894692 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:47 crc 
kubenswrapper[4917]: I0318 07:06:47.894703 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/939755c5-26d3-460b-af09-4b24f10fc9d0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:47 crc kubenswrapper[4917]: I0318 07:06:47.920306 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wpl6r"] Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.172616 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerStarted","Data":"5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d"} Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.174761 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kqhm8" event={"ID":"0b1534f9-3185-43be-afb2-e37c921ff0e1","Type":"ContainerDied","Data":"3b185c811e94db24d592ad475cc231f2233faa6a6ca00ecc912f2cf037ea027e"} Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.174789 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b185c811e94db24d592ad475cc231f2233faa6a6ca00ecc912f2cf037ea027e" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.174857 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kqhm8" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.180450 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rw57h" event={"ID":"21785c0a-645b-4a17-bfdf-cbb1167f5361","Type":"ContainerStarted","Data":"f5a7a9e3add3a8e667ae87dcb116cbb2c525acdb1a13b33992c0b97a900e43cd"} Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.192842 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" event={"ID":"939755c5-26d3-460b-af09-4b24f10fc9d0","Type":"ContainerDied","Data":"45855cb792b9089c2d031f42bbfc7c4b4b195851635dc1e2e442ecc793c69d7e"} Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.192889 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.192930 4917 scope.go:117] "RemoveContainer" containerID="977fe887b50dce97fe001d1efee118fc531726fa077498324d1f7a6b367dce07" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.199810 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4afa7651-3ebd-4549-a15e-1b3d9d5537db","Type":"ContainerStarted","Data":"4c17dbb75a5b09cb0bbe0bc37363e26fcfcca0ca4a39d969fbeeefedd2407228"} Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.202066 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wpl6r" event={"ID":"16611612-0229-4cd5-9877-ccccc8bf60de","Type":"ContainerStarted","Data":"1111a5c1e1030374c28a65653c73f8742497b834ba27f3edc23bdc2fc9a6e2e5"} Mar 18 07:06:48 crc kubenswrapper[4917]: E0318 07:06:48.204328 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:574a17f0877c175128a764f2b37fc02456649c8514689125718ce6ca974bfb6b\\\"\"" pod="openstack/cinder-db-sync-9ntb4" podUID="8a06e756-cae4-44b6-8bed-4951e159e223" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.211786 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rw57h" podStartSLOduration=3.952528451 podStartE2EDuration="23.211766915s" podCreationTimestamp="2026-03-18 07:06:25 +0000 UTC" firstStartedPulling="2026-03-18 07:06:26.740278364 +0000 UTC m=+1171.681433078" lastFinishedPulling="2026-03-18 07:06:45.999516838 +0000 UTC m=+1190.940671542" observedRunningTime="2026-03-18 07:06:48.195692181 +0000 UTC m=+1193.136846905" watchObservedRunningTime="2026-03-18 07:06:48.211766915 +0000 UTC m=+1193.152921629" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.253349 4917 scope.go:117] "RemoveContainer" containerID="05650b436727ef2eb235496e69080df30e3464cd282dff22a92af0c2f1a65404" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.258869 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-qhw5r"] Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.267351 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b4ddd5fb7-qhw5r"] Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.425574 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.739628 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-5mh8v"] Mar 18 07:06:48 crc kubenswrapper[4917]: E0318 07:06:48.739997 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="init" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.740013 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="init" Mar 18 07:06:48 crc kubenswrapper[4917]: E0318 07:06:48.740030 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1534f9-3185-43be-afb2-e37c921ff0e1" containerName="neutron-db-sync" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.740037 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1534f9-3185-43be-afb2-e37c921ff0e1" containerName="neutron-db-sync" Mar 18 07:06:48 crc kubenswrapper[4917]: E0318 07:06:48.740065 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="dnsmasq-dns" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.740071 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="dnsmasq-dns" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.740214 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1534f9-3185-43be-afb2-e37c921ff0e1" containerName="neutron-db-sync" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.740229 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="dnsmasq-dns" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.741057 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.780660 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-5mh8v"] Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.883190 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5587f99f7b-qj9dj"] Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.885242 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.887089 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-5n4cs" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.889323 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.889341 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.889609 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.896136 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5587f99f7b-qj9dj"] Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.916490 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ltdr\" (UniqueName: \"kubernetes.io/projected/b761dff8-30ed-4625-9e13-69bb801f0378-kube-api-access-6ltdr\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.916541 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-config\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.916563 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.916643 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.916679 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-svc\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:48 crc kubenswrapper[4917]: I0318 07:06:48.916696 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018573 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-ovndb-tls-certs\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018673 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-config\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018705 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ltdr\" (UniqueName: \"kubernetes.io/projected/b761dff8-30ed-4625-9e13-69bb801f0378-kube-api-access-6ltdr\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018733 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-config\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018747 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018770 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmn5s\" (UniqueName: \"kubernetes.io/projected/09ab5b75-3af1-4097-acb4-3dca0e3986c6-kube-api-access-bmn5s\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018789 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018805 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-httpd-config\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018835 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-svc\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018852 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.018889 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-combined-ca-bundle\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.019704 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-config\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.020552 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-nb\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.021034 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-svc\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.021674 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-sb\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.022002 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-swift-storage-0\") pod \"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.041886 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ltdr\" (UniqueName: \"kubernetes.io/projected/b761dff8-30ed-4625-9e13-69bb801f0378-kube-api-access-6ltdr\") pod 
\"dnsmasq-dns-5f9bff4fdf-5mh8v\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.081505 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.120804 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-combined-ca-bundle\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.121035 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-ovndb-tls-certs\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.121145 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-config\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.121220 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmn5s\" (UniqueName: \"kubernetes.io/projected/09ab5b75-3af1-4097-acb4-3dca0e3986c6-kube-api-access-bmn5s\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.121286 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-httpd-config\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.125633 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-ovndb-tls-certs\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.125748 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-combined-ca-bundle\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.129512 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-config\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.135133 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-httpd-config\") pod \"neutron-5587f99f7b-qj9dj\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.144718 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmn5s\" (UniqueName: \"kubernetes.io/projected/09ab5b75-3af1-4097-acb4-3dca0e3986c6-kube-api-access-bmn5s\") pod \"neutron-5587f99f7b-qj9dj\" (UID: 
\"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.213855 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.266068 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4afa7651-3ebd-4549-a15e-1b3d9d5537db","Type":"ContainerStarted","Data":"707fc93d69d62d504d4984b9bb73411cbdd19f1fbf74b2a04513c9762a172dd0"} Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.266109 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4afa7651-3ebd-4549-a15e-1b3d9d5537db","Type":"ContainerStarted","Data":"e98234ea84573f53bb65b8132490e1af2226b1eb3671188da6355c4b71933e2d"} Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.268037 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3519edfd-1fd3-415a-913a-71cd289d524a","Type":"ContainerStarted","Data":"6b63fb2b7875c47236753ae051a2755f4f4953bc497040216e9b4a25cb0e0ab7"} Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.273398 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wpl6r" event={"ID":"16611612-0229-4cd5-9877-ccccc8bf60de","Type":"ContainerStarted","Data":"77bd9dbc3e9fe7eaafb31487aa7fefec3a629522eea305344a3a6d2d662a7f94"} Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.291486 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wpl6r" podStartSLOduration=16.291469103 podStartE2EDuration="16.291469103s" podCreationTimestamp="2026-03-18 07:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:49.287631831 +0000 UTC 
m=+1194.228786535" watchObservedRunningTime="2026-03-18 07:06:49.291469103 +0000 UTC m=+1194.232623827" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.598698 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-5mh8v"] Mar 18 07:06:49 crc kubenswrapper[4917]: W0318 07:06:49.613076 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb761dff8_30ed_4625_9e13_69bb801f0378.slice/crio-b88d3d099198bd30b7e840a112d81ecc62c7c428710ecb16ea4d23d0f9e9f3f9 WatchSource:0}: Error finding container b88d3d099198bd30b7e840a112d81ecc62c7c428710ecb16ea4d23d0f9e9f3f9: Status 404 returned error can't find the container with id b88d3d099198bd30b7e840a112d81ecc62c7c428710ecb16ea4d23d0f9e9f3f9 Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.786667 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" path="/var/lib/kubelet/pods/939755c5-26d3-460b-af09-4b24f10fc9d0/volumes" Mar 18 07:06:49 crc kubenswrapper[4917]: I0318 07:06:49.883959 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5587f99f7b-qj9dj"] Mar 18 07:06:49 crc kubenswrapper[4917]: W0318 07:06:49.892747 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09ab5b75_3af1_4097_acb4_3dca0e3986c6.slice/crio-e895464ae6ccbbe0b9c8d16edfd365abd8cf7141614b582bbf86320c0555a362 WatchSource:0}: Error finding container e895464ae6ccbbe0b9c8d16edfd365abd8cf7141614b582bbf86320c0555a362: Status 404 returned error can't find the container with id e895464ae6ccbbe0b9c8d16edfd365abd8cf7141614b582bbf86320c0555a362 Mar 18 07:06:50 crc kubenswrapper[4917]: I0318 07:06:50.296210 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"3519edfd-1fd3-415a-913a-71cd289d524a","Type":"ContainerStarted","Data":"d61507c00bd0791b6047696eff450e0fb3835214f876cc795fb1d3c450ce5f2e"} Mar 18 07:06:50 crc kubenswrapper[4917]: I0318 07:06:50.297847 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5587f99f7b-qj9dj" event={"ID":"09ab5b75-3af1-4097-acb4-3dca0e3986c6","Type":"ContainerStarted","Data":"e895464ae6ccbbe0b9c8d16edfd365abd8cf7141614b582bbf86320c0555a362"} Mar 18 07:06:50 crc kubenswrapper[4917]: I0318 07:06:50.299807 4917 generic.go:334] "Generic (PLEG): container finished" podID="b761dff8-30ed-4625-9e13-69bb801f0378" containerID="752c81b1b01a876106a6228f6985ce5bc96cbced050904b608ad646b342f8660" exitCode=0 Mar 18 07:06:50 crc kubenswrapper[4917]: I0318 07:06:50.299821 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" event={"ID":"b761dff8-30ed-4625-9e13-69bb801f0378","Type":"ContainerDied","Data":"752c81b1b01a876106a6228f6985ce5bc96cbced050904b608ad646b342f8660"} Mar 18 07:06:50 crc kubenswrapper[4917]: I0318 07:06:50.299876 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" event={"ID":"b761dff8-30ed-4625-9e13-69bb801f0378","Type":"ContainerStarted","Data":"b88d3d099198bd30b7e840a112d81ecc62c7c428710ecb16ea4d23d0f9e9f3f9"} Mar 18 07:06:50 crc kubenswrapper[4917]: I0318 07:06:50.397061 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=17.397042787 podStartE2EDuration="17.397042787s" podCreationTimestamp="2026-03-18 07:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:50.388400231 +0000 UTC m=+1195.329554945" watchObservedRunningTime="2026-03-18 07:06:50.397042787 +0000 UTC m=+1195.338197501" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.312440 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3519edfd-1fd3-415a-913a-71cd289d524a","Type":"ContainerStarted","Data":"7a8f91c1a0d280d4171736d8452e848c62225dd3a215c3484db00414f3528971"} Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.317969 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5587f99f7b-qj9dj" event={"ID":"09ab5b75-3af1-4097-acb4-3dca0e3986c6","Type":"ContainerStarted","Data":"f3989c857b08a9bc817fbf1d1c6766f330bfa4cc2c6d613a35e5eab43f105777"} Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.318013 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5587f99f7b-qj9dj" event={"ID":"09ab5b75-3af1-4097-acb4-3dca0e3986c6","Type":"ContainerStarted","Data":"e928e0f57982db1ec1486f972001647c43daaf622da5dba2ab54070f700afb60"} Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.318695 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.330084 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" event={"ID":"b761dff8-30ed-4625-9e13-69bb801f0378","Type":"ContainerStarted","Data":"3d4a6c920818ecac494b79ab66ff2137f371b2371675750c4dca016c56372311"} Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.330122 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.345421 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.345359547 podStartE2EDuration="12.345359547s" podCreationTimestamp="2026-03-18 07:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:51.333025993 +0000 
UTC m=+1196.274180727" watchObservedRunningTime="2026-03-18 07:06:51.345359547 +0000 UTC m=+1196.286514271" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.369502 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" podStartSLOduration=3.369478853 podStartE2EDuration="3.369478853s" podCreationTimestamp="2026-03-18 07:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:51.36219814 +0000 UTC m=+1196.303352874" watchObservedRunningTime="2026-03-18 07:06:51.369478853 +0000 UTC m=+1196.310633567" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.386333 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5587f99f7b-qj9dj" podStartSLOduration=3.386314586 podStartE2EDuration="3.386314586s" podCreationTimestamp="2026-03-18 07:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:51.382904724 +0000 UTC m=+1196.324059438" watchObservedRunningTime="2026-03-18 07:06:51.386314586 +0000 UTC m=+1196.327469300" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.545317 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-864fd895df-j2ff2"] Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.547064 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.548852 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.549610 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.565501 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-864fd895df-j2ff2"] Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.689005 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-httpd-config\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.689078 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdb9l\" (UniqueName: \"kubernetes.io/projected/b43d29c8-9556-4447-aff2-f375a671ef4f-kube-api-access-kdb9l\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.689142 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-internal-tls-certs\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.689180 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-ovndb-tls-certs\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.689228 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-combined-ca-bundle\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.689322 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-config\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.689353 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-public-tls-certs\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.790889 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-httpd-config\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.790981 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdb9l\" (UniqueName: 
\"kubernetes.io/projected/b43d29c8-9556-4447-aff2-f375a671ef4f-kube-api-access-kdb9l\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.791034 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-internal-tls-certs\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.791950 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-ovndb-tls-certs\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.792064 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-combined-ca-bundle\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.792184 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-config\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.792229 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-public-tls-certs\") pod 
\"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.796568 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-ovndb-tls-certs\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.798561 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-httpd-config\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.804537 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-combined-ca-bundle\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.811205 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-internal-tls-certs\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.811597 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-public-tls-certs\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 
07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.812488 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-config\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.820367 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdb9l\" (UniqueName: \"kubernetes.io/projected/b43d29c8-9556-4447-aff2-f375a671ef4f-kube-api-access-kdb9l\") pod \"neutron-864fd895df-j2ff2\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:51 crc kubenswrapper[4917]: I0318 07:06:51.872437 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:52 crc kubenswrapper[4917]: I0318 07:06:52.013746 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b4ddd5fb7-qhw5r" podUID="939755c5-26d3-460b-af09-4b24f10fc9d0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.121:5353: i/o timeout" Mar 18 07:06:52 crc kubenswrapper[4917]: I0318 07:06:52.342750 4917 generic.go:334] "Generic (PLEG): container finished" podID="16611612-0229-4cd5-9877-ccccc8bf60de" containerID="77bd9dbc3e9fe7eaafb31487aa7fefec3a629522eea305344a3a6d2d662a7f94" exitCode=0 Mar 18 07:06:52 crc kubenswrapper[4917]: I0318 07:06:52.342766 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wpl6r" event={"ID":"16611612-0229-4cd5-9877-ccccc8bf60de","Type":"ContainerDied","Data":"77bd9dbc3e9fe7eaafb31487aa7fefec3a629522eea305344a3a6d2d662a7f94"} Mar 18 07:06:52 crc kubenswrapper[4917]: I0318 07:06:52.345376 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerStarted","Data":"ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd"} Mar 18 07:06:52 crc kubenswrapper[4917]: I0318 07:06:52.347562 4917 generic.go:334] "Generic (PLEG): container finished" podID="21785c0a-645b-4a17-bfdf-cbb1167f5361" containerID="f5a7a9e3add3a8e667ae87dcb116cbb2c525acdb1a13b33992c0b97a900e43cd" exitCode=0 Mar 18 07:06:52 crc kubenswrapper[4917]: I0318 07:06:52.347782 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rw57h" event={"ID":"21785c0a-645b-4a17-bfdf-cbb1167f5361","Type":"ContainerDied","Data":"f5a7a9e3add3a8e667ae87dcb116cbb2c525acdb1a13b33992c0b97a900e43cd"} Mar 18 07:06:52 crc kubenswrapper[4917]: I0318 07:06:52.444324 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-864fd895df-j2ff2"] Mar 18 07:06:52 crc kubenswrapper[4917]: W0318 07:06:52.467899 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb43d29c8_9556_4447_aff2_f375a671ef4f.slice/crio-ff10a2f4a86d87476d5119ac936af6f4353a3483d8fb36c56f05728b6753cf5e WatchSource:0}: Error finding container ff10a2f4a86d87476d5119ac936af6f4353a3483d8fb36c56f05728b6753cf5e: Status 404 returned error can't find the container with id ff10a2f4a86d87476d5119ac936af6f4353a3483d8fb36c56f05728b6753cf5e Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.361737 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864fd895df-j2ff2" event={"ID":"b43d29c8-9556-4447-aff2-f375a671ef4f","Type":"ContainerStarted","Data":"016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5"} Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.362104 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864fd895df-j2ff2" 
event={"ID":"b43d29c8-9556-4447-aff2-f375a671ef4f","Type":"ContainerStarted","Data":"febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d"} Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.362119 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864fd895df-j2ff2" event={"ID":"b43d29c8-9556-4447-aff2-f375a671ef4f","Type":"ContainerStarted","Data":"ff10a2f4a86d87476d5119ac936af6f4353a3483d8fb36c56f05728b6753cf5e"} Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.362842 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.443795 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.444060 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.507488 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.516098 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 07:06:53.531954 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-864fd895df-j2ff2" podStartSLOduration=2.531930571 podStartE2EDuration="2.531930571s" podCreationTimestamp="2026-03-18 07:06:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:53.439661747 +0000 UTC m=+1198.380816461" watchObservedRunningTime="2026-03-18 07:06:53.531930571 +0000 UTC m=+1198.473085285" Mar 18 07:06:53 crc kubenswrapper[4917]: I0318 
07:06:53.993177 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.133766 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-config-data\") pod \"21785c0a-645b-4a17-bfdf-cbb1167f5361\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.133838 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-scripts\") pod \"21785c0a-645b-4a17-bfdf-cbb1167f5361\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.133866 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21785c0a-645b-4a17-bfdf-cbb1167f5361-logs\") pod \"21785c0a-645b-4a17-bfdf-cbb1167f5361\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.133905 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-combined-ca-bundle\") pod \"21785c0a-645b-4a17-bfdf-cbb1167f5361\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.133986 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r2rg\" (UniqueName: \"kubernetes.io/projected/21785c0a-645b-4a17-bfdf-cbb1167f5361-kube-api-access-2r2rg\") pod \"21785c0a-645b-4a17-bfdf-cbb1167f5361\" (UID: \"21785c0a-645b-4a17-bfdf-cbb1167f5361\") " Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.135614 4917 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/21785c0a-645b-4a17-bfdf-cbb1167f5361-logs" (OuterVolumeSpecName: "logs") pod "21785c0a-645b-4a17-bfdf-cbb1167f5361" (UID: "21785c0a-645b-4a17-bfdf-cbb1167f5361"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.140708 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21785c0a-645b-4a17-bfdf-cbb1167f5361-kube-api-access-2r2rg" (OuterVolumeSpecName: "kube-api-access-2r2rg") pod "21785c0a-645b-4a17-bfdf-cbb1167f5361" (UID: "21785c0a-645b-4a17-bfdf-cbb1167f5361"). InnerVolumeSpecName "kube-api-access-2r2rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.148201 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-scripts" (OuterVolumeSpecName: "scripts") pod "21785c0a-645b-4a17-bfdf-cbb1167f5361" (UID: "21785c0a-645b-4a17-bfdf-cbb1167f5361"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.166685 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21785c0a-645b-4a17-bfdf-cbb1167f5361" (UID: "21785c0a-645b-4a17-bfdf-cbb1167f5361"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.171923 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-config-data" (OuterVolumeSpecName: "config-data") pod "21785c0a-645b-4a17-bfdf-cbb1167f5361" (UID: "21785c0a-645b-4a17-bfdf-cbb1167f5361"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.236253 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r2rg\" (UniqueName: \"kubernetes.io/projected/21785c0a-645b-4a17-bfdf-cbb1167f5361-kube-api-access-2r2rg\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.236288 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.236299 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.236309 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21785c0a-645b-4a17-bfdf-cbb1167f5361-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.236318 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21785c0a-645b-4a17-bfdf-cbb1167f5361-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.370925 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rw57h" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.372156 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rw57h" event={"ID":"21785c0a-645b-4a17-bfdf-cbb1167f5361","Type":"ContainerDied","Data":"f63a1bd9f2129c3ba2f24be59426ea6c73e805ec6363febf88c2e7cdcd4b2adc"} Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.372182 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f63a1bd9f2129c3ba2f24be59426ea6c73e805ec6363febf88c2e7cdcd4b2adc" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.372199 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.372518 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.557223 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7fc585cd46-xw4c5"] Mar 18 07:06:54 crc kubenswrapper[4917]: E0318 07:06:54.560960 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21785c0a-645b-4a17-bfdf-cbb1167f5361" containerName="placement-db-sync" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.560987 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="21785c0a-645b-4a17-bfdf-cbb1167f5361" containerName="placement-db-sync" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.561384 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="21785c0a-645b-4a17-bfdf-cbb1167f5361" containerName="placement-db-sync" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.562466 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.567224 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-zkqh7" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.567414 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.567527 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.567696 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.567807 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.571683 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fc585cd46-xw4c5"] Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.643358 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-internal-tls-certs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.643792 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-scripts\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.643833 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-combined-ca-bundle\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.643874 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-logs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.643955 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-config-data\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.644020 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm7dq\" (UniqueName: \"kubernetes.io/projected/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-kube-api-access-xm7dq\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.644055 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-public-tls-certs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.745000 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-config-data\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.745062 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm7dq\" (UniqueName: \"kubernetes.io/projected/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-kube-api-access-xm7dq\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.745110 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-public-tls-certs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.745455 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-internal-tls-certs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.747149 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-scripts\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.747216 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-combined-ca-bundle\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.747261 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-logs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.747715 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-logs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.750074 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-public-tls-certs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.750320 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-config-data\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.750424 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-combined-ca-bundle\") pod 
\"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.751015 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-scripts\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.752358 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-internal-tls-certs\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.768684 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm7dq\" (UniqueName: \"kubernetes.io/projected/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-kube-api-access-xm7dq\") pod \"placement-7fc585cd46-xw4c5\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:54 crc kubenswrapper[4917]: I0318 07:06:54.880372 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.091078 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.172899 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-combined-ca-bundle\") pod \"16611612-0229-4cd5-9877-ccccc8bf60de\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.172996 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-scripts\") pod \"16611612-0229-4cd5-9877-ccccc8bf60de\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.173067 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-fernet-keys\") pod \"16611612-0229-4cd5-9877-ccccc8bf60de\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.173104 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-credential-keys\") pod \"16611612-0229-4cd5-9877-ccccc8bf60de\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.173161 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-config-data\") pod \"16611612-0229-4cd5-9877-ccccc8bf60de\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.173191 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzgls\" (UniqueName: 
\"kubernetes.io/projected/16611612-0229-4cd5-9877-ccccc8bf60de-kube-api-access-jzgls\") pod \"16611612-0229-4cd5-9877-ccccc8bf60de\" (UID: \"16611612-0229-4cd5-9877-ccccc8bf60de\") " Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.178009 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-scripts" (OuterVolumeSpecName: "scripts") pod "16611612-0229-4cd5-9877-ccccc8bf60de" (UID: "16611612-0229-4cd5-9877-ccccc8bf60de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.178029 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "16611612-0229-4cd5-9877-ccccc8bf60de" (UID: "16611612-0229-4cd5-9877-ccccc8bf60de"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.179572 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16611612-0229-4cd5-9877-ccccc8bf60de-kube-api-access-jzgls" (OuterVolumeSpecName: "kube-api-access-jzgls") pod "16611612-0229-4cd5-9877-ccccc8bf60de" (UID: "16611612-0229-4cd5-9877-ccccc8bf60de"). InnerVolumeSpecName "kube-api-access-jzgls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.191102 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "16611612-0229-4cd5-9877-ccccc8bf60de" (UID: "16611612-0229-4cd5-9877-ccccc8bf60de"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.207575 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16611612-0229-4cd5-9877-ccccc8bf60de" (UID: "16611612-0229-4cd5-9877-ccccc8bf60de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.208703 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-config-data" (OuterVolumeSpecName: "config-data") pod "16611612-0229-4cd5-9877-ccccc8bf60de" (UID: "16611612-0229-4cd5-9877-ccccc8bf60de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.275330 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.275369 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzgls\" (UniqueName: \"kubernetes.io/projected/16611612-0229-4cd5-9877-ccccc8bf60de-kube-api-access-jzgls\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.275380 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.275388 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 
07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.275396 4917 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.275404 4917 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16611612-0229-4cd5-9877-ccccc8bf60de-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.298699 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.397827 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.399136 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wpl6r" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.399285 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wpl6r" event={"ID":"16611612-0229-4cd5-9877-ccccc8bf60de","Type":"ContainerDied","Data":"1111a5c1e1030374c28a65653c73f8742497b834ba27f3edc23bdc2fc9a6e2e5"} Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.399450 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1111a5c1e1030374c28a65653c73f8742497b834ba27f3edc23bdc2fc9a6e2e5" Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.444649 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7fc585cd46-xw4c5"] Mar 18 07:06:56 crc kubenswrapper[4917]: I0318 07:06:56.806358 4917 scope.go:117] "RemoveContainer" containerID="bfa87acff2b656f0233fae7d871deb2516ab159c286a9396e0be62ecd6f350ca" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.177771 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-f48d59955-cprlv"] Mar 18 07:06:57 crc kubenswrapper[4917]: E0318 07:06:57.178111 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16611612-0229-4cd5-9877-ccccc8bf60de" containerName="keystone-bootstrap" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.178124 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="16611612-0229-4cd5-9877-ccccc8bf60de" containerName="keystone-bootstrap" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.178461 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="16611612-0229-4cd5-9877-ccccc8bf60de" containerName="keystone-bootstrap" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.178995 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.184256 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.184279 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.188172 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.188315 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.192939 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.193296 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-5j76w" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.194876 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f48d59955-cprlv"] Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.295519 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-fernet-keys\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.295654 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-config-data\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " 
pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.295932 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-credential-keys\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.295998 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-scripts\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.296042 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47ckz\" (UniqueName: \"kubernetes.io/projected/3fe020eb-0bd4-4efa-9711-3f07ce31907c-kube-api-access-47ckz\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.296074 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.296127 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-internal-tls-certs\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") 
" pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.296177 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-combined-ca-bundle\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.397313 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-credential-keys\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.397383 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-scripts\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.397408 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47ckz\" (UniqueName: \"kubernetes.io/projected/3fe020eb-0bd4-4efa-9711-3f07ce31907c-kube-api-access-47ckz\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.397426 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc 
kubenswrapper[4917]: I0318 07:06:57.397449 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-internal-tls-certs\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.397465 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-combined-ca-bundle\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.397525 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-fernet-keys\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.397557 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-config-data\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.402844 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-scripts\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.403456 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.404978 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-config-data\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.407844 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-combined-ca-bundle\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.408723 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-fernet-keys\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.410047 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-internal-tls-certs\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.410877 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-credential-keys\") pod 
\"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.415365 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47ckz\" (UniqueName: \"kubernetes.io/projected/3fe020eb-0bd4-4efa-9711-3f07ce31907c-kube-api-access-47ckz\") pod \"keystone-f48d59955-cprlv\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:57 crc kubenswrapper[4917]: I0318 07:06:57.497700 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:58 crc kubenswrapper[4917]: I0318 07:06:58.422118 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc585cd46-xw4c5" event={"ID":"c9492f8d-33b1-4ea7-9f85-0137cc2443ed","Type":"ContainerStarted","Data":"82b032445208a2c8a4834db5814b65f14434c9626d0d60b8ce4d9d30da355c1b"} Mar 18 07:06:58 crc kubenswrapper[4917]: I0318 07:06:58.422736 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc585cd46-xw4c5" event={"ID":"c9492f8d-33b1-4ea7-9f85-0137cc2443ed","Type":"ContainerStarted","Data":"459b3361d8db3725080c2527685e226c0066d82d93763c9b404e8a525760513a"} Mar 18 07:06:58 crc kubenswrapper[4917]: I0318 07:06:58.587266 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-f48d59955-cprlv"] Mar 18 07:06:58 crc kubenswrapper[4917]: W0318 07:06:58.607196 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fe020eb_0bd4_4efa_9711_3f07ce31907c.slice/crio-21b378879cfae082d0af9a227d89c23bed61173340d4db654b8bbf653c807925 WatchSource:0}: Error finding container 21b378879cfae082d0af9a227d89c23bed61173340d4db654b8bbf653c807925: Status 404 returned error can't find the container with id 
21b378879cfae082d0af9a227d89c23bed61173340d4db654b8bbf653c807925 Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.083333 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.139212 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-v7jt7"] Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.139472 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" podUID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" containerName="dnsmasq-dns" containerID="cri-o://7a7a6c6de4ff0537660534db476062b80782d735d2278015f3c6c3958c54f567" gracePeriod=10 Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.431361 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f48d59955-cprlv" event={"ID":"3fe020eb-0bd4-4efa-9711-3f07ce31907c","Type":"ContainerStarted","Data":"e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b"} Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.431403 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f48d59955-cprlv" event={"ID":"3fe020eb-0bd4-4efa-9711-3f07ce31907c","Type":"ContainerStarted","Data":"21b378879cfae082d0af9a227d89c23bed61173340d4db654b8bbf653c807925"} Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.432225 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.441786 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc585cd46-xw4c5" event={"ID":"c9492f8d-33b1-4ea7-9f85-0137cc2443ed","Type":"ContainerStarted","Data":"bec7bb8157826d333559c4ee554e54a0b97bbeb1ae1a10fa37e602246aa02d77"} Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.442541 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.442575 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.446952 4917 generic.go:334] "Generic (PLEG): container finished" podID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" containerID="7a7a6c6de4ff0537660534db476062b80782d735d2278015f3c6c3958c54f567" exitCode=0 Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.447003 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" event={"ID":"217c88fe-a91f-4cc2-886e-1489ffa9d5b6","Type":"ContainerDied","Data":"7a7a6c6de4ff0537660534db476062b80782d735d2278015f3c6c3958c54f567"} Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.464532 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.465211 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.465314 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerStarted","Data":"d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a"} Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.466322 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-f48d59955-cprlv" podStartSLOduration=2.466300427 podStartE2EDuration="2.466300427s" podCreationTimestamp="2026-03-18 07:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:59.463091259 +0000 UTC m=+1204.404245973" watchObservedRunningTime="2026-03-18 
07:06:59.466300427 +0000 UTC m=+1204.407455151" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.484354 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7fc585cd46-xw4c5" podStartSLOduration=5.484317456 podStartE2EDuration="5.484317456s" podCreationTimestamp="2026-03-18 07:06:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:06:59.482653727 +0000 UTC m=+1204.423808441" watchObservedRunningTime="2026-03-18 07:06:59.484317456 +0000 UTC m=+1204.425472170" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.502893 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.506089 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7vl5b" event={"ID":"e4fc471f-181e-4ace-b02d-beb73c8ea737","Type":"ContainerStarted","Data":"11408c301fda5c885632b28cf6089dd76b4a73e7d406e2ed4352533d71d8e7d9"} Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.549845 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-7vl5b" podStartSLOduration=2.636347176 podStartE2EDuration="34.549827111s" podCreationTimestamp="2026-03-18 07:06:25 +0000 UTC" firstStartedPulling="2026-03-18 07:06:26.389084176 +0000 UTC m=+1171.330238890" lastFinishedPulling="2026-03-18 07:06:58.302564111 +0000 UTC m=+1203.243718825" observedRunningTime="2026-03-18 07:06:59.537419255 +0000 UTC m=+1204.478573969" watchObservedRunningTime="2026-03-18 07:06:59.549827111 +0000 UTC m=+1204.490981825" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.587994 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.691975 4917 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.746213 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-config\") pod \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.746277 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-svc\") pod \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.746378 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-nb\") pod \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.746440 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-sb\") pod \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.746528 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njvbw\" (UniqueName: \"kubernetes.io/projected/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-kube-api-access-njvbw\") pod \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.746615 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-swift-storage-0\") pod \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.767868 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-kube-api-access-njvbw" (OuterVolumeSpecName: "kube-api-access-njvbw") pod "217c88fe-a91f-4cc2-886e-1489ffa9d5b6" (UID: "217c88fe-a91f-4cc2-886e-1489ffa9d5b6"). InnerVolumeSpecName "kube-api-access-njvbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.825669 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-config" (OuterVolumeSpecName: "config") pod "217c88fe-a91f-4cc2-886e-1489ffa9d5b6" (UID: "217c88fe-a91f-4cc2-886e-1489ffa9d5b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.833301 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "217c88fe-a91f-4cc2-886e-1489ffa9d5b6" (UID: "217c88fe-a91f-4cc2-886e-1489ffa9d5b6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.837863 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "217c88fe-a91f-4cc2-886e-1489ffa9d5b6" (UID: "217c88fe-a91f-4cc2-886e-1489ffa9d5b6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.854162 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "217c88fe-a91f-4cc2-886e-1489ffa9d5b6" (UID: "217c88fe-a91f-4cc2-886e-1489ffa9d5b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.854451 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-nb\") pod \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\" (UID: \"217c88fe-a91f-4cc2-886e-1489ffa9d5b6\") " Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.855345 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.855377 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.855390 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.855403 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njvbw\" (UniqueName: \"kubernetes.io/projected/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-kube-api-access-njvbw\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:59 crc kubenswrapper[4917]: W0318 07:06:59.855506 4917 
empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/217c88fe-a91f-4cc2-886e-1489ffa9d5b6/volumes/kubernetes.io~configmap/ovsdbserver-nb Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.855521 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "217c88fe-a91f-4cc2-886e-1489ffa9d5b6" (UID: "217c88fe-a91f-4cc2-886e-1489ffa9d5b6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.863051 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "217c88fe-a91f-4cc2-886e-1489ffa9d5b6" (UID: "217c88fe-a91f-4cc2-886e-1489ffa9d5b6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.956762 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:06:59 crc kubenswrapper[4917]: I0318 07:06:59.957136 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/217c88fe-a91f-4cc2-886e-1489ffa9d5b6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.518615 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9ntb4" event={"ID":"8a06e756-cae4-44b6-8bed-4951e159e223","Type":"ContainerStarted","Data":"ef99ff0c8bb9a55c09de7e1fee465d5b783cfe8b54162cde0f1b11477802c4a9"} Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.533833 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" event={"ID":"217c88fe-a91f-4cc2-886e-1489ffa9d5b6","Type":"ContainerDied","Data":"ddb58da5c118656446ea670d64be4f15a6d614e8507b698025ab4c8c689252cf"} Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.533900 4917 scope.go:117] "RemoveContainer" containerID="7a7a6c6de4ff0537660534db476062b80782d735d2278015f3c6c3958c54f567" Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.534280 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d5dc7cf69-v7jt7" Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.536726 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.536772 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.563366 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-9ntb4" podStartSLOduration=2.681039243 podStartE2EDuration="35.563345148s" podCreationTimestamp="2026-03-18 07:06:25 +0000 UTC" firstStartedPulling="2026-03-18 07:06:26.313517661 +0000 UTC m=+1171.254672375" lastFinishedPulling="2026-03-18 07:06:59.195823566 +0000 UTC m=+1204.136978280" observedRunningTime="2026-03-18 07:07:00.548903933 +0000 UTC m=+1205.490058657" watchObservedRunningTime="2026-03-18 07:07:00.563345148 +0000 UTC m=+1205.504499862" Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.601671 4917 scope.go:117] "RemoveContainer" containerID="cf5781c9d765e5bfe3262ba2904bdfdaf37927348ea89c5f9645b88ff56261f9" Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.605271 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-v7jt7"] Mar 18 07:07:00 crc kubenswrapper[4917]: I0318 07:07:00.616017 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d5dc7cf69-v7jt7"] Mar 18 07:07:01 crc kubenswrapper[4917]: I0318 07:07:01.551931 4917 generic.go:334] "Generic (PLEG): container finished" podID="e4fc471f-181e-4ace-b02d-beb73c8ea737" containerID="11408c301fda5c885632b28cf6089dd76b4a73e7d406e2ed4352533d71d8e7d9" exitCode=0 Mar 18 07:07:01 crc kubenswrapper[4917]: I0318 07:07:01.552006 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7vl5b" 
event={"ID":"e4fc471f-181e-4ace-b02d-beb73c8ea737","Type":"ContainerDied","Data":"11408c301fda5c885632b28cf6089dd76b4a73e7d406e2ed4352533d71d8e7d9"} Mar 18 07:07:01 crc kubenswrapper[4917]: I0318 07:07:01.782339 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" path="/var/lib/kubelet/pods/217c88fe-a91f-4cc2-886e-1489ffa9d5b6/volumes" Mar 18 07:07:02 crc kubenswrapper[4917]: I0318 07:07:02.476772 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 07:07:02 crc kubenswrapper[4917]: I0318 07:07:02.479106 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 07:07:04 crc kubenswrapper[4917]: E0318 07:07:04.396267 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a06e756_cae4_44b6_8bed_4951e159e223.slice/crio-conmon-ef99ff0c8bb9a55c09de7e1fee465d5b783cfe8b54162cde0f1b11477802c4a9.scope\": RecentStats: unable to find data in memory cache]" Mar 18 07:07:04 crc kubenswrapper[4917]: I0318 07:07:04.617078 4917 generic.go:334] "Generic (PLEG): container finished" podID="8a06e756-cae4-44b6-8bed-4951e159e223" containerID="ef99ff0c8bb9a55c09de7e1fee465d5b783cfe8b54162cde0f1b11477802c4a9" exitCode=0 Mar 18 07:07:04 crc kubenswrapper[4917]: I0318 07:07:04.617113 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9ntb4" event={"ID":"8a06e756-cae4-44b6-8bed-4951e159e223","Type":"ContainerDied","Data":"ef99ff0c8bb9a55c09de7e1fee465d5b783cfe8b54162cde0f1b11477802c4a9"} Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.308635 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.464299 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-db-sync-config-data\") pod \"e4fc471f-181e-4ace-b02d-beb73c8ea737\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.464425 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc9n6\" (UniqueName: \"kubernetes.io/projected/e4fc471f-181e-4ace-b02d-beb73c8ea737-kube-api-access-kc9n6\") pod \"e4fc471f-181e-4ace-b02d-beb73c8ea737\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.464524 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-combined-ca-bundle\") pod \"e4fc471f-181e-4ace-b02d-beb73c8ea737\" (UID: \"e4fc471f-181e-4ace-b02d-beb73c8ea737\") " Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.474691 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e4fc471f-181e-4ace-b02d-beb73c8ea737" (UID: "e4fc471f-181e-4ace-b02d-beb73c8ea737"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.478543 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4fc471f-181e-4ace-b02d-beb73c8ea737-kube-api-access-kc9n6" (OuterVolumeSpecName: "kube-api-access-kc9n6") pod "e4fc471f-181e-4ace-b02d-beb73c8ea737" (UID: "e4fc471f-181e-4ace-b02d-beb73c8ea737"). 
InnerVolumeSpecName "kube-api-access-kc9n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.489978 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4fc471f-181e-4ace-b02d-beb73c8ea737" (UID: "e4fc471f-181e-4ace-b02d-beb73c8ea737"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.567022 4917 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.567062 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc9n6\" (UniqueName: \"kubernetes.io/projected/e4fc471f-181e-4ace-b02d-beb73c8ea737-kube-api-access-kc9n6\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.567074 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4fc471f-181e-4ace-b02d-beb73c8ea737-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.626852 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7vl5b" event={"ID":"e4fc471f-181e-4ace-b02d-beb73c8ea737","Type":"ContainerDied","Data":"b5527babdf4b5b25487bd108df9daf658f0f90514ba87beba0ae3f1712df9645"} Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.626896 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5527babdf4b5b25487bd108df9daf658f0f90514ba87beba0ae3f1712df9645" Mar 18 07:07:05 crc kubenswrapper[4917]: I0318 07:07:05.626861 4917 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7vl5b" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.022228 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.177454 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-combined-ca-bundle\") pod \"8a06e756-cae4-44b6-8bed-4951e159e223\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.177539 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-scripts\") pod \"8a06e756-cae4-44b6-8bed-4951e159e223\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.177568 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-db-sync-config-data\") pod \"8a06e756-cae4-44b6-8bed-4951e159e223\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.177626 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-config-data\") pod \"8a06e756-cae4-44b6-8bed-4951e159e223\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.177714 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zskjd\" (UniqueName: \"kubernetes.io/projected/8a06e756-cae4-44b6-8bed-4951e159e223-kube-api-access-zskjd\") pod \"8a06e756-cae4-44b6-8bed-4951e159e223\" 
(UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.177766 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a06e756-cae4-44b6-8bed-4951e159e223-etc-machine-id\") pod \"8a06e756-cae4-44b6-8bed-4951e159e223\" (UID: \"8a06e756-cae4-44b6-8bed-4951e159e223\") " Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.178227 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a06e756-cae4-44b6-8bed-4951e159e223-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8a06e756-cae4-44b6-8bed-4951e159e223" (UID: "8a06e756-cae4-44b6-8bed-4951e159e223"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.178929 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a06e756-cae4-44b6-8bed-4951e159e223-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.182733 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8a06e756-cae4-44b6-8bed-4951e159e223" (UID: "8a06e756-cae4-44b6-8bed-4951e159e223"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.183842 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a06e756-cae4-44b6-8bed-4951e159e223-kube-api-access-zskjd" (OuterVolumeSpecName: "kube-api-access-zskjd") pod "8a06e756-cae4-44b6-8bed-4951e159e223" (UID: "8a06e756-cae4-44b6-8bed-4951e159e223"). InnerVolumeSpecName "kube-api-access-zskjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.184087 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-scripts" (OuterVolumeSpecName: "scripts") pod "8a06e756-cae4-44b6-8bed-4951e159e223" (UID: "8a06e756-cae4-44b6-8bed-4951e159e223"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.205720 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a06e756-cae4-44b6-8bed-4951e159e223" (UID: "8a06e756-cae4-44b6-8bed-4951e159e223"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.222695 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-config-data" (OuterVolumeSpecName: "config-data") pod "8a06e756-cae4-44b6-8bed-4951e159e223" (UID: "8a06e756-cae4-44b6-8bed-4951e159e223"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.280698 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zskjd\" (UniqueName: \"kubernetes.io/projected/8a06e756-cae4-44b6-8bed-4951e159e223-kube-api-access-zskjd\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.280730 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.280755 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.280764 4917 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.280772 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a06e756-cae4-44b6-8bed-4951e159e223-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.641512 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-b749f8cd6-w2w6h"] Mar 18 07:07:06 crc kubenswrapper[4917]: E0318 07:07:06.642134 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" containerName="dnsmasq-dns" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.642147 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" containerName="dnsmasq-dns" Mar 18 07:07:06 crc kubenswrapper[4917]: 
E0318 07:07:06.642161 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4fc471f-181e-4ace-b02d-beb73c8ea737" containerName="barbican-db-sync" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.642167 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4fc471f-181e-4ace-b02d-beb73c8ea737" containerName="barbican-db-sync" Mar 18 07:07:06 crc kubenswrapper[4917]: E0318 07:07:06.642178 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a06e756-cae4-44b6-8bed-4951e159e223" containerName="cinder-db-sync" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.642184 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a06e756-cae4-44b6-8bed-4951e159e223" containerName="cinder-db-sync" Mar 18 07:07:06 crc kubenswrapper[4917]: E0318 07:07:06.642202 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" containerName="init" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.642208 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" containerName="init" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.642356 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="217c88fe-a91f-4cc2-886e-1489ffa9d5b6" containerName="dnsmasq-dns" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.642393 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a06e756-cae4-44b6-8bed-4951e159e223" containerName="cinder-db-sync" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.642411 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4fc471f-181e-4ace-b02d-beb73c8ea737" containerName="barbican-db-sync" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.651255 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.659832 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6v5mv" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.660053 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.660219 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.676739 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-9ntb4" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.677526 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5948845567-w7h4j"] Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.692075 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-9ntb4" event={"ID":"8a06e756-cae4-44b6-8bed-4951e159e223","Type":"ContainerDied","Data":"3672006b6f9c924ec71ed496e2107e4e7ea0bcf489683d5964f4de7cd6051e8f"} Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.692134 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3672006b6f9c924ec71ed496e2107e4e7ea0bcf489683d5964f4de7cd6051e8f" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.692215 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.694825 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.727433 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5948845567-w7h4j"] Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.767463 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b749f8cd6-w2w6h"] Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794165 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data-custom\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794225 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dj29\" (UniqueName: \"kubernetes.io/projected/156b1187-23ee-4a81-8d1d-ad91c2468b7d-kube-api-access-8dj29\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794255 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794300 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-combined-ca-bundle\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794317 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2zbs\" (UniqueName: \"kubernetes.io/projected/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-kube-api-access-z2zbs\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794337 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data-custom\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794356 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156b1187-23ee-4a81-8d1d-ad91c2468b7d-logs\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794808 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " 
pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794861 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-logs\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.794897 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-combined-ca-bundle\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.848643 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-774b7f4d7c-6dlmq"] Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.851197 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.888399 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-774b7f4d7c-6dlmq"] Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.895851 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-combined-ca-bundle\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.895923 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data-custom\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.895961 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dj29\" (UniqueName: \"kubernetes.io/projected/156b1187-23ee-4a81-8d1d-ad91c2468b7d-kube-api-access-8dj29\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.895983 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.896008 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-combined-ca-bundle\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.896023 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2zbs\" (UniqueName: \"kubernetes.io/projected/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-kube-api-access-z2zbs\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.896040 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data-custom\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.896057 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156b1187-23ee-4a81-8d1d-ad91c2468b7d-logs\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.896074 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.896109 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-logs\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.896521 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-logs\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.902864 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156b1187-23ee-4a81-8d1d-ad91c2468b7d-logs\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.910251 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-combined-ca-bundle\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.912053 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-combined-ca-bundle\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.912103 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-api-5d75d99dc6-t8qt7"] Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.912491 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data-custom\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.913503 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.912497 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data-custom\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.922293 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.925719 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.927701 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 
07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.945832 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dj29\" (UniqueName: \"kubernetes.io/projected/156b1187-23ee-4a81-8d1d-ad91c2468b7d-kube-api-access-8dj29\") pod \"barbican-keystone-listener-b749f8cd6-w2w6h\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") " pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.955552 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d75d99dc6-t8qt7"] Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.956538 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2zbs\" (UniqueName: \"kubernetes.io/projected/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-kube-api-access-z2zbs\") pod \"barbican-worker-5948845567-w7h4j\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.993877 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.995381 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.998261 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-nb\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.998336 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-combined-ca-bundle\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.998379 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.998454 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-sb\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:06 crc kubenswrapper[4917]: I0318 07:07:06.998500 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmlw4\" (UniqueName: \"kubernetes.io/projected/558cd5bd-4452-464b-be7f-3fb70a6dabb9-kube-api-access-mmlw4\") pod 
\"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:06.998665 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/558cd5bd-4452-464b-be7f-3fb70a6dabb9-logs\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.003573 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-config\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.003650 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data-custom\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.003685 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prnx7\" (UniqueName: \"kubernetes.io/projected/3feb9887-73ab-494a-945f-a519183ef56e-kube-api-access-prnx7\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.003826 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-swift-storage-0\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.003857 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-svc\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.005197 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.005633 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.005759 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xqnr5" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.005886 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.036107 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.078648 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774b7f4d7c-6dlmq"] Mar 18 07:07:07 crc kubenswrapper[4917]: E0318 07:07:07.079284 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-prnx7 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" podUID="3feb9887-73ab-494a-945f-a519183ef56e" Mar 18 
07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.095009 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.098863 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-94c999df7-cjnd8"] Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.100385 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.104920 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-cjnd8"] Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106602 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106635 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-config\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106660 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3db0a703-a265-4128-8f61-309293963e63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106677 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data-custom\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106696 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106713 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prnx7\" (UniqueName: \"kubernetes.io/projected/3feb9887-73ab-494a-945f-a519183ef56e-kube-api-access-prnx7\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106766 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-swift-storage-0\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106786 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-svc\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106802 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mnfkf\" (UniqueName: \"kubernetes.io/projected/3db0a703-a265-4128-8f61-309293963e63-kube-api-access-mnfkf\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106864 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-nb\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106885 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-combined-ca-bundle\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106900 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106919 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-sb\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106945 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmlw4\" (UniqueName: 
\"kubernetes.io/projected/558cd5bd-4452-464b-be7f-3fb70a6dabb9-kube-api-access-mmlw4\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106965 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106983 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/558cd5bd-4452-464b-be7f-3fb70a6dabb9-logs\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.106998 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-scripts\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.107801 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-config\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.108754 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/558cd5bd-4452-464b-be7f-3fb70a6dabb9-logs\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: 
\"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.108817 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-nb\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.110448 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-swift-storage-0\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.111985 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data-custom\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.112062 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-combined-ca-bundle\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.112454 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-svc\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " 
pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.114495 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-sb\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.117021 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.121039 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.135221 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmlw4\" (UniqueName: \"kubernetes.io/projected/558cd5bd-4452-464b-be7f-3fb70a6dabb9-kube-api-access-mmlw4\") pod \"barbican-api-5d75d99dc6-t8qt7\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.135309 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prnx7\" (UniqueName: \"kubernetes.io/projected/3feb9887-73ab-494a-945f-a519183ef56e-kube-api-access-prnx7\") pod \"dnsmasq-dns-774b7f4d7c-6dlmq\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.191148 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.192487 4917 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.201279 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.210643 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211562 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211609 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-scripts\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211633 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211653 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-config\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211677 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3db0a703-a265-4128-8f61-309293963e63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211697 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211711 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-sb\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211730 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-svc\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211761 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t6md\" (UniqueName: \"kubernetes.io/projected/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-kube-api-access-9t6md\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211801 4917 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mnfkf\" (UniqueName: \"kubernetes.io/projected/3db0a703-a265-4128-8f61-309293963e63-kube-api-access-mnfkf\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211852 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-nb\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211883 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-swift-storage-0\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.211957 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3db0a703-a265-4128-8f61-309293963e63-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.218087 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.223619 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.224381 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-scripts\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.224545 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.249949 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnfkf\" (UniqueName: \"kubernetes.io/projected/3db0a703-a265-4128-8f61-309293963e63-kube-api-access-mnfkf\") pod \"cinder-scheduler-0\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.264990 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.316514 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-swift-storage-0\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.316598 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.316637 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-config\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.316658 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-scripts\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.316679 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-sb\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc 
kubenswrapper[4917]: I0318 07:07:07.316693 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-svc\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.316828 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-logs\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.316896 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t6md\" (UniqueName: \"kubernetes.io/projected/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-kube-api-access-9t6md\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.316940 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46cm\" (UniqueName: \"kubernetes.io/projected/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-kube-api-access-g46cm\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.317121 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data-custom\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.317144 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.317189 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-nb\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.317312 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.317422 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-svc\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.318005 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-sb\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.318093 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-config\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.318594 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-swift-storage-0\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.322197 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-nb\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.322730 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.339648 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t6md\" (UniqueName: \"kubernetes.io/projected/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-kube-api-access-9t6md\") pod \"dnsmasq-dns-94c999df7-cjnd8\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.419181 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-scripts\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.421031 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.421130 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-logs\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.421233 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46cm\" (UniqueName: \"kubernetes.io/projected/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-kube-api-access-g46cm\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.421459 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data-custom\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.421499 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.421633 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.421731 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.421880 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.425976 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-scripts\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.426097 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data-custom\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.426421 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-logs\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.426561 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.431439 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.449314 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46cm\" (UniqueName: \"kubernetes.io/projected/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-kube-api-access-g46cm\") pod \"cinder-api-0\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.674018 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-b749f8cd6-w2w6h"] Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.693419 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" event={"ID":"156b1187-23ee-4a81-8d1d-ad91c2468b7d","Type":"ContainerStarted","Data":"9ec7b84ab64b0d0c583b8018f43d660e75ae61c45f617fb8f7929be5ae5f08c7"} Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.698795 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.699634 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="ceilometer-central-agent" containerID="cri-o://5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d" gracePeriod=30 Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.699848 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerStarted","Data":"0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924"} Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.700007 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.700079 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="proxy-httpd" containerID="cri-o://0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924" gracePeriod=30 Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.700147 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="sg-core" containerID="cri-o://d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a" gracePeriod=30 Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.700202 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="ceilometer-notification-agent" containerID="cri-o://ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd" gracePeriod=30 Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.716757 4917 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.731189 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.739259 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.823026694 podStartE2EDuration="42.739237286s" podCreationTimestamp="2026-03-18 07:06:25 +0000 UTC" firstStartedPulling="2026-03-18 07:06:26.609649164 +0000 UTC m=+1171.550803878" lastFinishedPulling="2026-03-18 07:07:06.525859756 +0000 UTC m=+1211.467014470" observedRunningTime="2026-03-18 07:07:07.723984622 +0000 UTC m=+1212.665139356" watchObservedRunningTime="2026-03-18 07:07:07.739237286 +0000 UTC m=+1212.680392010" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.748253 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5948845567-w7h4j"] Mar 18 07:07:07 crc kubenswrapper[4917]: W0318 07:07:07.758594 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb3c95e_6b59_4ca7_aef2_8bae3eb9dca6.slice/crio-1a93e29be7f00a03a0efedd2eb748f7b33a31515679236629a47bee3e7af1df1 WatchSource:0}: Error finding container 1a93e29be7f00a03a0efedd2eb748f7b33a31515679236629a47bee3e7af1df1: Status 404 returned error can't find the container with id 1a93e29be7f00a03a0efedd2eb748f7b33a31515679236629a47bee3e7af1df1 Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.830768 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prnx7\" (UniqueName: \"kubernetes.io/projected/3feb9887-73ab-494a-945f-a519183ef56e-kube-api-access-prnx7\") pod \"3feb9887-73ab-494a-945f-a519183ef56e\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " Mar 18 07:07:07 crc 
kubenswrapper[4917]: I0318 07:07:07.831148 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-config\") pod \"3feb9887-73ab-494a-945f-a519183ef56e\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.831182 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-nb\") pod \"3feb9887-73ab-494a-945f-a519183ef56e\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.831218 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-swift-storage-0\") pod \"3feb9887-73ab-494a-945f-a519183ef56e\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.831253 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-svc\") pod \"3feb9887-73ab-494a-945f-a519183ef56e\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.831284 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-sb\") pod \"3feb9887-73ab-494a-945f-a519183ef56e\" (UID: \"3feb9887-73ab-494a-945f-a519183ef56e\") " Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.832270 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod 
"3feb9887-73ab-494a-945f-a519183ef56e" (UID: "3feb9887-73ab-494a-945f-a519183ef56e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.832325 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3feb9887-73ab-494a-945f-a519183ef56e" (UID: "3feb9887-73ab-494a-945f-a519183ef56e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.832342 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-config" (OuterVolumeSpecName: "config") pod "3feb9887-73ab-494a-945f-a519183ef56e" (UID: "3feb9887-73ab-494a-945f-a519183ef56e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.832448 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3feb9887-73ab-494a-945f-a519183ef56e" (UID: "3feb9887-73ab-494a-945f-a519183ef56e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.833457 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3feb9887-73ab-494a-945f-a519183ef56e" (UID: "3feb9887-73ab-494a-945f-a519183ef56e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.843099 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3feb9887-73ab-494a-945f-a519183ef56e-kube-api-access-prnx7" (OuterVolumeSpecName: "kube-api-access-prnx7") pod "3feb9887-73ab-494a-945f-a519183ef56e" (UID: "3feb9887-73ab-494a-945f-a519183ef56e"). InnerVolumeSpecName "kube-api-access-prnx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.868525 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d75d99dc6-t8qt7"] Mar 18 07:07:07 crc kubenswrapper[4917]: W0318 07:07:07.871325 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod558cd5bd_4452_464b_be7f_3fb70a6dabb9.slice/crio-00fbee2b0b8ef3a98ebd3183572af4ff8404bcbaf4b36d407107f32f8b2aa1b5 WatchSource:0}: Error finding container 00fbee2b0b8ef3a98ebd3183572af4ff8404bcbaf4b36d407107f32f8b2aa1b5: Status 404 returned error can't find the container with id 00fbee2b0b8ef3a98ebd3183572af4ff8404bcbaf4b36d407107f32f8b2aa1b5 Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.934150 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prnx7\" (UniqueName: \"kubernetes.io/projected/3feb9887-73ab-494a-945f-a519183ef56e-kube-api-access-prnx7\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.935270 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.935299 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-nb\") on 
node \"crc\" DevicePath \"\"" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.935313 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.935323 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.935331 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3feb9887-73ab-494a-945f-a519183ef56e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.961323 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:07 crc kubenswrapper[4917]: W0318 07:07:07.976128 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61ec9fd8_58fa_4279_a404_61d4ef4f4c32.slice/crio-68605f20477f94fa82fb849e360c08c87fae23b68adfdefa0e303982cfb1f7f9 WatchSource:0}: Error finding container 68605f20477f94fa82fb849e360c08c87fae23b68adfdefa0e303982cfb1f7f9: Status 404 returned error can't find the container with id 68605f20477f94fa82fb849e360c08c87fae23b68adfdefa0e303982cfb1f7f9 Mar 18 07:07:07 crc kubenswrapper[4917]: I0318 07:07:07.976878 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-cjnd8"] Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.231765 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:08 crc kubenswrapper[4917]: W0318 07:07:08.240379 4917 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa340a90_d6b8_4da7_ba99_cd74ac2b18a6.slice/crio-54d81f8b08a9c2604c8a3be1006f7b53ab6a78b76a26bc13fffbb75cf935234b WatchSource:0}: Error finding container 54d81f8b08a9c2604c8a3be1006f7b53ab6a78b76a26bc13fffbb75cf935234b: Status 404 returned error can't find the container with id 54d81f8b08a9c2604c8a3be1006f7b53ab6a78b76a26bc13fffbb75cf935234b Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.727832 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5948845567-w7h4j" event={"ID":"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6","Type":"ContainerStarted","Data":"1a93e29be7f00a03a0efedd2eb748f7b33a31515679236629a47bee3e7af1df1"} Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.731388 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3db0a703-a265-4128-8f61-309293963e63","Type":"ContainerStarted","Data":"01678cace47242222af79661e6c78c4f1055836029a1f9a19b40487ffbc70650"} Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.734796 4917 generic.go:334] "Generic (PLEG): container finished" podID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerID="0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924" exitCode=0 Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.734826 4917 generic.go:334] "Generic (PLEG): container finished" podID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerID="d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a" exitCode=2 Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.734834 4917 generic.go:334] "Generic (PLEG): container finished" podID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerID="5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d" exitCode=0 Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.734873 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerDied","Data":"0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924"} Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.734898 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerDied","Data":"d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a"} Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.734907 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerDied","Data":"5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d"} Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.735884 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6","Type":"ContainerStarted","Data":"54d81f8b08a9c2604c8a3be1006f7b53ab6a78b76a26bc13fffbb75cf935234b"} Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.736689 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d75d99dc6-t8qt7" event={"ID":"558cd5bd-4452-464b-be7f-3fb70a6dabb9","Type":"ContainerStarted","Data":"00fbee2b0b8ef3a98ebd3183572af4ff8404bcbaf4b36d407107f32f8b2aa1b5"} Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.738634 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-774b7f4d7c-6dlmq" Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.738682 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" event={"ID":"61ec9fd8-58fa-4279-a404-61d4ef4f4c32","Type":"ContainerStarted","Data":"68605f20477f94fa82fb849e360c08c87fae23b68adfdefa0e303982cfb1f7f9"} Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.792222 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-774b7f4d7c-6dlmq"] Mar 18 07:07:08 crc kubenswrapper[4917]: I0318 07:07:08.799752 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-774b7f4d7c-6dlmq"] Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.096575 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.555840 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.672327 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-config-data\") pod \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.672413 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-scripts\") pod \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.672462 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-run-httpd\") pod 
\"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.672481 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z427\" (UniqueName: \"kubernetes.io/projected/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-kube-api-access-8z427\") pod \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.672496 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-log-httpd\") pod \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.672566 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-combined-ca-bundle\") pod \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.672647 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-sg-core-conf-yaml\") pod \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\" (UID: \"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c\") " Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.672971 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" (UID: "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.673447 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" (UID: "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.676377 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-scripts" (OuterVolumeSpecName: "scripts") pod "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" (UID: "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.682189 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-kube-api-access-8z427" (OuterVolumeSpecName: "kube-api-access-8z427") pod "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" (UID: "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c"). InnerVolumeSpecName "kube-api-access-8z427". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.700158 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" (UID: "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.754189 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6","Type":"ContainerStarted","Data":"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916"} Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.758727 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" (UID: "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.759401 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d75d99dc6-t8qt7" event={"ID":"558cd5bd-4452-464b-be7f-3fb70a6dabb9","Type":"ContainerStarted","Data":"6fdfe90cf84e7de527631564d4574fee39bc40358870bd58a1a2804d7d68ec11"} Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.759430 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d75d99dc6-t8qt7" event={"ID":"558cd5bd-4452-464b-be7f-3fb70a6dabb9","Type":"ContainerStarted","Data":"c4648d7557570eb59c55bf30078a84d901b2fe55bb25f477c0a54a5e49afac30"} Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.759472 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.759485 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.765696 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-config-data" 
(OuterVolumeSpecName: "config-data") pod "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" (UID: "eac5da00-c2bc-4ab2-ace7-f7a29f9af50c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.766372 4917 generic.go:334] "Generic (PLEG): container finished" podID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" containerID="3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da" exitCode=0 Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.766485 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" event={"ID":"61ec9fd8-58fa-4279-a404-61d4ef4f4c32","Type":"ContainerDied","Data":"3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da"} Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.771251 4917 generic.go:334] "Generic (PLEG): container finished" podID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerID="ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd" exitCode=0 Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.771284 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerDied","Data":"ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd"} Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.771303 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eac5da00-c2bc-4ab2-ace7-f7a29f9af50c","Type":"ContainerDied","Data":"fe3b5e7ad7880b1fbfd76a5b0209043417f76c25d32926dfe7c63093ef51afd5"} Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.771319 4917 scope.go:117] "RemoveContainer" containerID="0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.771432 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.781268 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.781411 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.781428 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z427\" (UniqueName: \"kubernetes.io/projected/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-kube-api-access-8z427\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.781439 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.781448 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.781458 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.781466 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.790339 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d75d99dc6-t8qt7" podStartSLOduration=3.790321894 podStartE2EDuration="3.790321894s" podCreationTimestamp="2026-03-18 07:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:09.778107282 +0000 UTC m=+1214.719261996" watchObservedRunningTime="2026-03-18 07:07:09.790321894 +0000 UTC m=+1214.731476598" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.823071 4917 scope.go:117] "RemoveContainer" containerID="d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.827312 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3feb9887-73ab-494a-945f-a519183ef56e" path="/var/lib/kubelet/pods/3feb9887-73ab-494a-945f-a519183ef56e/volumes" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.851356 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.873024 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.887031 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:09 crc kubenswrapper[4917]: E0318 07:07:09.887618 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="ceilometer-notification-agent" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.887636 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="ceilometer-notification-agent" Mar 18 07:07:09 crc kubenswrapper[4917]: E0318 07:07:09.887656 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="sg-core" Mar 
18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.887664 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="sg-core" Mar 18 07:07:09 crc kubenswrapper[4917]: E0318 07:07:09.887698 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="proxy-httpd" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.887705 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="proxy-httpd" Mar 18 07:07:09 crc kubenswrapper[4917]: E0318 07:07:09.887728 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="ceilometer-central-agent" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.887735 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="ceilometer-central-agent" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.888042 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="ceilometer-notification-agent" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.888077 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="sg-core" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.888094 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="ceilometer-central-agent" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.888103 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" containerName="proxy-httpd" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.890853 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.895021 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.895065 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.897438 4917 scope.go:117] "RemoveContainer" containerID="ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.911366 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.937551 4917 scope.go:117] "RemoveContainer" containerID="5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.985245 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.985871 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dnq9\" (UniqueName: \"kubernetes.io/projected/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-kube-api-access-7dnq9\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.985952 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-config-data\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " 
pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.985972 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.986016 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-log-httpd\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.986048 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-run-httpd\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:09 crc kubenswrapper[4917]: I0318 07:07:09.986094 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-scripts\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.087993 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.088034 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-7dnq9\" (UniqueName: \"kubernetes.io/projected/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-kube-api-access-7dnq9\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.088089 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-config-data\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.088106 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.088134 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-log-httpd\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.088170 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-run-httpd\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.088200 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-scripts\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc 
kubenswrapper[4917]: I0318 07:07:10.088834 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-log-httpd\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.088838 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-run-httpd\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.091711 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.092335 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-scripts\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.092990 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.093561 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-config-data\") pod \"ceilometer-0\" (UID: 
\"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.105361 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dnq9\" (UniqueName: \"kubernetes.io/projected/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-kube-api-access-7dnq9\") pod \"ceilometer-0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.205095 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.588458 4917 scope.go:117] "RemoveContainer" containerID="0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924" Mar 18 07:07:10 crc kubenswrapper[4917]: E0318 07:07:10.588887 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924\": container with ID starting with 0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924 not found: ID does not exist" containerID="0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.588928 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924"} err="failed to get container status \"0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924\": rpc error: code = NotFound desc = could not find container \"0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924\": container with ID starting with 0779615ba3d9554545774dc56ac469b74752525568484c6acbd6f9c61a557924 not found: ID does not exist" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.588973 4917 scope.go:117] "RemoveContainer" 
containerID="d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a" Mar 18 07:07:10 crc kubenswrapper[4917]: E0318 07:07:10.591667 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a\": container with ID starting with d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a not found: ID does not exist" containerID="d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.591725 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a"} err="failed to get container status \"d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a\": rpc error: code = NotFound desc = could not find container \"d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a\": container with ID starting with d2286a88b6e106a160513d26eff1acfcc38ef7b434ccf10ca369e1c39f0d924a not found: ID does not exist" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.591764 4917 scope.go:117] "RemoveContainer" containerID="ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd" Mar 18 07:07:10 crc kubenswrapper[4917]: E0318 07:07:10.592069 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd\": container with ID starting with ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd not found: ID does not exist" containerID="ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.592131 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd"} err="failed to get container status \"ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd\": rpc error: code = NotFound desc = could not find container \"ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd\": container with ID starting with ddfaf197b98e15b36dc9feb23ba8ac82dbacefea91cf19a4a948bc3e43f478fd not found: ID does not exist" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.592153 4917 scope.go:117] "RemoveContainer" containerID="5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d" Mar 18 07:07:10 crc kubenswrapper[4917]: E0318 07:07:10.592619 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d\": container with ID starting with 5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d not found: ID does not exist" containerID="5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.592673 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d"} err="failed to get container status \"5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d\": rpc error: code = NotFound desc = could not find container \"5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d\": container with ID starting with 5a770f1bd9b91d604ef926451aeb17e1852759d14ae0785d91352e98beccf24d not found: ID does not exist" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.816606 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" 
event={"ID":"61ec9fd8-58fa-4279-a404-61d4ef4f4c32","Type":"ContainerStarted","Data":"01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe"} Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.817203 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.823316 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3db0a703-a265-4128-8f61-309293963e63","Type":"ContainerStarted","Data":"f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d"} Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.834239 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" podStartSLOduration=3.834205686 podStartE2EDuration="3.834205686s" podCreationTimestamp="2026-03-18 07:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:10.83355573 +0000 UTC m=+1215.774710444" watchObservedRunningTime="2026-03-18 07:07:10.834205686 +0000 UTC m=+1215.775360400" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.838921 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerName="cinder-api-log" containerID="cri-o://cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916" gracePeriod=30 Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.839137 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6","Type":"ContainerStarted","Data":"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3"} Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.839188 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/cinder-api-0" Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.839390 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerName="cinder-api" containerID="cri-o://d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3" gracePeriod=30 Mar 18 07:07:10 crc kubenswrapper[4917]: I0318 07:07:10.864030 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.8640100779999997 podStartE2EDuration="3.864010078s" podCreationTimestamp="2026-03-18 07:07:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:10.859394958 +0000 UTC m=+1215.800549682" watchObservedRunningTime="2026-03-18 07:07:10.864010078 +0000 UTC m=+1215.805164792" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.082925 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:11 crc kubenswrapper[4917]: W0318 07:07:11.107056 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85d579cf_02f6_43b0_acd6_a2b9d00ce6a0.slice/crio-f2b5fc5c813e41eb4c0dacd4737d48891ace4019dcacc8dd224c78aa66096242 WatchSource:0}: Error finding container f2b5fc5c813e41eb4c0dacd4737d48891ace4019dcacc8dd224c78aa66096242: Status 404 returned error can't find the container with id f2b5fc5c813e41eb4c0dacd4737d48891ace4019dcacc8dd224c78aa66096242 Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.531390 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.621048 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-combined-ca-bundle\") pod \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622272 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-etc-machine-id\") pod \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622308 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-logs\") pod \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622324 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" (UID: "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622359 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data\") pod \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622387 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-scripts\") pod \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622454 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46cm\" (UniqueName: \"kubernetes.io/projected/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-kube-api-access-g46cm\") pod \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622509 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data-custom\") pod \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\" (UID: \"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6\") " Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622893 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-logs" (OuterVolumeSpecName: "logs") pod "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" (UID: "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.622926 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.626945 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-kube-api-access-g46cm" (OuterVolumeSpecName: "kube-api-access-g46cm") pod "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" (UID: "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6"). InnerVolumeSpecName "kube-api-access-g46cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.629672 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" (UID: "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.640701 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-scripts" (OuterVolumeSpecName: "scripts") pod "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" (UID: "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.648086 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" (UID: "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.688009 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data" (OuterVolumeSpecName: "config-data") pod "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" (UID: "fa340a90-d6b8-4da7-ba99-cd74ac2b18a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.724297 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.724351 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.724365 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.724377 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g46cm\" (UniqueName: \"kubernetes.io/projected/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-kube-api-access-g46cm\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:11 crc 
kubenswrapper[4917]: I0318 07:07:11.724391 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.724403 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.784496 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac5da00-c2bc-4ab2-ace7-f7a29f9af50c" path="/var/lib/kubelet/pods/eac5da00-c2bc-4ab2-ace7-f7a29f9af50c/volumes" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.856823 4917 generic.go:334] "Generic (PLEG): container finished" podID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerID="d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3" exitCode=0 Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.856867 4917 generic.go:334] "Generic (PLEG): container finished" podID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerID="cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916" exitCode=143 Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.856932 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.856970 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6","Type":"ContainerDied","Data":"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.857012 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6","Type":"ContainerDied","Data":"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.857042 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fa340a90-d6b8-4da7-ba99-cd74ac2b18a6","Type":"ContainerDied","Data":"54d81f8b08a9c2604c8a3be1006f7b53ab6a78b76a26bc13fffbb75cf935234b"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.857058 4917 scope.go:117] "RemoveContainer" containerID="d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.861847 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerStarted","Data":"f2b5fc5c813e41eb4c0dacd4737d48891ace4019dcacc8dd224c78aa66096242"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.869617 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" event={"ID":"156b1187-23ee-4a81-8d1d-ad91c2468b7d","Type":"ContainerStarted","Data":"1760f96d4b10c19a42b16a801413b85944dced5f3964dea049426aa637ed35f1"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.869656 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" 
event={"ID":"156b1187-23ee-4a81-8d1d-ad91c2468b7d","Type":"ContainerStarted","Data":"a36e72dd6a4a408d1a8611cc5d11c6ae06e79402f38ae0b454c8d8abb3d8a397"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.883509 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5948845567-w7h4j" event={"ID":"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6","Type":"ContainerStarted","Data":"1e96cd0c2b6ad866d42096dc84192a730d0d27e13ac4658e5fa9dd813dbe217c"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.883571 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5948845567-w7h4j" event={"ID":"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6","Type":"ContainerStarted","Data":"f5c44c7ab9e02426289634fa91304fcc0c9c17e0ffb94d8253cd509cae2718d1"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.884441 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.899090 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3db0a703-a265-4128-8f61-309293963e63","Type":"ContainerStarted","Data":"e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801"} Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.899171 4917 scope.go:117] "RemoveContainer" containerID="cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.907146 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.925003 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:11 crc kubenswrapper[4917]: E0318 07:07:11.925535 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerName="cinder-api" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.925557 4917 
state_mem.go:107] "Deleted CPUSet assignment" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerName="cinder-api" Mar 18 07:07:11 crc kubenswrapper[4917]: E0318 07:07:11.925616 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerName="cinder-api-log" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.925627 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerName="cinder-api-log" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.925834 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerName="cinder-api" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.925877 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" containerName="cinder-api-log" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.931036 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.934878 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.935140 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.942503 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" podStartSLOduration=3.007963488 podStartE2EDuration="5.942485176s" podCreationTimestamp="2026-03-18 07:07:06 +0000 UTC" firstStartedPulling="2026-03-18 07:07:07.683836553 +0000 UTC m=+1212.624991267" lastFinishedPulling="2026-03-18 07:07:10.618358241 +0000 UTC m=+1215.559512955" observedRunningTime="2026-03-18 07:07:11.903652499 +0000 UTC m=+1216.844807223" watchObservedRunningTime="2026-03-18 07:07:11.942485176 +0000 UTC m=+1216.883639890" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.947914 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.951483 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.955340 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5948845567-w7h4j" podStartSLOduration=3.107512655 podStartE2EDuration="5.955322012s" podCreationTimestamp="2026-03-18 07:07:06 +0000 UTC" firstStartedPulling="2026-03-18 07:07:07.762242786 +0000 UTC m=+1212.703397500" lastFinishedPulling="2026-03-18 07:07:10.610052143 +0000 UTC m=+1215.551206857" observedRunningTime="2026-03-18 07:07:11.940847447 +0000 UTC m=+1216.882002211" watchObservedRunningTime="2026-03-18 07:07:11.955322012 +0000 UTC m=+1216.896476726" Mar 18 07:07:11 crc 
kubenswrapper[4917]: I0318 07:07:11.973951 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.492972426 podStartE2EDuration="5.973932857s" podCreationTimestamp="2026-03-18 07:07:06 +0000 UTC" firstStartedPulling="2026-03-18 07:07:07.968763818 +0000 UTC m=+1212.909918532" lastFinishedPulling="2026-03-18 07:07:09.449724249 +0000 UTC m=+1214.390878963" observedRunningTime="2026-03-18 07:07:11.971183592 +0000 UTC m=+1216.912338316" watchObservedRunningTime="2026-03-18 07:07:11.973932857 +0000 UTC m=+1216.915087581" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.978251 4917 scope.go:117] "RemoveContainer" containerID="d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3" Mar 18 07:07:11 crc kubenswrapper[4917]: E0318 07:07:11.978703 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3\": container with ID starting with d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3 not found: ID does not exist" containerID="d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.978734 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3"} err="failed to get container status \"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3\": rpc error: code = NotFound desc = could not find container \"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3\": container with ID starting with d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3 not found: ID does not exist" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.978755 4917 scope.go:117] "RemoveContainer" 
containerID="cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916" Mar 18 07:07:11 crc kubenswrapper[4917]: E0318 07:07:11.979035 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916\": container with ID starting with cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916 not found: ID does not exist" containerID="cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.979055 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916"} err="failed to get container status \"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916\": rpc error: code = NotFound desc = could not find container \"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916\": container with ID starting with cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916 not found: ID does not exist" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.979067 4917 scope.go:117] "RemoveContainer" containerID="d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.979396 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3"} err="failed to get container status \"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3\": rpc error: code = NotFound desc = could not find container \"d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3\": container with ID starting with d51efbbf702ee6e437fba6e585f87eee4656484c150e60df0d93d831d90f09a3 not found: ID does not exist" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.979417 4917 scope.go:117] 
"RemoveContainer" containerID="cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916" Mar 18 07:07:11 crc kubenswrapper[4917]: I0318 07:07:11.979627 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916"} err="failed to get container status \"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916\": rpc error: code = NotFound desc = could not find container \"cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916\": container with ID starting with cb08a8eebe25746702fba571856ee7221a73d2d9559365eab99c09953b097916 not found: ID does not exist" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.032369 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54ac6ac1-72cb-4383-8206-92169da43249-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.037681 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.038064 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data-custom\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.038089 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/54ac6ac1-72cb-4383-8206-92169da43249-logs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.038150 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75qmk\" (UniqueName: \"kubernetes.io/projected/54ac6ac1-72cb-4383-8206-92169da43249-kube-api-access-75qmk\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.038215 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.038346 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-scripts\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.038408 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.038446 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140443 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data-custom\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140515 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ac6ac1-72cb-4383-8206-92169da43249-logs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140637 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75qmk\" (UniqueName: \"kubernetes.io/projected/54ac6ac1-72cb-4383-8206-92169da43249-kube-api-access-75qmk\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140694 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140762 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-scripts\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140801 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140843 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140916 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54ac6ac1-72cb-4383-8206-92169da43249-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.140998 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.141187 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54ac6ac1-72cb-4383-8206-92169da43249-etc-machine-id\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.141294 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ac6ac1-72cb-4383-8206-92169da43249-logs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 
07:07:12.145437 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-scripts\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.146533 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.149211 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-public-tls-certs\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.149285 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data-custom\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.151040 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.153067 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data\") pod \"cinder-api-0\" (UID: 
\"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.164256 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75qmk\" (UniqueName: \"kubernetes.io/projected/54ac6ac1-72cb-4383-8206-92169da43249-kube-api-access-75qmk\") pod \"cinder-api-0\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.281452 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.325229 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.776575 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:07:12 crc kubenswrapper[4917]: W0318 07:07:12.786360 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54ac6ac1_72cb_4383_8206_92169da43249.slice/crio-1b44ce85d63978bf96bfe8a750d393bbadcce33ce70e04f1c7203d1352090821 WatchSource:0}: Error finding container 1b44ce85d63978bf96bfe8a750d393bbadcce33ce70e04f1c7203d1352090821: Status 404 returned error can't find the container with id 1b44ce85d63978bf96bfe8a750d393bbadcce33ce70e04f1c7203d1352090821 Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.912024 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6b9bcbc9d4-vddlw"] Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.913932 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.917208 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.917543 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.942669 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b9bcbc9d4-vddlw"] Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.950066 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerStarted","Data":"24fd6ec4065ba9d97b15ee4a4825f12e37b49905e2a31cb36a22cfb331dfbb5c"} Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.950664 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerStarted","Data":"c4784213f6dfdc33a2715e839ad6fe12b6d12d6c90dac1fd28d25374e0ac1655"} Mar 18 07:07:12 crc kubenswrapper[4917]: I0318 07:07:12.952753 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54ac6ac1-72cb-4383-8206-92169da43249","Type":"ContainerStarted","Data":"1b44ce85d63978bf96bfe8a750d393bbadcce33ce70e04f1c7203d1352090821"} Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.063439 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-logs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.063488 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-combined-ca-bundle\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.063513 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.063547 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-public-tls-certs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.063626 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47f75\" (UniqueName: \"kubernetes.io/projected/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-kube-api-access-47f75\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.063670 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data-custom\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc 
kubenswrapper[4917]: I0318 07:07:13.063718 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-internal-tls-certs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.165940 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-logs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.165983 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-combined-ca-bundle\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.166008 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.166032 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-public-tls-certs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.166065 
4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47f75\" (UniqueName: \"kubernetes.io/projected/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-kube-api-access-47f75\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.166111 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data-custom\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.166135 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-internal-tls-certs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.167597 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-logs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.169879 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-internal-tls-certs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.171273 4917 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.172879 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data-custom\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.175304 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-public-tls-certs\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.176187 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-combined-ca-bundle\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.185209 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47f75\" (UniqueName: \"kubernetes.io/projected/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-kube-api-access-47f75\") pod \"barbican-api-6b9bcbc9d4-vddlw\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.242783 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.709181 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6b9bcbc9d4-vddlw"] Mar 18 07:07:13 crc kubenswrapper[4917]: W0318 07:07:13.713957 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f3ff50e_a301_4abe_bbaf_2b0075b80b47.slice/crio-ccf1f02a88ccfdc525acd4b3b33ed6f7ffb6a36da39a8304b607e281ce86c407 WatchSource:0}: Error finding container ccf1f02a88ccfdc525acd4b3b33ed6f7ffb6a36da39a8304b607e281ce86c407: Status 404 returned error can't find the container with id ccf1f02a88ccfdc525acd4b3b33ed6f7ffb6a36da39a8304b607e281ce86c407 Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.790211 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa340a90-d6b8-4da7-ba99-cd74ac2b18a6" path="/var/lib/kubelet/pods/fa340a90-d6b8-4da7-ba99-cd74ac2b18a6/volumes" Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.975351 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerStarted","Data":"cb93167bee5a4a4269396e265133d3d22202227b4bb9bbe713e076bd8788da69"} Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.979929 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54ac6ac1-72cb-4383-8206-92169da43249","Type":"ContainerStarted","Data":"33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744"} Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 07:07:13.981958 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" event={"ID":"0f3ff50e-a301-4abe-bbaf-2b0075b80b47","Type":"ContainerStarted","Data":"e6a5a8a7029188f639c27fa6e3638c25cc766067465396ac13f1f7a3e5c9c941"} Mar 18 07:07:13 crc kubenswrapper[4917]: I0318 
07:07:13.981995 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" event={"ID":"0f3ff50e-a301-4abe-bbaf-2b0075b80b47","Type":"ContainerStarted","Data":"ccf1f02a88ccfdc525acd4b3b33ed6f7ffb6a36da39a8304b607e281ce86c407"} Mar 18 07:07:14 crc kubenswrapper[4917]: I0318 07:07:14.996785 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerStarted","Data":"531ffe1b7cfc0b9f5e26ef62846e479e8cffd4ece59d49be663ff79e1852e9b0"} Mar 18 07:07:14 crc kubenswrapper[4917]: I0318 07:07:14.997401 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 07:07:14 crc kubenswrapper[4917]: I0318 07:07:14.998749 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54ac6ac1-72cb-4383-8206-92169da43249","Type":"ContainerStarted","Data":"dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f"} Mar 18 07:07:14 crc kubenswrapper[4917]: I0318 07:07:14.998908 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 07:07:15 crc kubenswrapper[4917]: I0318 07:07:15.000741 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" event={"ID":"0f3ff50e-a301-4abe-bbaf-2b0075b80b47","Type":"ContainerStarted","Data":"09de8cec2a2b8b86818b49eb10353534cc95b1159f2eddd3b1fc8ba166f591d6"} Mar 18 07:07:15 crc kubenswrapper[4917]: I0318 07:07:15.001019 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:15 crc kubenswrapper[4917]: I0318 07:07:15.029418 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.440508827 podStartE2EDuration="6.029393853s" podCreationTimestamp="2026-03-18 07:07:09 +0000 UTC" firstStartedPulling="2026-03-18 
07:07:11.110063185 +0000 UTC m=+1216.051217899" lastFinishedPulling="2026-03-18 07:07:14.698948211 +0000 UTC m=+1219.640102925" observedRunningTime="2026-03-18 07:07:15.024356873 +0000 UTC m=+1219.965511627" watchObservedRunningTime="2026-03-18 07:07:15.029393853 +0000 UTC m=+1219.970548597" Mar 18 07:07:15 crc kubenswrapper[4917]: I0318 07:07:15.099996 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.099976839 podStartE2EDuration="4.099976839s" podCreationTimestamp="2026-03-18 07:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:15.089602071 +0000 UTC m=+1220.030756795" watchObservedRunningTime="2026-03-18 07:07:15.099976839 +0000 UTC m=+1220.041131563" Mar 18 07:07:15 crc kubenswrapper[4917]: I0318 07:07:15.102537 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" podStartSLOduration=3.10253163 podStartE2EDuration="3.10253163s" podCreationTimestamp="2026-03-18 07:07:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:15.067128205 +0000 UTC m=+1220.008282919" watchObservedRunningTime="2026-03-18 07:07:15.10253163 +0000 UTC m=+1220.043686354" Mar 18 07:07:16 crc kubenswrapper[4917]: I0318 07:07:16.013671 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:17 crc kubenswrapper[4917]: I0318 07:07:17.424052 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:07:17 crc kubenswrapper[4917]: I0318 07:07:17.511340 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-5mh8v"] Mar 18 07:07:17 crc kubenswrapper[4917]: I0318 
07:07:17.511653 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" podUID="b761dff8-30ed-4625-9e13-69bb801f0378" containerName="dnsmasq-dns" containerID="cri-o://3d4a6c920818ecac494b79ab66ff2137f371b2371675750c4dca016c56372311" gracePeriod=10 Mar 18 07:07:17 crc kubenswrapper[4917]: I0318 07:07:17.637922 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 07:07:17 crc kubenswrapper[4917]: I0318 07:07:17.708373 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.036444 4917 generic.go:334] "Generic (PLEG): container finished" podID="b761dff8-30ed-4625-9e13-69bb801f0378" containerID="3d4a6c920818ecac494b79ab66ff2137f371b2371675750c4dca016c56372311" exitCode=0 Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.036965 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3db0a703-a265-4128-8f61-309293963e63" containerName="cinder-scheduler" containerID="cri-o://f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d" gracePeriod=30 Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.037209 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" event={"ID":"b761dff8-30ed-4625-9e13-69bb801f0378","Type":"ContainerDied","Data":"3d4a6c920818ecac494b79ab66ff2137f371b2371675750c4dca016c56372311"} Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.037239 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" event={"ID":"b761dff8-30ed-4625-9e13-69bb801f0378","Type":"ContainerDied","Data":"b88d3d099198bd30b7e840a112d81ecc62c7c428710ecb16ea4d23d0f9e9f3f9"} Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.037250 4917 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="b88d3d099198bd30b7e840a112d81ecc62c7c428710ecb16ea4d23d0f9e9f3f9" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.037476 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="3db0a703-a265-4128-8f61-309293963e63" containerName="probe" containerID="cri-o://e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801" gracePeriod=30 Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.060883 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.177433 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-nb\") pod \"b761dff8-30ed-4625-9e13-69bb801f0378\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.177545 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ltdr\" (UniqueName: \"kubernetes.io/projected/b761dff8-30ed-4625-9e13-69bb801f0378-kube-api-access-6ltdr\") pod \"b761dff8-30ed-4625-9e13-69bb801f0378\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.177659 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-svc\") pod \"b761dff8-30ed-4625-9e13-69bb801f0378\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.177735 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-config\") pod \"b761dff8-30ed-4625-9e13-69bb801f0378\" (UID: 
\"b761dff8-30ed-4625-9e13-69bb801f0378\") " Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.177801 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-swift-storage-0\") pod \"b761dff8-30ed-4625-9e13-69bb801f0378\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.177922 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-sb\") pod \"b761dff8-30ed-4625-9e13-69bb801f0378\" (UID: \"b761dff8-30ed-4625-9e13-69bb801f0378\") " Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.196746 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b761dff8-30ed-4625-9e13-69bb801f0378-kube-api-access-6ltdr" (OuterVolumeSpecName: "kube-api-access-6ltdr") pod "b761dff8-30ed-4625-9e13-69bb801f0378" (UID: "b761dff8-30ed-4625-9e13-69bb801f0378"). InnerVolumeSpecName "kube-api-access-6ltdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.252726 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b761dff8-30ed-4625-9e13-69bb801f0378" (UID: "b761dff8-30ed-4625-9e13-69bb801f0378"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.260005 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-config" (OuterVolumeSpecName: "config") pod "b761dff8-30ed-4625-9e13-69bb801f0378" (UID: "b761dff8-30ed-4625-9e13-69bb801f0378"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.260546 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b761dff8-30ed-4625-9e13-69bb801f0378" (UID: "b761dff8-30ed-4625-9e13-69bb801f0378"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.275148 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b761dff8-30ed-4625-9e13-69bb801f0378" (UID: "b761dff8-30ed-4625-9e13-69bb801f0378"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.276668 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b761dff8-30ed-4625-9e13-69bb801f0378" (UID: "b761dff8-30ed-4625-9e13-69bb801f0378"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.280282 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.280324 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.280340 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ltdr\" (UniqueName: \"kubernetes.io/projected/b761dff8-30ed-4625-9e13-69bb801f0378-kube-api-access-6ltdr\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.280355 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.280365 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.280375 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b761dff8-30ed-4625-9e13-69bb801f0378-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.680812 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:18 crc kubenswrapper[4917]: I0318 07:07:18.755054 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.048732 4917 generic.go:334] "Generic (PLEG): container finished" podID="3db0a703-a265-4128-8f61-309293963e63" containerID="e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801" exitCode=0 Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.048833 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3db0a703-a265-4128-8f61-309293963e63","Type":"ContainerDied","Data":"e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801"} Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.049163 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f9bff4fdf-5mh8v" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.075900 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-5mh8v"] Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.085973 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f9bff4fdf-5mh8v"] Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.228472 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.567644 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-864fd895df-j2ff2"] Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.567877 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-864fd895df-j2ff2" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-api" containerID="cri-o://febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d" gracePeriod=30 Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.568262 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-864fd895df-j2ff2" 
podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-httpd" containerID="cri-o://016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5" gracePeriod=30 Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.588714 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-864fd895df-j2ff2" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": read tcp 10.217.0.2:51780->10.217.0.158:9696: read: connection reset by peer" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.617969 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-df967876c-494l9"] Mar 18 07:07:19 crc kubenswrapper[4917]: E0318 07:07:19.622414 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b761dff8-30ed-4625-9e13-69bb801f0378" containerName="dnsmasq-dns" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.622437 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b761dff8-30ed-4625-9e13-69bb801f0378" containerName="dnsmasq-dns" Mar 18 07:07:19 crc kubenswrapper[4917]: E0318 07:07:19.622460 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b761dff8-30ed-4625-9e13-69bb801f0378" containerName="init" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.622466 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b761dff8-30ed-4625-9e13-69bb801f0378" containerName="init" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.622780 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b761dff8-30ed-4625-9e13-69bb801f0378" containerName="dnsmasq-dns" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.624193 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.640774 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-df967876c-494l9"] Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.785212 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b761dff8-30ed-4625-9e13-69bb801f0378" path="/var/lib/kubelet/pods/b761dff8-30ed-4625-9e13-69bb801f0378/volumes" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.813727 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-combined-ca-bundle\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.813780 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-public-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.813807 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-config\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.813848 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-httpd-config\") pod \"neutron-df967876c-494l9\" (UID: 
\"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.813879 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-internal-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.813901 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-ovndb-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.813920 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt6v6\" (UniqueName: \"kubernetes.io/projected/4be30c2f-97f1-4477-8322-8ed29dbd3c60-kube-api-access-mt6v6\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.915984 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-combined-ca-bundle\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.916038 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-public-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: 
\"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.916069 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-config\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.916114 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-httpd-config\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.916147 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-internal-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.916170 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-ovndb-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.916188 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt6v6\" (UniqueName: \"kubernetes.io/projected/4be30c2f-97f1-4477-8322-8ed29dbd3c60-kube-api-access-mt6v6\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc 
kubenswrapper[4917]: I0318 07:07:19.921662 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-ovndb-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.922454 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-public-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.927667 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-internal-tls-certs\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.928344 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-httpd-config\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.933293 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-combined-ca-bundle\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.936367 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt6v6\" 
(UniqueName: \"kubernetes.io/projected/4be30c2f-97f1-4477-8322-8ed29dbd3c60-kube-api-access-mt6v6\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.948730 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-config\") pod \"neutron-df967876c-494l9\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:19 crc kubenswrapper[4917]: I0318 07:07:19.950256 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:20 crc kubenswrapper[4917]: I0318 07:07:20.069430 4917 generic.go:334] "Generic (PLEG): container finished" podID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerID="016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5" exitCode=0 Mar 18 07:07:20 crc kubenswrapper[4917]: I0318 07:07:20.069464 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864fd895df-j2ff2" event={"ID":"b43d29c8-9556-4447-aff2-f375a671ef4f","Type":"ContainerDied","Data":"016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5"} Mar 18 07:07:20 crc kubenswrapper[4917]: I0318 07:07:20.080012 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:20 crc kubenswrapper[4917]: I0318 07:07:20.509895 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-df967876c-494l9"] Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.094984 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df967876c-494l9" event={"ID":"4be30c2f-97f1-4477-8322-8ed29dbd3c60","Type":"ContainerStarted","Data":"73299c429264676075eece28a8b7265bc2bcf9550d93b8dd8a305ef5c9be5061"} Mar 18 07:07:21 crc 
kubenswrapper[4917]: I0318 07:07:21.095484 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.095512 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df967876c-494l9" event={"ID":"4be30c2f-97f1-4477-8322-8ed29dbd3c60","Type":"ContainerStarted","Data":"65ec14bc4fbd1d630fe7a906e20bb195b244f65201d429c85361678850908b23"} Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.095528 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df967876c-494l9" event={"ID":"4be30c2f-97f1-4477-8322-8ed29dbd3c60","Type":"ContainerStarted","Data":"ddd8f5ab47d54228e6650aa6597aac7478e65c7c29f6f7b7b6697aa9ff43a1b4"} Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.120043 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-df967876c-494l9" podStartSLOduration=2.120025521 podStartE2EDuration="2.120025521s" podCreationTimestamp="2026-03-18 07:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:21.116152498 +0000 UTC m=+1226.057307212" watchObservedRunningTime="2026-03-18 07:07:21.120025521 +0000 UTC m=+1226.061180235" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.632105 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.695964 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d75d99dc6-t8qt7"] Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.696179 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d75d99dc6-t8qt7" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api-log" 
containerID="cri-o://c4648d7557570eb59c55bf30078a84d901b2fe55bb25f477c0a54a5e49afac30" gracePeriod=30 Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.696255 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d75d99dc6-t8qt7" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api" containerID="cri-o://6fdfe90cf84e7de527631564d4574fee39bc40358870bd58a1a2804d7d68ec11" gracePeriod=30 Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.724966 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.845816 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data\") pod \"3db0a703-a265-4128-8f61-309293963e63\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.845886 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data-custom\") pod \"3db0a703-a265-4128-8f61-309293963e63\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.845943 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-combined-ca-bundle\") pod \"3db0a703-a265-4128-8f61-309293963e63\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.845966 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnfkf\" (UniqueName: \"kubernetes.io/projected/3db0a703-a265-4128-8f61-309293963e63-kube-api-access-mnfkf\") pod 
\"3db0a703-a265-4128-8f61-309293963e63\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.846043 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3db0a703-a265-4128-8f61-309293963e63-etc-machine-id\") pod \"3db0a703-a265-4128-8f61-309293963e63\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.846125 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-scripts\") pod \"3db0a703-a265-4128-8f61-309293963e63\" (UID: \"3db0a703-a265-4128-8f61-309293963e63\") " Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.846443 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3db0a703-a265-4128-8f61-309293963e63-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3db0a703-a265-4128-8f61-309293963e63" (UID: "3db0a703-a265-4128-8f61-309293963e63"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.851480 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3db0a703-a265-4128-8f61-309293963e63" (UID: "3db0a703-a265-4128-8f61-309293963e63"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.851787 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db0a703-a265-4128-8f61-309293963e63-kube-api-access-mnfkf" (OuterVolumeSpecName: "kube-api-access-mnfkf") pod "3db0a703-a265-4128-8f61-309293963e63" (UID: "3db0a703-a265-4128-8f61-309293963e63"). InnerVolumeSpecName "kube-api-access-mnfkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.853622 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-scripts" (OuterVolumeSpecName: "scripts") pod "3db0a703-a265-4128-8f61-309293963e63" (UID: "3db0a703-a265-4128-8f61-309293963e63"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.873176 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-864fd895df-j2ff2" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.914955 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3db0a703-a265-4128-8f61-309293963e63" (UID: "3db0a703-a265-4128-8f61-309293963e63"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.948570 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnfkf\" (UniqueName: \"kubernetes.io/projected/3db0a703-a265-4128-8f61-309293963e63-kube-api-access-mnfkf\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.948627 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3db0a703-a265-4128-8f61-309293963e63-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.948639 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.948650 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.948660 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:21 crc kubenswrapper[4917]: I0318 07:07:21.971511 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data" (OuterVolumeSpecName: "config-data") pod "3db0a703-a265-4128-8f61-309293963e63" (UID: "3db0a703-a265-4128-8f61-309293963e63"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.050421 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db0a703-a265-4128-8f61-309293963e63-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.108051 4917 generic.go:334] "Generic (PLEG): container finished" podID="3db0a703-a265-4128-8f61-309293963e63" containerID="f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d" exitCode=0 Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.108121 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.108133 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3db0a703-a265-4128-8f61-309293963e63","Type":"ContainerDied","Data":"f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d"} Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.108162 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3db0a703-a265-4128-8f61-309293963e63","Type":"ContainerDied","Data":"01678cace47242222af79661e6c78c4f1055836029a1f9a19b40487ffbc70650"} Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.108182 4917 scope.go:117] "RemoveContainer" containerID="e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.112350 4917 generic.go:334] "Generic (PLEG): container finished" podID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerID="c4648d7557570eb59c55bf30078a84d901b2fe55bb25f477c0a54a5e49afac30" exitCode=143 Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.112401 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d75d99dc6-t8qt7" 
event={"ID":"558cd5bd-4452-464b-be7f-3fb70a6dabb9","Type":"ContainerDied","Data":"c4648d7557570eb59c55bf30078a84d901b2fe55bb25f477c0a54a5e49afac30"} Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.145353 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.157976 4917 scope.go:117] "RemoveContainer" containerID="f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.163136 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.180686 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:22 crc kubenswrapper[4917]: E0318 07:07:22.181120 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db0a703-a265-4128-8f61-309293963e63" containerName="probe" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.181142 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db0a703-a265-4128-8f61-309293963e63" containerName="probe" Mar 18 07:07:22 crc kubenswrapper[4917]: E0318 07:07:22.181157 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db0a703-a265-4128-8f61-309293963e63" containerName="cinder-scheduler" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.181166 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db0a703-a265-4128-8f61-309293963e63" containerName="cinder-scheduler" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.181423 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db0a703-a265-4128-8f61-309293963e63" containerName="probe" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.181458 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db0a703-a265-4128-8f61-309293963e63" containerName="cinder-scheduler" Mar 18 07:07:22 crc 
kubenswrapper[4917]: I0318 07:07:22.182485 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.185250 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.194823 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.230141 4917 scope.go:117] "RemoveContainer" containerID="e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801" Mar 18 07:07:22 crc kubenswrapper[4917]: E0318 07:07:22.231038 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801\": container with ID starting with e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801 not found: ID does not exist" containerID="e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.231082 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801"} err="failed to get container status \"e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801\": rpc error: code = NotFound desc = could not find container \"e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801\": container with ID starting with e2ad2400e77ad7e20f8f912c392c86a252ef70edfe4985e35b50b7a9747d7801 not found: ID does not exist" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.231112 4917 scope.go:117] "RemoveContainer" containerID="f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d" Mar 18 07:07:22 crc kubenswrapper[4917]: E0318 07:07:22.233970 4917 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d\": container with ID starting with f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d not found: ID does not exist" containerID="f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.234001 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d"} err="failed to get container status \"f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d\": rpc error: code = NotFound desc = could not find container \"f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d\": container with ID starting with f70a9fae8ce5244415b7292fe14edbde7d5d145f9877fe2313464c462c3c1f8d not found: ID does not exist" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.363362 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-scripts\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.363444 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.363465 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad1b30db-147f-4314-8fbe-ba8aa096be57-etc-machine-id\") 
pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.363508 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.363533 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkl4n\" (UniqueName: \"kubernetes.io/projected/ad1b30db-147f-4314-8fbe-ba8aa096be57-kube-api-access-pkl4n\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.363564 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.466049 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.466120 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad1b30db-147f-4314-8fbe-ba8aa096be57-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " 
pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.466212 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.466254 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkl4n\" (UniqueName: \"kubernetes.io/projected/ad1b30db-147f-4314-8fbe-ba8aa096be57-kube-api-access-pkl4n\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.466296 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.466437 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-scripts\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.468094 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad1b30db-147f-4314-8fbe-ba8aa096be57-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.476454 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.477024 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-scripts\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.478457 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.480154 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.494470 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkl4n\" (UniqueName: \"kubernetes.io/projected/ad1b30db-147f-4314-8fbe-ba8aa096be57-kube-api-access-pkl4n\") pod \"cinder-scheduler-0\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.511376 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 07:07:22 crc kubenswrapper[4917]: I0318 07:07:22.989653 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:07:22 crc kubenswrapper[4917]: W0318 07:07:22.990370 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad1b30db_147f_4314_8fbe_ba8aa096be57.slice/crio-184f89cb853d19513fbf115a3dddacb62855f8fa97bc71b9e33e0ef276e3cf86 WatchSource:0}: Error finding container 184f89cb853d19513fbf115a3dddacb62855f8fa97bc71b9e33e0ef276e3cf86: Status 404 returned error can't find the container with id 184f89cb853d19513fbf115a3dddacb62855f8fa97bc71b9e33e0ef276e3cf86 Mar 18 07:07:23 crc kubenswrapper[4917]: I0318 07:07:23.125422 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad1b30db-147f-4314-8fbe-ba8aa096be57","Type":"ContainerStarted","Data":"184f89cb853d19513fbf115a3dddacb62855f8fa97bc71b9e33e0ef276e3cf86"} Mar 18 07:07:23 crc kubenswrapper[4917]: I0318 07:07:23.798089 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db0a703-a265-4128-8f61-309293963e63" path="/var/lib/kubelet/pods/3db0a703-a265-4128-8f61-309293963e63/volumes" Mar 18 07:07:24 crc kubenswrapper[4917]: I0318 07:07:24.094447 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 07:07:24 crc kubenswrapper[4917]: I0318 07:07:24.135041 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad1b30db-147f-4314-8fbe-ba8aa096be57","Type":"ContainerStarted","Data":"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b"} Mar 18 07:07:24 crc kubenswrapper[4917]: I0318 07:07:24.135953 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"ad1b30db-147f-4314-8fbe-ba8aa096be57","Type":"ContainerStarted","Data":"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9"} Mar 18 07:07:24 crc kubenswrapper[4917]: I0318 07:07:24.162933 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.162909766 podStartE2EDuration="2.162909766s" podCreationTimestamp="2026-03-18 07:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:24.161024742 +0000 UTC m=+1229.102179476" watchObservedRunningTime="2026-03-18 07:07:24.162909766 +0000 UTC m=+1229.104064480" Mar 18 07:07:24 crc kubenswrapper[4917]: I0318 07:07:24.931739 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d75d99dc6-t8qt7" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:48804->10.217.0.164:9311: read: connection reset by peer" Mar 18 07:07:24 crc kubenswrapper[4917]: I0318 07:07:24.931747 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5d75d99dc6-t8qt7" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.164:9311/healthcheck\": read tcp 10.217.0.2:48802->10.217.0.164:9311: read: connection reset by peer" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.106908 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.147678 4917 generic.go:334] "Generic (PLEG): container finished" podID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerID="febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d" exitCode=0 Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.147759 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864fd895df-j2ff2" event={"ID":"b43d29c8-9556-4447-aff2-f375a671ef4f","Type":"ContainerDied","Data":"febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d"} Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.147771 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-864fd895df-j2ff2" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.147792 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-864fd895df-j2ff2" event={"ID":"b43d29c8-9556-4447-aff2-f375a671ef4f","Type":"ContainerDied","Data":"ff10a2f4a86d87476d5119ac936af6f4353a3483d8fb36c56f05728b6753cf5e"} Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.147812 4917 scope.go:117] "RemoveContainer" containerID="016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.150977 4917 generic.go:334] "Generic (PLEG): container finished" podID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerID="6fdfe90cf84e7de527631564d4574fee39bc40358870bd58a1a2804d7d68ec11" exitCode=0 Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.151189 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d75d99dc6-t8qt7" event={"ID":"558cd5bd-4452-464b-be7f-3fb70a6dabb9","Type":"ContainerDied","Data":"6fdfe90cf84e7de527631564d4574fee39bc40358870bd58a1a2804d7d68ec11"} Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.173711 4917 scope.go:117] "RemoveContainer" 
containerID="febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.200737 4917 scope.go:117] "RemoveContainer" containerID="016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5" Mar 18 07:07:25 crc kubenswrapper[4917]: E0318 07:07:25.204301 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5\": container with ID starting with 016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5 not found: ID does not exist" containerID="016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.204344 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5"} err="failed to get container status \"016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5\": rpc error: code = NotFound desc = could not find container \"016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5\": container with ID starting with 016af58af180bc43fbd15c8479b1f5aec70b0afe9cb2a758d509ead7b95adea5 not found: ID does not exist" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.204375 4917 scope.go:117] "RemoveContainer" containerID="febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d" Mar 18 07:07:25 crc kubenswrapper[4917]: E0318 07:07:25.205539 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d\": container with ID starting with febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d not found: ID does not exist" containerID="febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d" Mar 18 07:07:25 crc 
kubenswrapper[4917]: I0318 07:07:25.205575 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d"} err="failed to get container status \"febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d\": rpc error: code = NotFound desc = could not find container \"febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d\": container with ID starting with febcff036dd8dbaa6a518091ad4903960a3af08969a759c4d6af57115282421d not found: ID does not exist" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.216138 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-combined-ca-bundle\") pod \"b43d29c8-9556-4447-aff2-f375a671ef4f\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.216217 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-public-tls-certs\") pod \"b43d29c8-9556-4447-aff2-f375a671ef4f\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.216247 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-ovndb-tls-certs\") pod \"b43d29c8-9556-4447-aff2-f375a671ef4f\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.216322 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-httpd-config\") pod \"b43d29c8-9556-4447-aff2-f375a671ef4f\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " Mar 18 
07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.216391 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-config\") pod \"b43d29c8-9556-4447-aff2-f375a671ef4f\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.216420 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdb9l\" (UniqueName: \"kubernetes.io/projected/b43d29c8-9556-4447-aff2-f375a671ef4f-kube-api-access-kdb9l\") pod \"b43d29c8-9556-4447-aff2-f375a671ef4f\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.216485 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-internal-tls-certs\") pod \"b43d29c8-9556-4447-aff2-f375a671ef4f\" (UID: \"b43d29c8-9556-4447-aff2-f375a671ef4f\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.228280 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b43d29c8-9556-4447-aff2-f375a671ef4f-kube-api-access-kdb9l" (OuterVolumeSpecName: "kube-api-access-kdb9l") pod "b43d29c8-9556-4447-aff2-f375a671ef4f" (UID: "b43d29c8-9556-4447-aff2-f375a671ef4f"). InnerVolumeSpecName "kube-api-access-kdb9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.261885 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "b43d29c8-9556-4447-aff2-f375a671ef4f" (UID: "b43d29c8-9556-4447-aff2-f375a671ef4f"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.317901 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdb9l\" (UniqueName: \"kubernetes.io/projected/b43d29c8-9556-4447-aff2-f375a671ef4f-kube-api-access-kdb9l\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.319451 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.341690 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b43d29c8-9556-4447-aff2-f375a671ef4f" (UID: "b43d29c8-9556-4447-aff2-f375a671ef4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.351772 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-config" (OuterVolumeSpecName: "config") pod "b43d29c8-9556-4447-aff2-f375a671ef4f" (UID: "b43d29c8-9556-4447-aff2-f375a671ef4f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.361172 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b43d29c8-9556-4447-aff2-f375a671ef4f" (UID: "b43d29c8-9556-4447-aff2-f375a671ef4f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.361463 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b43d29c8-9556-4447-aff2-f375a671ef4f" (UID: "b43d29c8-9556-4447-aff2-f375a671ef4f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.364525 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "b43d29c8-9556-4447-aff2-f375a671ef4f" (UID: "b43d29c8-9556-4447-aff2-f375a671ef4f"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.391267 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.421323 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.421353 4917 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.421362 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.421374 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.421386 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b43d29c8-9556-4447-aff2-f375a671ef4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.488153 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-864fd895df-j2ff2"] Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.499267 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-864fd895df-j2ff2"] Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.522896 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmlw4\" (UniqueName: \"kubernetes.io/projected/558cd5bd-4452-464b-be7f-3fb70a6dabb9-kube-api-access-mmlw4\") pod 
\"558cd5bd-4452-464b-be7f-3fb70a6dabb9\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.522999 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data-custom\") pod \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.523099 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/558cd5bd-4452-464b-be7f-3fb70a6dabb9-logs\") pod \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.523163 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-combined-ca-bundle\") pod \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.523211 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data\") pod \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\" (UID: \"558cd5bd-4452-464b-be7f-3fb70a6dabb9\") " Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.523882 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/558cd5bd-4452-464b-be7f-3fb70a6dabb9-logs" (OuterVolumeSpecName: "logs") pod "558cd5bd-4452-464b-be7f-3fb70a6dabb9" (UID: "558cd5bd-4452-464b-be7f-3fb70a6dabb9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.526626 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558cd5bd-4452-464b-be7f-3fb70a6dabb9-kube-api-access-mmlw4" (OuterVolumeSpecName: "kube-api-access-mmlw4") pod "558cd5bd-4452-464b-be7f-3fb70a6dabb9" (UID: "558cd5bd-4452-464b-be7f-3fb70a6dabb9"). InnerVolumeSpecName "kube-api-access-mmlw4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.527435 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "558cd5bd-4452-464b-be7f-3fb70a6dabb9" (UID: "558cd5bd-4452-464b-be7f-3fb70a6dabb9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.546388 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "558cd5bd-4452-464b-be7f-3fb70a6dabb9" (UID: "558cd5bd-4452-464b-be7f-3fb70a6dabb9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.572454 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data" (OuterVolumeSpecName: "config-data") pod "558cd5bd-4452-464b-be7f-3fb70a6dabb9" (UID: "558cd5bd-4452-464b-be7f-3fb70a6dabb9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.624680 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.624909 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/558cd5bd-4452-464b-be7f-3fb70a6dabb9-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.624974 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.625028 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/558cd5bd-4452-464b-be7f-3fb70a6dabb9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.625079 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmlw4\" (UniqueName: \"kubernetes.io/projected/558cd5bd-4452-464b-be7f-3fb70a6dabb9-kube-api-access-mmlw4\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:25 crc kubenswrapper[4917]: I0318 07:07:25.790751 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" path="/var/lib/kubelet/pods/b43d29c8-9556-4447-aff2-f375a671ef4f/volumes" Mar 18 07:07:26 crc kubenswrapper[4917]: I0318 07:07:26.161140 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d75d99dc6-t8qt7" event={"ID":"558cd5bd-4452-464b-be7f-3fb70a6dabb9","Type":"ContainerDied","Data":"00fbee2b0b8ef3a98ebd3183572af4ff8404bcbaf4b36d407107f32f8b2aa1b5"} Mar 18 07:07:26 crc kubenswrapper[4917]: I0318 
07:07:26.161179 4917 scope.go:117] "RemoveContainer" containerID="6fdfe90cf84e7de527631564d4574fee39bc40358870bd58a1a2804d7d68ec11" Mar 18 07:07:26 crc kubenswrapper[4917]: I0318 07:07:26.161281 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d75d99dc6-t8qt7" Mar 18 07:07:26 crc kubenswrapper[4917]: I0318 07:07:26.181402 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d75d99dc6-t8qt7"] Mar 18 07:07:26 crc kubenswrapper[4917]: I0318 07:07:26.187725 4917 scope.go:117] "RemoveContainer" containerID="c4648d7557570eb59c55bf30078a84d901b2fe55bb25f477c0a54a5e49afac30" Mar 18 07:07:26 crc kubenswrapper[4917]: I0318 07:07:26.188706 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5d75d99dc6-t8qt7"] Mar 18 07:07:26 crc kubenswrapper[4917]: I0318 07:07:26.405116 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.425610 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.512176 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.664567 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6f74bf5646-8prg6"] Mar 18 07:07:27 crc kubenswrapper[4917]: E0318 07:07:27.665024 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api-log" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.665046 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api-log" Mar 18 07:07:27 crc kubenswrapper[4917]: E0318 07:07:27.665071 4917 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-httpd" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.665080 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-httpd" Mar 18 07:07:27 crc kubenswrapper[4917]: E0318 07:07:27.665117 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.665125 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api" Mar 18 07:07:27 crc kubenswrapper[4917]: E0318 07:07:27.665145 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-api" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.665154 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-api" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.665355 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-httpd" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.665373 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b43d29c8-9556-4447-aff2-f375a671ef4f" containerName="neutron-api" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.665402 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api-log" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.665414 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" containerName="barbican-api" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.666908 4917 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.689374 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f74bf5646-8prg6"] Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.767062 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-public-tls-certs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.767118 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-scripts\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.767143 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-internal-tls-certs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.767194 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-842zg\" (UniqueName: \"kubernetes.io/projected/2140bc3b-8c96-4226-b7c4-811b0724682d-kube-api-access-842zg\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.767217 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-config-data\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.767262 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-combined-ca-bundle\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.767297 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2140bc3b-8c96-4226-b7c4-811b0724682d-logs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.785733 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558cd5bd-4452-464b-be7f-3fb70a6dabb9" path="/var/lib/kubelet/pods/558cd5bd-4452-464b-be7f-3fb70a6dabb9/volumes" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.868608 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-public-tls-certs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.868700 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-scripts\") pod 
\"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.868734 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-internal-tls-certs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.868798 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-842zg\" (UniqueName: \"kubernetes.io/projected/2140bc3b-8c96-4226-b7c4-811b0724682d-kube-api-access-842zg\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.868827 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-config-data\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.868901 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-combined-ca-bundle\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.868937 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2140bc3b-8c96-4226-b7c4-811b0724682d-logs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") 
" pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.871395 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2140bc3b-8c96-4226-b7c4-811b0724682d-logs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.874403 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-public-tls-certs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.875475 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-config-data\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.883001 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-scripts\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.891275 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-internal-tls-certs\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.894827 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-combined-ca-bundle\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.895864 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-842zg\" (UniqueName: \"kubernetes.io/projected/2140bc3b-8c96-4226-b7c4-811b0724682d-kube-api-access-842zg\") pod \"placement-6f74bf5646-8prg6\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:27 crc kubenswrapper[4917]: I0318 07:07:27.987731 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:28 crc kubenswrapper[4917]: W0318 07:07:28.518035 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2140bc3b_8c96_4226_b7c4_811b0724682d.slice/crio-d41cab000d3a29afef33509ed9b90327b6aa87400a3addf1dcc69f927f8d56ae WatchSource:0}: Error finding container d41cab000d3a29afef33509ed9b90327b6aa87400a3addf1dcc69f927f8d56ae: Status 404 returned error can't find the container with id d41cab000d3a29afef33509ed9b90327b6aa87400a3addf1dcc69f927f8d56ae Mar 18 07:07:28 crc kubenswrapper[4917]: I0318 07:07:28.525125 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6f74bf5646-8prg6"] Mar 18 07:07:29 crc kubenswrapper[4917]: I0318 07:07:29.069934 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:07:29 crc kubenswrapper[4917]: I0318 07:07:29.201492 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f74bf5646-8prg6" 
event={"ID":"2140bc3b-8c96-4226-b7c4-811b0724682d","Type":"ContainerStarted","Data":"1592e72e88530b3687f95124325d50bfc66a649981b642fed4a4d25898d5c3d7"} Mar 18 07:07:29 crc kubenswrapper[4917]: I0318 07:07:29.201540 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f74bf5646-8prg6" event={"ID":"2140bc3b-8c96-4226-b7c4-811b0724682d","Type":"ContainerStarted","Data":"c1a1e3e9c9d8361d0c4155702ce697636d2fe57a0d4920a2496cd0fbb0e9a366"} Mar 18 07:07:29 crc kubenswrapper[4917]: I0318 07:07:29.201553 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f74bf5646-8prg6" event={"ID":"2140bc3b-8c96-4226-b7c4-811b0724682d","Type":"ContainerStarted","Data":"d41cab000d3a29afef33509ed9b90327b6aa87400a3addf1dcc69f927f8d56ae"} Mar 18 07:07:29 crc kubenswrapper[4917]: I0318 07:07:29.201654 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:29 crc kubenswrapper[4917]: I0318 07:07:29.201908 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:29 crc kubenswrapper[4917]: I0318 07:07:29.243371 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6f74bf5646-8prg6" podStartSLOduration=2.243350628 podStartE2EDuration="2.243350628s" podCreationTimestamp="2026-03-18 07:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:29.22269342 +0000 UTC m=+1234.163848154" watchObservedRunningTime="2026-03-18 07:07:29.243350628 +0000 UTC m=+1234.184505342" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.656375 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.658076 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.661225 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.661250 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.664101 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-tbkpm" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.669956 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.717157 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8p6z\" (UniqueName: \"kubernetes.io/projected/fd47f339-3398-4565-82f8-4b715e0b19a9-kube-api-access-n8p6z\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.717242 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.717326 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config-secret\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.717349 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.819120 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8p6z\" (UniqueName: \"kubernetes.io/projected/fd47f339-3398-4565-82f8-4b715e0b19a9-kube-api-access-n8p6z\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.819155 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.819228 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config-secret\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.819262 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.820752 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.826832 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config-secret\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.836689 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-combined-ca-bundle\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.849020 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8p6z\" (UniqueName: \"kubernetes.io/projected/fd47f339-3398-4565-82f8-4b715e0b19a9-kube-api-access-n8p6z\") pod \"openstackclient\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " pod="openstack/openstackclient" Mar 18 07:07:30 crc kubenswrapper[4917]: I0318 07:07:30.981657 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 07:07:31 crc kubenswrapper[4917]: I0318 07:07:31.473800 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 07:07:32 crc kubenswrapper[4917]: I0318 07:07:32.230260 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fd47f339-3398-4565-82f8-4b715e0b19a9","Type":"ContainerStarted","Data":"860bfc8e64cc23f4ed775c6323168555d18d9ea5d0992d66e87f434164c4d326"} Mar 18 07:07:32 crc kubenswrapper[4917]: I0318 07:07:32.774348 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.135987 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.136902 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="ceilometer-central-agent" containerID="cri-o://c4784213f6dfdc33a2715e839ad6fe12b6d12d6c90dac1fd28d25374e0ac1655" gracePeriod=30 Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.137540 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="proxy-httpd" containerID="cri-o://531ffe1b7cfc0b9f5e26ef62846e479e8cffd4ece59d49be663ff79e1852e9b0" gracePeriod=30 Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.137638 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="sg-core" containerID="cri-o://cb93167bee5a4a4269396e265133d3d22202227b4bb9bbe713e076bd8788da69" gracePeriod=30 Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.137714 4917 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="ceilometer-notification-agent" containerID="cri-o://24fd6ec4065ba9d97b15ee4a4825f12e37b49905e2a31cb36a22cfb331dfbb5c" gracePeriod=30 Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.148199 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.168:3000/\": EOF" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.310110 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7f954fcdf9-tt6r4"] Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.312282 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.319176 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.319484 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.319657 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.337489 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f954fcdf9-tt6r4"] Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.422835 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-config-data\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.422896 
4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-run-httpd\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.422919 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-combined-ca-bundle\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.422944 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-log-httpd\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.423100 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-etc-swift\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.423169 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-internal-tls-certs\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: 
I0318 07:07:35.423259 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpppx\" (UniqueName: \"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-kube-api-access-kpppx\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.423453 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-public-tls-certs\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.526049 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-public-tls-certs\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.526839 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-config-data\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.526869 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-run-httpd\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 
07:07:35.526889 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-combined-ca-bundle\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.526910 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-log-httpd\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.526940 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-etc-swift\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.526965 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-internal-tls-certs\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.526999 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpppx\" (UniqueName: \"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-kube-api-access-kpppx\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.527488 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-run-httpd\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.527735 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-log-httpd\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.541466 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-combined-ca-bundle\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.541951 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-internal-tls-certs\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.542682 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-config-data\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.543124 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-etc-swift\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.546304 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-public-tls-certs\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.547300 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpppx\" (UniqueName: \"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-kube-api-access-kpppx\") pod \"swift-proxy-7f954fcdf9-tt6r4\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:35 crc kubenswrapper[4917]: I0318 07:07:35.669142 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:36 crc kubenswrapper[4917]: I0318 07:07:36.269998 4917 generic.go:334] "Generic (PLEG): container finished" podID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerID="531ffe1b7cfc0b9f5e26ef62846e479e8cffd4ece59d49be663ff79e1852e9b0" exitCode=0 Mar 18 07:07:36 crc kubenswrapper[4917]: I0318 07:07:36.270028 4917 generic.go:334] "Generic (PLEG): container finished" podID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerID="cb93167bee5a4a4269396e265133d3d22202227b4bb9bbe713e076bd8788da69" exitCode=2 Mar 18 07:07:36 crc kubenswrapper[4917]: I0318 07:07:36.270035 4917 generic.go:334] "Generic (PLEG): container finished" podID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerID="24fd6ec4065ba9d97b15ee4a4825f12e37b49905e2a31cb36a22cfb331dfbb5c" exitCode=0 Mar 18 07:07:36 crc kubenswrapper[4917]: I0318 07:07:36.270042 4917 generic.go:334] "Generic (PLEG): container finished" podID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerID="c4784213f6dfdc33a2715e839ad6fe12b6d12d6c90dac1fd28d25374e0ac1655" exitCode=0 Mar 18 07:07:36 crc kubenswrapper[4917]: I0318 07:07:36.270058 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerDied","Data":"531ffe1b7cfc0b9f5e26ef62846e479e8cffd4ece59d49be663ff79e1852e9b0"} Mar 18 07:07:36 crc kubenswrapper[4917]: I0318 07:07:36.270081 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerDied","Data":"cb93167bee5a4a4269396e265133d3d22202227b4bb9bbe713e076bd8788da69"} Mar 18 07:07:36 crc kubenswrapper[4917]: I0318 07:07:36.270089 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerDied","Data":"24fd6ec4065ba9d97b15ee4a4825f12e37b49905e2a31cb36a22cfb331dfbb5c"} Mar 18 07:07:36 
crc kubenswrapper[4917]: I0318 07:07:36.270097 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerDied","Data":"c4784213f6dfdc33a2715e839ad6fe12b6d12d6c90dac1fd28d25374e0ac1655"} Mar 18 07:07:37 crc kubenswrapper[4917]: I0318 07:07:37.055498 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:07:37 crc kubenswrapper[4917]: I0318 07:07:37.056154 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-log" containerID="cri-o://d61507c00bd0791b6047696eff450e0fb3835214f876cc795fb1d3c450ce5f2e" gracePeriod=30 Mar 18 07:07:37 crc kubenswrapper[4917]: I0318 07:07:37.056265 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-httpd" containerID="cri-o://7a8f91c1a0d280d4171736d8452e848c62225dd3a215c3484db00414f3528971" gracePeriod=30 Mar 18 07:07:37 crc kubenswrapper[4917]: I0318 07:07:37.288569 4917 generic.go:334] "Generic (PLEG): container finished" podID="3519edfd-1fd3-415a-913a-71cd289d524a" containerID="d61507c00bd0791b6047696eff450e0fb3835214f876cc795fb1d3c450ce5f2e" exitCode=143 Mar 18 07:07:37 crc kubenswrapper[4917]: I0318 07:07:37.288634 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3519edfd-1fd3-415a-913a-71cd289d524a","Type":"ContainerDied","Data":"d61507c00bd0791b6047696eff450e0fb3835214f876cc795fb1d3c450ce5f2e"} Mar 18 07:07:37 crc kubenswrapper[4917]: I0318 07:07:37.926481 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:07:37 crc kubenswrapper[4917]: I0318 07:07:37.926979 4917 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerName="glance-log" containerID="cri-o://e98234ea84573f53bb65b8132490e1af2226b1eb3671188da6355c4b71933e2d" gracePeriod=30 Mar 18 07:07:37 crc kubenswrapper[4917]: I0318 07:07:37.927103 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerName="glance-httpd" containerID="cri-o://707fc93d69d62d504d4984b9bb73411cbdd19f1fbf74b2a04513c9762a172dd0" gracePeriod=30 Mar 18 07:07:38 crc kubenswrapper[4917]: I0318 07:07:38.302784 4917 generic.go:334] "Generic (PLEG): container finished" podID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerID="e98234ea84573f53bb65b8132490e1af2226b1eb3671188da6355c4b71933e2d" exitCode=143 Mar 18 07:07:38 crc kubenswrapper[4917]: I0318 07:07:38.303077 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4afa7651-3ebd-4549-a15e-1b3d9d5537db","Type":"ContainerDied","Data":"e98234ea84573f53bb65b8132490e1af2226b1eb3671188da6355c4b71933e2d"} Mar 18 07:07:40 crc kubenswrapper[4917]: I0318 07:07:40.205983 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.168:3000/\": dial tcp 10.217.0.168:3000: connect: connection refused" Mar 18 07:07:40 crc kubenswrapper[4917]: I0318 07:07:40.208997 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:37198->10.217.0.155:9292: read: connection reset by peer" Mar 18 07:07:40 crc kubenswrapper[4917]: I0318 07:07:40.209073 
4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:37210->10.217.0.155:9292: read: connection reset by peer" Mar 18 07:07:40 crc kubenswrapper[4917]: I0318 07:07:40.330251 4917 generic.go:334] "Generic (PLEG): container finished" podID="3519edfd-1fd3-415a-913a-71cd289d524a" containerID="7a8f91c1a0d280d4171736d8452e848c62225dd3a215c3484db00414f3528971" exitCode=0 Mar 18 07:07:40 crc kubenswrapper[4917]: I0318 07:07:40.330336 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3519edfd-1fd3-415a-913a-71cd289d524a","Type":"ContainerDied","Data":"7a8f91c1a0d280d4171736d8452e848c62225dd3a215c3484db00414f3528971"} Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.133569 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.187230 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.242284 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-scripts\") pod \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.242349 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-combined-ca-bundle\") pod \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.242407 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-run-httpd\") pod \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.242426 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-config-data\") pod \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.242469 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dnq9\" (UniqueName: \"kubernetes.io/projected/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-kube-api-access-7dnq9\") pod \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.242506 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-sg-core-conf-yaml\") pod \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.242552 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-log-httpd\") pod \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\" (UID: \"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.243150 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" (UID: "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.243229 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" (UID: "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.247276 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-kube-api-access-7dnq9" (OuterVolumeSpecName: "kube-api-access-7dnq9") pod "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" (UID: "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0"). InnerVolumeSpecName "kube-api-access-7dnq9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.252987 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-scripts" (OuterVolumeSpecName: "scripts") pod "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" (UID: "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.280157 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" (UID: "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.316749 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" (UID: "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.344809 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-httpd-run\") pod \"3519edfd-1fd3-415a-913a-71cd289d524a\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.344931 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3519edfd-1fd3-415a-913a-71cd289d524a\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.344979 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-internal-tls-certs\") pod \"3519edfd-1fd3-415a-913a-71cd289d524a\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345018 4917 generic.go:334] "Generic (PLEG): container finished" podID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerID="707fc93d69d62d504d4984b9bb73411cbdd19f1fbf74b2a04513c9762a172dd0" exitCode=0 Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345209 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fstwg\" (UniqueName: \"kubernetes.io/projected/3519edfd-1fd3-415a-913a-71cd289d524a-kube-api-access-fstwg\") pod \"3519edfd-1fd3-415a-913a-71cd289d524a\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345351 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-logs\") pod \"3519edfd-1fd3-415a-913a-71cd289d524a\" (UID: 
\"3519edfd-1fd3-415a-913a-71cd289d524a\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345375 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-scripts\") pod \"3519edfd-1fd3-415a-913a-71cd289d524a\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345451 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4afa7651-3ebd-4549-a15e-1b3d9d5537db","Type":"ContainerDied","Data":"707fc93d69d62d504d4984b9bb73411cbdd19f1fbf74b2a04513c9762a172dd0"} Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345536 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3519edfd-1fd3-415a-913a-71cd289d524a" (UID: "3519edfd-1fd3-415a-913a-71cd289d524a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345626 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-config-data\") pod \"3519edfd-1fd3-415a-913a-71cd289d524a\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345808 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-combined-ca-bundle\") pod \"3519edfd-1fd3-415a-913a-71cd289d524a\" (UID: \"3519edfd-1fd3-415a-913a-71cd289d524a\") " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.345997 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-logs" (OuterVolumeSpecName: "logs") pod "3519edfd-1fd3-415a-913a-71cd289d524a" (UID: "3519edfd-1fd3-415a-913a-71cd289d524a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.346620 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.346652 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.346664 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.346678 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.346689 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dnq9\" (UniqueName: \"kubernetes.io/projected/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-kube-api-access-7dnq9\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.346702 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.346713 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.346722 4917 reconciler_common.go:293] "Volume 
detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3519edfd-1fd3-415a-913a-71cd289d524a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.351836 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3519edfd-1fd3-415a-913a-71cd289d524a-kube-api-access-fstwg" (OuterVolumeSpecName: "kube-api-access-fstwg") pod "3519edfd-1fd3-415a-913a-71cd289d524a" (UID: "3519edfd-1fd3-415a-913a-71cd289d524a"). InnerVolumeSpecName "kube-api-access-fstwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.352698 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "3519edfd-1fd3-415a-913a-71cd289d524a" (UID: "3519edfd-1fd3-415a-913a-71cd289d524a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.352998 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-scripts" (OuterVolumeSpecName: "scripts") pod "3519edfd-1fd3-415a-913a-71cd289d524a" (UID: "3519edfd-1fd3-415a-913a-71cd289d524a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.353091 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3519edfd-1fd3-415a-913a-71cd289d524a","Type":"ContainerDied","Data":"6b63fb2b7875c47236753ae051a2755f4f4953bc497040216e9b4a25cb0e0ab7"} Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.353132 4917 scope.go:117] "RemoveContainer" containerID="7a8f91c1a0d280d4171736d8452e848c62225dd3a215c3484db00414f3528971" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.353257 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.362159 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fd47f339-3398-4565-82f8-4b715e0b19a9","Type":"ContainerStarted","Data":"c4b6c0f7460d0188f56f5fc10d1b3881ea6f8779c842ed7425cfa88b2e731c1c"} Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.380245 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-config-data" (OuterVolumeSpecName: "config-data") pod "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" (UID: "85d579cf-02f6-43b0-acd6-a2b9d00ce6a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.386881 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"85d579cf-02f6-43b0-acd6-a2b9d00ce6a0","Type":"ContainerDied","Data":"f2b5fc5c813e41eb4c0dacd4737d48891ace4019dcacc8dd224c78aa66096242"} Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.387146 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.389357 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.086887663 podStartE2EDuration="11.389335913s" podCreationTimestamp="2026-03-18 07:07:30 +0000 UTC" firstStartedPulling="2026-03-18 07:07:31.471113849 +0000 UTC m=+1236.412268563" lastFinishedPulling="2026-03-18 07:07:40.773562089 +0000 UTC m=+1245.714716813" observedRunningTime="2026-03-18 07:07:41.382022546 +0000 UTC m=+1246.323177260" watchObservedRunningTime="2026-03-18 07:07:41.389335913 +0000 UTC m=+1246.330490627" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.392119 4917 scope.go:117] "RemoveContainer" containerID="d61507c00bd0791b6047696eff450e0fb3835214f876cc795fb1d3c450ce5f2e" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.419133 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3519edfd-1fd3-415a-913a-71cd289d524a" (UID: "3519edfd-1fd3-415a-913a-71cd289d524a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.433363 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7f954fcdf9-tt6r4"] Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.440248 4917 scope.go:117] "RemoveContainer" containerID="531ffe1b7cfc0b9f5e26ef62846e479e8cffd4ece59d49be663ff79e1852e9b0" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.445841 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-config-data" (OuterVolumeSpecName: "config-data") pod "3519edfd-1fd3-415a-913a-71cd289d524a" (UID: "3519edfd-1fd3-415a-913a-71cd289d524a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.450005 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.450041 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.450052 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fstwg\" (UniqueName: \"kubernetes.io/projected/3519edfd-1fd3-415a-913a-71cd289d524a-kube-api-access-fstwg\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.450064 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.450072 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.450079 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.450735 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.461782 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:41 crc 
kubenswrapper[4917]: I0318 07:07:41.466794 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3519edfd-1fd3-415a-913a-71cd289d524a" (UID: "3519edfd-1fd3-415a-913a-71cd289d524a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.472798 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 07:07:41 crc kubenswrapper[4917]: E0318 07:07:41.473285 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="ceilometer-notification-agent"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.473353 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="ceilometer-notification-agent"
Mar 18 07:07:41 crc kubenswrapper[4917]: E0318 07:07:41.473437 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-httpd"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.473502 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-httpd"
Mar 18 07:07:41 crc kubenswrapper[4917]: E0318 07:07:41.473569 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="sg-core"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.473642 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="sg-core"
Mar 18 07:07:41 crc kubenswrapper[4917]: E0318 07:07:41.473703 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-log"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.473751 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-log"
Mar 18 07:07:41 crc kubenswrapper[4917]: E0318 07:07:41.473877 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="proxy-httpd"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.473941 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="proxy-httpd"
Mar 18 07:07:41 crc kubenswrapper[4917]: E0318 07:07:41.473994 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="ceilometer-central-agent"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.474049 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="ceilometer-central-agent"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.474259 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-httpd"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.474333 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="sg-core"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.474387 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="ceilometer-central-agent"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.474449 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="proxy-httpd"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.474505 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" containerName="ceilometer-notification-agent"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.474571 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" containerName="glance-log"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.476211 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.478303 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.479223 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.480123 4917 scope.go:117] "RemoveContainer" containerID="cb93167bee5a4a4269396e265133d3d22202227b4bb9bbe713e076bd8788da69"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.492356 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.505959 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.514300 4917 scope.go:117] "RemoveContainer" containerID="24fd6ec4065ba9d97b15ee4a4825f12e37b49905e2a31cb36a22cfb331dfbb5c"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.531077 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.533197 4917 scope.go:117] "RemoveContainer" containerID="c4784213f6dfdc33a2715e839ad6fe12b6d12d6c90dac1fd28d25374e0ac1655"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551350 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551449 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-scripts\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551475 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-log-httpd\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551500 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551562 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-config-data\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551609 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-run-httpd\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551682 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58sr6\" (UniqueName: \"kubernetes.io/projected/f6383507-902d-4956-9251-ed88757b2e98-kube-api-access-58sr6\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551747 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3519edfd-1fd3-415a-913a-71cd289d524a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.551762 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653040 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-httpd-run\") pod \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653150 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-scripts\") pod \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653182 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-config-data\") pod \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653228 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-logs\") pod \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653255 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-combined-ca-bundle\") pod \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653329 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fw5j\" (UniqueName: \"kubernetes.io/projected/4afa7651-3ebd-4549-a15e-1b3d9d5537db-kube-api-access-8fw5j\") pod \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653344 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-public-tls-certs\") pod \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653381 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\" (UID: \"4afa7651-3ebd-4549-a15e-1b3d9d5537db\") "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653537 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4afa7651-3ebd-4549-a15e-1b3d9d5537db" (UID: "4afa7651-3ebd-4549-a15e-1b3d9d5537db"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653652 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653726 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-config-data\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653758 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-run-httpd\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653825 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58sr6\" (UniqueName: \"kubernetes.io/projected/f6383507-902d-4956-9251-ed88757b2e98-kube-api-access-58sr6\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653864 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653913 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-scripts\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653929 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-log-httpd\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.653978 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.654307 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-logs" (OuterVolumeSpecName: "logs") pod "4afa7651-3ebd-4549-a15e-1b3d9d5537db" (UID: "4afa7651-3ebd-4549-a15e-1b3d9d5537db"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.654392 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-log-httpd\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.655769 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-run-httpd\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.660748 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4afa7651-3ebd-4549-a15e-1b3d9d5537db-kube-api-access-8fw5j" (OuterVolumeSpecName: "kube-api-access-8fw5j") pod "4afa7651-3ebd-4549-a15e-1b3d9d5537db" (UID: "4afa7651-3ebd-4549-a15e-1b3d9d5537db"). InnerVolumeSpecName "kube-api-access-8fw5j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.661710 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-scripts" (OuterVolumeSpecName: "scripts") pod "4afa7651-3ebd-4549-a15e-1b3d9d5537db" (UID: "4afa7651-3ebd-4549-a15e-1b3d9d5537db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.662177 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-config-data\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.663862 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.664291 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-scripts\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.664711 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "4afa7651-3ebd-4549-a15e-1b3d9d5537db" (UID: "4afa7651-3ebd-4549-a15e-1b3d9d5537db"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.669171 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.673896 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58sr6\" (UniqueName: \"kubernetes.io/projected/f6383507-902d-4956-9251-ed88757b2e98-kube-api-access-58sr6\") pod \"ceilometer-0\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.726169 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-config-data" (OuterVolumeSpecName: "config-data") pod "4afa7651-3ebd-4549-a15e-1b3d9d5537db" (UID: "4afa7651-3ebd-4549-a15e-1b3d9d5537db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.726777 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4afa7651-3ebd-4549-a15e-1b3d9d5537db" (UID: "4afa7651-3ebd-4549-a15e-1b3d9d5537db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.755537 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.755561 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.755570 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4afa7651-3ebd-4549-a15e-1b3d9d5537db-logs\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.755578 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.755606 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fw5j\" (UniqueName: \"kubernetes.io/projected/4afa7651-3ebd-4549-a15e-1b3d9d5537db-kube-api-access-8fw5j\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.755631 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.766035 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4afa7651-3ebd-4549-a15e-1b3d9d5537db" (UID: "4afa7651-3ebd-4549-a15e-1b3d9d5537db"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.783408 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d579cf-02f6-43b0-acd6-a2b9d00ce6a0" path="/var/lib/kubelet/pods/85d579cf-02f6-43b0-acd6-a2b9d00ce6a0/volumes"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.792572 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.813284 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.857665 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4afa7651-3ebd-4549-a15e-1b3d9d5537db-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:41 crc kubenswrapper[4917]: I0318 07:07:41.857703 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.151934 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-wj558"]
Mar 18 07:07:42 crc kubenswrapper[4917]: E0318 07:07:42.152669 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerName="glance-httpd"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.152688 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerName="glance-httpd"
Mar 18 07:07:42 crc kubenswrapper[4917]: E0318 07:07:42.152707 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerName="glance-log"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.152716 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerName="glance-log"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.153486 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerName="glance-httpd"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.153511 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" containerName="glance-log"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.154233 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wj558"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.171172 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wj558"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.265770 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpkl9\" (UniqueName: \"kubernetes.io/projected/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-kube-api-access-bpkl9\") pod \"nova-api-db-create-wj558\" (UID: \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\") " pod="openstack/nova-api-db-create-wj558"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.265918 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-operator-scripts\") pod \"nova-api-db-create-wj558\" (UID: \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\") " pod="openstack/nova-api-db-create-wj558"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.266070 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4ff4-account-create-update-htzjz"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.267240 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4ff4-account-create-update-htzjz"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.269532 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.281541 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-sr7h8"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.282780 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sr7h8"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.315624 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4ff4-account-create-update-htzjz"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.367531 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6ksl\" (UniqueName: \"kubernetes.io/projected/87c4981c-d457-45ea-8208-57085879a6f5-kube-api-access-l6ksl\") pod \"nova-cell0-db-create-sr7h8\" (UID: \"87c4981c-d457-45ea-8208-57085879a6f5\") " pod="openstack/nova-cell0-db-create-sr7h8"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.367646 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe31ee4c-2020-409c-a29c-e91f4107a3f3-operator-scripts\") pod \"nova-api-4ff4-account-create-update-htzjz\" (UID: \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\") " pod="openstack/nova-api-4ff4-account-create-update-htzjz"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.367678 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpkl9\" (UniqueName: \"kubernetes.io/projected/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-kube-api-access-bpkl9\") pod \"nova-api-db-create-wj558\" (UID: \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\") " pod="openstack/nova-api-db-create-wj558"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.367698 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6rdh\" (UniqueName: \"kubernetes.io/projected/fe31ee4c-2020-409c-a29c-e91f4107a3f3-kube-api-access-t6rdh\") pod \"nova-api-4ff4-account-create-update-htzjz\" (UID: \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\") " pod="openstack/nova-api-4ff4-account-create-update-htzjz"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.367714 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c4981c-d457-45ea-8208-57085879a6f5-operator-scripts\") pod \"nova-cell0-db-create-sr7h8\" (UID: \"87c4981c-d457-45ea-8208-57085879a6f5\") " pod="openstack/nova-cell0-db-create-sr7h8"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.367768 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-operator-scripts\") pod \"nova-api-db-create-wj558\" (UID: \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\") " pod="openstack/nova-api-db-create-wj558"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.368932 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-operator-scripts\") pod \"nova-api-db-create-wj558\" (UID: \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\") " pod="openstack/nova-api-db-create-wj558"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.373225 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sr7h8"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.394145 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpkl9\" (UniqueName: \"kubernetes.io/projected/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-kube-api-access-bpkl9\") pod \"nova-api-db-create-wj558\" (UID: \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\") " pod="openstack/nova-api-db-create-wj558"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.409042 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" event={"ID":"b7d0bad6-9874-40d7-8848-e138b487c00e","Type":"ContainerStarted","Data":"b7dd7e7dcd0ead924d5151b493b692b8cda6c133769ce8b19017ed20a1493c30"}
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.409385 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" event={"ID":"b7d0bad6-9874-40d7-8848-e138b487c00e","Type":"ContainerStarted","Data":"34cb7d14ec735ff0e328ee9c042c8e5f4b080009d5c97efbb47af3b0fca71c21"}
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.409398 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" event={"ID":"b7d0bad6-9874-40d7-8848-e138b487c00e","Type":"ContainerStarted","Data":"b9ac0a39b967b6f9fa2ed4e25c828093b3222b9edf5f2e0378ad05f5df9cb2b5"}
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.409430 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f954fcdf9-tt6r4"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.409448 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7f954fcdf9-tt6r4"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.417454 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerStarted","Data":"3b7a33b40d2b6ba2fb38b5d81b2c8d62bc40976cdca5f01a992041532fcd6b7e"}
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.417673 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.421470 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.421503 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4afa7651-3ebd-4549-a15e-1b3d9d5537db","Type":"ContainerDied","Data":"4c17dbb75a5b09cb0bbe0bc37363e26fcfcca0ca4a39d969fbeeefedd2407228"}
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.421530 4917 scope.go:117] "RemoveContainer" containerID="707fc93d69d62d504d4984b9bb73411cbdd19f1fbf74b2a04513c9762a172dd0"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.434347 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" podStartSLOduration=7.434329132 podStartE2EDuration="7.434329132s" podCreationTimestamp="2026-03-18 07:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:42.430852448 +0000 UTC m=+1247.372007162" watchObservedRunningTime="2026-03-18 07:07:42.434329132 +0000 UTC m=+1247.375483846"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.469701 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6ksl\" (UniqueName: \"kubernetes.io/projected/87c4981c-d457-45ea-8208-57085879a6f5-kube-api-access-l6ksl\") pod \"nova-cell0-db-create-sr7h8\" (UID: \"87c4981c-d457-45ea-8208-57085879a6f5\") " pod="openstack/nova-cell0-db-create-sr7h8"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.469795 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe31ee4c-2020-409c-a29c-e91f4107a3f3-operator-scripts\") pod \"nova-api-4ff4-account-create-update-htzjz\" (UID: \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\") " pod="openstack/nova-api-4ff4-account-create-update-htzjz"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.469818 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6rdh\" (UniqueName: \"kubernetes.io/projected/fe31ee4c-2020-409c-a29c-e91f4107a3f3-kube-api-access-t6rdh\") pod \"nova-api-4ff4-account-create-update-htzjz\" (UID: \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\") " pod="openstack/nova-api-4ff4-account-create-update-htzjz"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.469836 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c4981c-d457-45ea-8208-57085879a6f5-operator-scripts\") pod \"nova-cell0-db-create-sr7h8\" (UID: \"87c4981c-d457-45ea-8208-57085879a6f5\") " pod="openstack/nova-cell0-db-create-sr7h8"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.472331 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe31ee4c-2020-409c-a29c-e91f4107a3f3-operator-scripts\") pod \"nova-api-4ff4-account-create-update-htzjz\" (UID: \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\") " pod="openstack/nova-api-4ff4-account-create-update-htzjz"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.472518 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c4981c-d457-45ea-8208-57085879a6f5-operator-scripts\") pod \"nova-cell0-db-create-sr7h8\" (UID: \"87c4981c-d457-45ea-8208-57085879a6f5\") " pod="openstack/nova-cell0-db-create-sr7h8"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.473337 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wj558"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.479196 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4hnm7"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.482351 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4hnm7"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.483672 4917 scope.go:117] "RemoveContainer" containerID="e98234ea84573f53bb65b8132490e1af2226b1eb3671188da6355c4b71933e2d"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.500194 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6rdh\" (UniqueName: \"kubernetes.io/projected/fe31ee4c-2020-409c-a29c-e91f4107a3f3-kube-api-access-t6rdh\") pod \"nova-api-4ff4-account-create-update-htzjz\" (UID: \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\") " pod="openstack/nova-api-4ff4-account-create-update-htzjz"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.509677 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6ksl\" (UniqueName: \"kubernetes.io/projected/87c4981c-d457-45ea-8208-57085879a6f5-kube-api-access-l6ksl\") pod \"nova-cell0-db-create-sr7h8\" (UID: \"87c4981c-d457-45ea-8208-57085879a6f5\") " pod="openstack/nova-cell0-db-create-sr7h8"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.519718 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4hnm7"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.541673 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-de6b-account-create-update-j2zpb"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.542837 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-de6b-account-create-update-j2zpb"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.548572 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-de6b-account-create-update-j2zpb"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.551875 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.561130 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.569158 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.571858 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhbzd\" (UniqueName: \"kubernetes.io/projected/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-kube-api-access-mhbzd\") pod \"nova-cell1-db-create-4hnm7\" (UID: \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\") " pod="openstack/nova-cell1-db-create-4hnm7"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.571980 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-operator-scripts\") pod \"nova-cell1-db-create-4hnm7\" (UID: \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\") " pod="openstack/nova-cell1-db-create-4hnm7"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.578345 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.579630 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.582161 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bstgs"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.582371 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.582510 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.582555 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.583661 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4ff4-account-create-update-htzjz"
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.592937 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.603667 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-sr7h8" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.673526 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6hwl\" (UniqueName: \"kubernetes.io/projected/6a55197a-92c3-451c-9d5d-d3a6426c995b-kube-api-access-s6hwl\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.673572 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.673619 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-operator-scripts\") pod \"nova-cell1-db-create-4hnm7\" (UID: \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\") " pod="openstack/nova-cell1-db-create-4hnm7" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.673785 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psmwq\" (UniqueName: \"kubernetes.io/projected/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-kube-api-access-psmwq\") pod \"nova-cell0-de6b-account-create-update-j2zpb\" (UID: \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\") " pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.673971 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.674003 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.674048 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.674189 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.674228 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-logs\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.674233 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-operator-scripts\") pod \"nova-cell1-db-create-4hnm7\" (UID: \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\") " pod="openstack/nova-cell1-db-create-4hnm7" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.674264 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.674388 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhbzd\" (UniqueName: \"kubernetes.io/projected/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-kube-api-access-mhbzd\") pod \"nova-cell1-db-create-4hnm7\" (UID: \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\") " pod="openstack/nova-cell1-db-create-4hnm7" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.674431 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-operator-scripts\") pod \"nova-cell0-de6b-account-create-update-j2zpb\" (UID: \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\") " pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.688088 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8033-account-create-update-dxbxz"] Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.689159 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.695244 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.701949 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-dxbxz"] Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.716697 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhbzd\" (UniqueName: \"kubernetes.io/projected/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-kube-api-access-mhbzd\") pod \"nova-cell1-db-create-4hnm7\" (UID: \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\") " pod="openstack/nova-cell1-db-create-4hnm7" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.775952 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6hwl\" (UniqueName: \"kubernetes.io/projected/6a55197a-92c3-451c-9d5d-d3a6426c995b-kube-api-access-s6hwl\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776016 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776052 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psmwq\" (UniqueName: \"kubernetes.io/projected/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-kube-api-access-psmwq\") pod \"nova-cell0-de6b-account-create-update-j2zpb\" (UID: \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\") " 
pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776090 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48d7dad-f180-4cbd-bd77-906daa3558ed-operator-scripts\") pod \"nova-cell1-8033-account-create-update-dxbxz\" (UID: \"f48d7dad-f180-4cbd-bd77-906daa3558ed\") " pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776117 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j59l\" (UniqueName: \"kubernetes.io/projected/f48d7dad-f180-4cbd-bd77-906daa3558ed-kube-api-access-6j59l\") pod \"nova-cell1-8033-account-create-update-dxbxz\" (UID: \"f48d7dad-f180-4cbd-bd77-906daa3558ed\") " pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776151 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776168 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776195 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776236 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776251 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-logs\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776268 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.776306 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-operator-scripts\") pod \"nova-cell0-de6b-account-create-update-j2zpb\" (UID: \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\") " pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.777171 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-operator-scripts\") pod \"nova-cell0-de6b-account-create-update-j2zpb\" (UID: 
\"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\") " pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.779655 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.780087 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-logs\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.792547 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-config-data\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.793064 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.794062 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " 
pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.801847 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.802973 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6hwl\" (UniqueName: \"kubernetes.io/projected/6a55197a-92c3-451c-9d5d-d3a6426c995b-kube-api-access-s6hwl\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.816198 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psmwq\" (UniqueName: \"kubernetes.io/projected/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-kube-api-access-psmwq\") pod \"nova-cell0-de6b-account-create-update-j2zpb\" (UID: \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\") " pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.826570 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-scripts\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.834530 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4hnm7" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.854448 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") " pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.891354 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.891748 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48d7dad-f180-4cbd-bd77-906daa3558ed-operator-scripts\") pod \"nova-cell1-8033-account-create-update-dxbxz\" (UID: \"f48d7dad-f180-4cbd-bd77-906daa3558ed\") " pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.891810 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j59l\" (UniqueName: \"kubernetes.io/projected/f48d7dad-f180-4cbd-bd77-906daa3558ed-kube-api-access-6j59l\") pod \"nova-cell1-8033-account-create-update-dxbxz\" (UID: \"f48d7dad-f180-4cbd-bd77-906daa3558ed\") " pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.893370 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48d7dad-f180-4cbd-bd77-906daa3558ed-operator-scripts\") pod \"nova-cell1-8033-account-create-update-dxbxz\" (UID: \"f48d7dad-f180-4cbd-bd77-906daa3558ed\") " pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.908578 4917 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 07:07:42 crc kubenswrapper[4917]: I0318 07:07:42.914124 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j59l\" (UniqueName: \"kubernetes.io/projected/f48d7dad-f180-4cbd-bd77-906daa3558ed-kube-api-access-6j59l\") pod \"nova-cell1-8033-account-create-update-dxbxz\" (UID: \"f48d7dad-f180-4cbd-bd77-906daa3558ed\") " pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.092608 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wj558"] Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.201544 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-sr7h8"] Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.204965 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.268022 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4ff4-account-create-update-htzjz"] Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.403432 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4hnm7"] Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.438460 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4ff4-account-create-update-htzjz" event={"ID":"fe31ee4c-2020-409c-a29c-e91f4107a3f3","Type":"ContainerStarted","Data":"9b3dd81a6e9f3f9927e9c22f8d9f2f6eb9a17f4361d608a7a51dd0eb75d67993"} Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.478788 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-de6b-account-create-update-j2zpb"] Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.501141 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-wj558" event={"ID":"25920c8e-4c1a-42d3-b021-de1b7f8b39d8","Type":"ContainerStarted","Data":"259eaeef31695e4a95c6ead3dec4e94a3b00c523ac3406e2f36ae99dc209ff02"} Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.501217 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wj558" event={"ID":"25920c8e-4c1a-42d3-b021-de1b7f8b39d8","Type":"ContainerStarted","Data":"ea7a9dee6da020768ab3205c3e38f190553946097ab23011e8c61ba4cc59f1f8"} Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.504758 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sr7h8" event={"ID":"87c4981c-d457-45ea-8208-57085879a6f5","Type":"ContainerStarted","Data":"5391c5258e30de289cb97f9720bd82ec551e1e54dc8c0276f4834d6ad1838641"} Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.513271 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4hnm7" event={"ID":"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5","Type":"ContainerStarted","Data":"1b9df155cf07c6530f1c51e08aadd546ca01db44f9da54ce9ea5fc1721a09d1e"} Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.519283 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerStarted","Data":"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f"} Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.529429 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-wj558" podStartSLOduration=1.52941253 podStartE2EDuration="1.52941253s" podCreationTimestamp="2026-03-18 07:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:43.519714996 +0000 UTC m=+1248.460869710" watchObservedRunningTime="2026-03-18 07:07:43.52941253 +0000 UTC m=+1248.470567244" Mar 18 
07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.568217 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.795987 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4afa7651-3ebd-4549-a15e-1b3d9d5537db" path="/var/lib/kubelet/pods/4afa7651-3ebd-4549-a15e-1b3d9d5537db/volumes" Mar 18 07:07:43 crc kubenswrapper[4917]: I0318 07:07:43.797575 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-dxbxz"] Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.549426 4917 generic.go:334] "Generic (PLEG): container finished" podID="f48d7dad-f180-4cbd-bd77-906daa3558ed" containerID="ffdaf1affb8cdd37abdd624f362cd39b00ca3fc290879d1fa10e21ae85d99228" exitCode=0 Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.549639 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8033-account-create-update-dxbxz" event={"ID":"f48d7dad-f180-4cbd-bd77-906daa3558ed","Type":"ContainerDied","Data":"ffdaf1affb8cdd37abdd624f362cd39b00ca3fc290879d1fa10e21ae85d99228"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.549822 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8033-account-create-update-dxbxz" event={"ID":"f48d7dad-f180-4cbd-bd77-906daa3558ed","Type":"ContainerStarted","Data":"2a374fa0a14f980112f70c75372e837a6c67e37c9879b9614e5513ab0911ffc8"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.551343 4917 generic.go:334] "Generic (PLEG): container finished" podID="1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5" containerID="d6e8064caadde781912b56059dd14c3925340d5dcf0edfb08f0a245a4f8a6412" exitCode=0 Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.551383 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4hnm7" 
event={"ID":"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5","Type":"ContainerDied","Data":"d6e8064caadde781912b56059dd14c3925340d5dcf0edfb08f0a245a4f8a6412"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.555658 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerStarted","Data":"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.558882 4917 generic.go:334] "Generic (PLEG): container finished" podID="fe31ee4c-2020-409c-a29c-e91f4107a3f3" containerID="e7762f0ffafe3ca8573f7fd63a2d74e00c6e853e9b80b754389deaea5ffccc20" exitCode=0 Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.559008 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4ff4-account-create-update-htzjz" event={"ID":"fe31ee4c-2020-409c-a29c-e91f4107a3f3","Type":"ContainerDied","Data":"e7762f0ffafe3ca8573f7fd63a2d74e00c6e853e9b80b754389deaea5ffccc20"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.561524 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a55197a-92c3-451c-9d5d-d3a6426c995b","Type":"ContainerStarted","Data":"8eda75a9630a00d3bc84d4255c1197ed99b2146ee96f3b0c2b6c80a91b3dc188"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.561551 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a55197a-92c3-451c-9d5d-d3a6426c995b","Type":"ContainerStarted","Data":"87d09ffc94d20217be7a87bd5a256bac55d1b3e9873f7cfcbe7bfad674cfe67f"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.564825 4917 generic.go:334] "Generic (PLEG): container finished" podID="25920c8e-4c1a-42d3-b021-de1b7f8b39d8" containerID="259eaeef31695e4a95c6ead3dec4e94a3b00c523ac3406e2f36ae99dc209ff02" exitCode=0 Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.564898 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wj558" event={"ID":"25920c8e-4c1a-42d3-b021-de1b7f8b39d8","Type":"ContainerDied","Data":"259eaeef31695e4a95c6ead3dec4e94a3b00c523ac3406e2f36ae99dc209ff02"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.567010 4917 generic.go:334] "Generic (PLEG): container finished" podID="87c4981c-d457-45ea-8208-57085879a6f5" containerID="207b783340dbb078486c8bbf78cc7f1b5f6a610500418d725d472abd712d7974" exitCode=0 Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.567051 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sr7h8" event={"ID":"87c4981c-d457-45ea-8208-57085879a6f5","Type":"ContainerDied","Data":"207b783340dbb078486c8bbf78cc7f1b5f6a610500418d725d472abd712d7974"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.603136 4917 generic.go:334] "Generic (PLEG): container finished" podID="52afc7b2-83cd-41dc-bdb5-b764acb0af7b" containerID="9a35ebbfeb0fef433b98cf0eda6e6f539c140c481a83d869a292012ac839622b" exitCode=0 Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.604270 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" event={"ID":"52afc7b2-83cd-41dc-bdb5-b764acb0af7b","Type":"ContainerDied","Data":"9a35ebbfeb0fef433b98cf0eda6e6f539c140c481a83d869a292012ac839622b"} Mar 18 07:07:44 crc kubenswrapper[4917]: I0318 07:07:44.604308 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" event={"ID":"52afc7b2-83cd-41dc-bdb5-b764acb0af7b","Type":"ContainerStarted","Data":"cf29dc22bafdc4abf9f93812ccdcfe939490ef4a7ce0f51cd289988ec27ee235"} Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.063572 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.162596 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48d7dad-f180-4cbd-bd77-906daa3558ed-operator-scripts\") pod \"f48d7dad-f180-4cbd-bd77-906daa3558ed\" (UID: \"f48d7dad-f180-4cbd-bd77-906daa3558ed\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.162833 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6j59l\" (UniqueName: \"kubernetes.io/projected/f48d7dad-f180-4cbd-bd77-906daa3558ed-kube-api-access-6j59l\") pod \"f48d7dad-f180-4cbd-bd77-906daa3558ed\" (UID: \"f48d7dad-f180-4cbd-bd77-906daa3558ed\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.163998 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f48d7dad-f180-4cbd-bd77-906daa3558ed-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f48d7dad-f180-4cbd-bd77-906daa3558ed" (UID: "f48d7dad-f180-4cbd-bd77-906daa3558ed"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.174889 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f48d7dad-f180-4cbd-bd77-906daa3558ed-kube-api-access-6j59l" (OuterVolumeSpecName: "kube-api-access-6j59l") pod "f48d7dad-f180-4cbd-bd77-906daa3558ed" (UID: "f48d7dad-f180-4cbd-bd77-906daa3558ed"). InnerVolumeSpecName "kube-api-access-6j59l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.264728 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6j59l\" (UniqueName: \"kubernetes.io/projected/f48d7dad-f180-4cbd-bd77-906daa3558ed-kube-api-access-6j59l\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.265053 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48d7dad-f180-4cbd-bd77-906daa3558ed-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.289825 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.324933 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4ff4-account-create-update-htzjz" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.332865 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sr7h8" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.348983 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4hnm7" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.362180 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-wj558" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.366831 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psmwq\" (UniqueName: \"kubernetes.io/projected/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-kube-api-access-psmwq\") pod \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\" (UID: \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.366907 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-operator-scripts\") pod \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\" (UID: \"52afc7b2-83cd-41dc-bdb5-b764acb0af7b\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.367927 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "52afc7b2-83cd-41dc-bdb5-b764acb0af7b" (UID: "52afc7b2-83cd-41dc-bdb5-b764acb0af7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.376035 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-kube-api-access-psmwq" (OuterVolumeSpecName: "kube-api-access-psmwq") pod "52afc7b2-83cd-41dc-bdb5-b764acb0af7b" (UID: "52afc7b2-83cd-41dc-bdb5-b764acb0af7b"). InnerVolumeSpecName "kube-api-access-psmwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.468960 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpkl9\" (UniqueName: \"kubernetes.io/projected/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-kube-api-access-bpkl9\") pod \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\" (UID: \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.469267 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6ksl\" (UniqueName: \"kubernetes.io/projected/87c4981c-d457-45ea-8208-57085879a6f5-kube-api-access-l6ksl\") pod \"87c4981c-d457-45ea-8208-57085879a6f5\" (UID: \"87c4981c-d457-45ea-8208-57085879a6f5\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.469361 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe31ee4c-2020-409c-a29c-e91f4107a3f3-operator-scripts\") pod \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\" (UID: \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.469471 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhbzd\" (UniqueName: \"kubernetes.io/projected/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-kube-api-access-mhbzd\") pod \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\" (UID: \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.469529 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-operator-scripts\") pod \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\" (UID: \"25920c8e-4c1a-42d3-b021-de1b7f8b39d8\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.469579 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c4981c-d457-45ea-8208-57085879a6f5-operator-scripts\") pod \"87c4981c-d457-45ea-8208-57085879a6f5\" (UID: \"87c4981c-d457-45ea-8208-57085879a6f5\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.469620 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6rdh\" (UniqueName: \"kubernetes.io/projected/fe31ee4c-2020-409c-a29c-e91f4107a3f3-kube-api-access-t6rdh\") pod \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\" (UID: \"fe31ee4c-2020-409c-a29c-e91f4107a3f3\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.469651 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-operator-scripts\") pod \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\" (UID: \"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5\") " Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.469951 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe31ee4c-2020-409c-a29c-e91f4107a3f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe31ee4c-2020-409c-a29c-e91f4107a3f3" (UID: "fe31ee4c-2020-409c-a29c-e91f4107a3f3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.470277 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psmwq\" (UniqueName: \"kubernetes.io/projected/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-kube-api-access-psmwq\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.470296 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52afc7b2-83cd-41dc-bdb5-b764acb0af7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.470307 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe31ee4c-2020-409c-a29c-e91f4107a3f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.472507 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "25920c8e-4c1a-42d3-b021-de1b7f8b39d8" (UID: "25920c8e-4c1a-42d3-b021-de1b7f8b39d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.472542 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5" (UID: "1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.472697 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-kube-api-access-bpkl9" (OuterVolumeSpecName: "kube-api-access-bpkl9") pod "25920c8e-4c1a-42d3-b021-de1b7f8b39d8" (UID: "25920c8e-4c1a-42d3-b021-de1b7f8b39d8"). InnerVolumeSpecName "kube-api-access-bpkl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.473427 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c4981c-d457-45ea-8208-57085879a6f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87c4981c-d457-45ea-8208-57085879a6f5" (UID: "87c4981c-d457-45ea-8208-57085879a6f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.475523 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe31ee4c-2020-409c-a29c-e91f4107a3f3-kube-api-access-t6rdh" (OuterVolumeSpecName: "kube-api-access-t6rdh") pod "fe31ee4c-2020-409c-a29c-e91f4107a3f3" (UID: "fe31ee4c-2020-409c-a29c-e91f4107a3f3"). InnerVolumeSpecName "kube-api-access-t6rdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.475687 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c4981c-d457-45ea-8208-57085879a6f5-kube-api-access-l6ksl" (OuterVolumeSpecName: "kube-api-access-l6ksl") pod "87c4981c-d457-45ea-8208-57085879a6f5" (UID: "87c4981c-d457-45ea-8208-57085879a6f5"). InnerVolumeSpecName "kube-api-access-l6ksl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.475699 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-kube-api-access-mhbzd" (OuterVolumeSpecName: "kube-api-access-mhbzd") pod "1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5" (UID: "1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5"). InnerVolumeSpecName "kube-api-access-mhbzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.571360 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhbzd\" (UniqueName: \"kubernetes.io/projected/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-kube-api-access-mhbzd\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.571387 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.571396 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87c4981c-d457-45ea-8208-57085879a6f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.571407 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6rdh\" (UniqueName: \"kubernetes.io/projected/fe31ee4c-2020-409c-a29c-e91f4107a3f3-kube-api-access-t6rdh\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.571418 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.571428 4917 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-bpkl9\" (UniqueName: \"kubernetes.io/projected/25920c8e-4c1a-42d3-b021-de1b7f8b39d8-kube-api-access-bpkl9\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.571438 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6ksl\" (UniqueName: \"kubernetes.io/projected/87c4981c-d457-45ea-8208-57085879a6f5-kube-api-access-l6ksl\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.624732 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8033-account-create-update-dxbxz" event={"ID":"f48d7dad-f180-4cbd-bd77-906daa3558ed","Type":"ContainerDied","Data":"2a374fa0a14f980112f70c75372e837a6c67e37c9879b9614e5513ab0911ffc8"} Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.624777 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a374fa0a14f980112f70c75372e837a6c67e37c9879b9614e5513ab0911ffc8" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.624835 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8033-account-create-update-dxbxz" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.628895 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4hnm7" event={"ID":"1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5","Type":"ContainerDied","Data":"1b9df155cf07c6530f1c51e08aadd546ca01db44f9da54ce9ea5fc1721a09d1e"} Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.628918 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4hnm7" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.628953 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b9df155cf07c6530f1c51e08aadd546ca01db44f9da54ce9ea5fc1721a09d1e" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.632224 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerStarted","Data":"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472"} Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.639695 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4ff4-account-create-update-htzjz" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.639685 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4ff4-account-create-update-htzjz" event={"ID":"fe31ee4c-2020-409c-a29c-e91f4107a3f3","Type":"ContainerDied","Data":"9b3dd81a6e9f3f9927e9c22f8d9f2f6eb9a17f4361d608a7a51dd0eb75d67993"} Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.639825 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b3dd81a6e9f3f9927e9c22f8d9f2f6eb9a17f4361d608a7a51dd0eb75d67993" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.641639 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a55197a-92c3-451c-9d5d-d3a6426c995b","Type":"ContainerStarted","Data":"ea57b6c51e2a6d56fc3ad907733130bd48fe5600650c29357de522f2e527c1cf"} Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.644399 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wj558" event={"ID":"25920c8e-4c1a-42d3-b021-de1b7f8b39d8","Type":"ContainerDied","Data":"ea7a9dee6da020768ab3205c3e38f190553946097ab23011e8c61ba4cc59f1f8"} Mar 18 07:07:46 crc kubenswrapper[4917]: 
I0318 07:07:46.644435 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea7a9dee6da020768ab3205c3e38f190553946097ab23011e8c61ba4cc59f1f8" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.644413 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wj558" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.646776 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-sr7h8" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.646787 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-sr7h8" event={"ID":"87c4981c-d457-45ea-8208-57085879a6f5","Type":"ContainerDied","Data":"5391c5258e30de289cb97f9720bd82ec551e1e54dc8c0276f4834d6ad1838641"} Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.646961 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5391c5258e30de289cb97f9720bd82ec551e1e54dc8c0276f4834d6ad1838641" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.654792 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" event={"ID":"52afc7b2-83cd-41dc-bdb5-b764acb0af7b","Type":"ContainerDied","Data":"cf29dc22bafdc4abf9f93812ccdcfe939490ef4a7ce0f51cd289988ec27ee235"} Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.654830 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf29dc22bafdc4abf9f93812ccdcfe939490ef4a7ce0f51cd289988ec27ee235" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.654882 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-de6b-account-create-update-j2zpb" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.678624 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.678604212 podStartE2EDuration="4.678604212s" podCreationTimestamp="2026-03-18 07:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:07:46.664369469 +0000 UTC m=+1251.605524193" watchObservedRunningTime="2026-03-18 07:07:46.678604212 +0000 UTC m=+1251.619758946" Mar 18 07:07:46 crc kubenswrapper[4917]: I0318 07:07:46.745817 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.799835 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qj2wp"] Mar 18 07:07:47 crc kubenswrapper[4917]: E0318 07:07:47.800507 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c4981c-d457-45ea-8208-57085879a6f5" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800521 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c4981c-d457-45ea-8208-57085879a6f5" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: E0318 07:07:47.800539 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800547 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: E0318 07:07:47.800568 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe31ee4c-2020-409c-a29c-e91f4107a3f3" 
containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800576 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe31ee4c-2020-409c-a29c-e91f4107a3f3" containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: E0318 07:07:47.800618 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48d7dad-f180-4cbd-bd77-906daa3558ed" containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800627 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48d7dad-f180-4cbd-bd77-906daa3558ed" containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: E0318 07:07:47.800647 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52afc7b2-83cd-41dc-bdb5-b764acb0af7b" containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800655 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52afc7b2-83cd-41dc-bdb5-b764acb0af7b" containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: E0318 07:07:47.800678 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25920c8e-4c1a-42d3-b021-de1b7f8b39d8" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800686 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="25920c8e-4c1a-42d3-b021-de1b7f8b39d8" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800876 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe31ee4c-2020-409c-a29c-e91f4107a3f3" containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800888 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="25920c8e-4c1a-42d3-b021-de1b7f8b39d8" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 
07:07:47.800897 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48d7dad-f180-4cbd-bd77-906daa3558ed" containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800909 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800918 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="52afc7b2-83cd-41dc-bdb5-b764acb0af7b" containerName="mariadb-account-create-update" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.800930 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c4981c-d457-45ea-8208-57085879a6f5" containerName="mariadb-database-create" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.801444 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.804491 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7kdkf" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.804520 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.804660 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.820801 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qj2wp"] Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.906370 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-scripts\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: 
\"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.906455 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-config-data\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.906482 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:47 crc kubenswrapper[4917]: I0318 07:07:47.906546 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9gqp\" (UniqueName: \"kubernetes.io/projected/7011ea08-beba-47f5-95e3-bb13aead931a-kube-api-access-t9gqp\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.008133 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-scripts\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.009288 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-config-data\") 
pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.009337 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.009448 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9gqp\" (UniqueName: \"kubernetes.io/projected/7011ea08-beba-47f5-95e3-bb13aead931a-kube-api-access-t9gqp\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.011843 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-scripts\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.013056 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.014279 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-config-data\") pod 
\"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.028020 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9gqp\" (UniqueName: \"kubernetes.io/projected/7011ea08-beba-47f5-95e3-bb13aead931a-kube-api-access-t9gqp\") pod \"nova-cell0-conductor-db-sync-qj2wp\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.131877 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.587998 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qj2wp"] Mar 18 07:07:48 crc kubenswrapper[4917]: W0318 07:07:48.589082 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7011ea08_beba_47f5_95e3_bb13aead931a.slice/crio-53a6304199a918f6a64c77d1d74bf49b67a5f010bf995f8b8cad4854f8b29978 WatchSource:0}: Error finding container 53a6304199a918f6a64c77d1d74bf49b67a5f010bf995f8b8cad4854f8b29978: Status 404 returned error can't find the container with id 53a6304199a918f6a64c77d1d74bf49b67a5f010bf995f8b8cad4854f8b29978 Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.675635 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" event={"ID":"7011ea08-beba-47f5-95e3-bb13aead931a","Type":"ContainerStarted","Data":"53a6304199a918f6a64c77d1d74bf49b67a5f010bf995f8b8cad4854f8b29978"} Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.678512 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerStarted","Data":"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5"} Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.678693 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="ceilometer-central-agent" containerID="cri-o://49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f" gracePeriod=30 Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.678909 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.679156 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="proxy-httpd" containerID="cri-o://8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5" gracePeriod=30 Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.679207 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="sg-core" containerID="cri-o://939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472" gracePeriod=30 Mar 18 07:07:48 crc kubenswrapper[4917]: I0318 07:07:48.679240 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="ceilometer-notification-agent" containerID="cri-o://6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f" gracePeriod=30 Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.469959 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.538704 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58sr6\" (UniqueName: \"kubernetes.io/projected/f6383507-902d-4956-9251-ed88757b2e98-kube-api-access-58sr6\") pod \"f6383507-902d-4956-9251-ed88757b2e98\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.538765 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-run-httpd\") pod \"f6383507-902d-4956-9251-ed88757b2e98\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.538850 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-log-httpd\") pod \"f6383507-902d-4956-9251-ed88757b2e98\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.538867 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-config-data\") pod \"f6383507-902d-4956-9251-ed88757b2e98\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.538919 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-combined-ca-bundle\") pod \"f6383507-902d-4956-9251-ed88757b2e98\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.538978 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-scripts\") pod \"f6383507-902d-4956-9251-ed88757b2e98\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.539050 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-sg-core-conf-yaml\") pod \"f6383507-902d-4956-9251-ed88757b2e98\" (UID: \"f6383507-902d-4956-9251-ed88757b2e98\") " Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.539363 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f6383507-902d-4956-9251-ed88757b2e98" (UID: "f6383507-902d-4956-9251-ed88757b2e98"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.539523 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f6383507-902d-4956-9251-ed88757b2e98" (UID: "f6383507-902d-4956-9251-ed88757b2e98"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.539551 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.557750 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-scripts" (OuterVolumeSpecName: "scripts") pod "f6383507-902d-4956-9251-ed88757b2e98" (UID: "f6383507-902d-4956-9251-ed88757b2e98"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.574133 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6383507-902d-4956-9251-ed88757b2e98-kube-api-access-58sr6" (OuterVolumeSpecName: "kube-api-access-58sr6") pod "f6383507-902d-4956-9251-ed88757b2e98" (UID: "f6383507-902d-4956-9251-ed88757b2e98"). InnerVolumeSpecName "kube-api-access-58sr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.576656 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f6383507-902d-4956-9251-ed88757b2e98" (UID: "f6383507-902d-4956-9251-ed88757b2e98"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.640857 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58sr6\" (UniqueName: \"kubernetes.io/projected/f6383507-902d-4956-9251-ed88757b2e98-kube-api-access-58sr6\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.640892 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6383507-902d-4956-9251-ed88757b2e98-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.640905 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.640917 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.647404 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6383507-902d-4956-9251-ed88757b2e98" (UID: "f6383507-902d-4956-9251-ed88757b2e98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.653257 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-config-data" (OuterVolumeSpecName: "config-data") pod "f6383507-902d-4956-9251-ed88757b2e98" (UID: "f6383507-902d-4956-9251-ed88757b2e98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697088 4917 generic.go:334] "Generic (PLEG): container finished" podID="f6383507-902d-4956-9251-ed88757b2e98" containerID="8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5" exitCode=0 Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697143 4917 generic.go:334] "Generic (PLEG): container finished" podID="f6383507-902d-4956-9251-ed88757b2e98" containerID="939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472" exitCode=2 Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697152 4917 generic.go:334] "Generic (PLEG): container finished" podID="f6383507-902d-4956-9251-ed88757b2e98" containerID="6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f" exitCode=0 Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697162 4917 generic.go:334] "Generic (PLEG): container finished" podID="f6383507-902d-4956-9251-ed88757b2e98" containerID="49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f" exitCode=0 Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697210 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerDied","Data":"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5"} Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697237 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerDied","Data":"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472"} Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697248 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerDied","Data":"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f"} Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697256 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerDied","Data":"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f"} Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697290 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6383507-902d-4956-9251-ed88757b2e98","Type":"ContainerDied","Data":"3b7a33b40d2b6ba2fb38b5d81b2c8d62bc40976cdca5f01a992041532fcd6b7e"} Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697306 4917 scope.go:117] "RemoveContainer" containerID="8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.697486 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.733942 4917 scope.go:117] "RemoveContainer" containerID="939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.742785 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.742814 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6383507-902d-4956-9251-ed88757b2e98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.744466 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.763919 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.797793 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6383507-902d-4956-9251-ed88757b2e98" path="/var/lib/kubelet/pods/f6383507-902d-4956-9251-ed88757b2e98/volumes" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.798857 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:49 crc kubenswrapper[4917]: E0318 07:07:49.799864 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="sg-core" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.799888 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="sg-core" Mar 18 07:07:49 crc kubenswrapper[4917]: E0318 07:07:49.799911 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="ceilometer-central-agent" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.799918 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="ceilometer-central-agent" Mar 18 07:07:49 crc kubenswrapper[4917]: E0318 07:07:49.799935 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="ceilometer-notification-agent" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.799943 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="ceilometer-notification-agent" Mar 18 07:07:49 crc kubenswrapper[4917]: E0318 07:07:49.799957 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="proxy-httpd" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.799967 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="proxy-httpd" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.800415 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="ceilometer-central-agent" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.800436 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="proxy-httpd" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.800449 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="sg-core" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.800463 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6383507-902d-4956-9251-ed88757b2e98" containerName="ceilometer-notification-agent" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.804120 4917 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.806241 4917 scope.go:117] "RemoveContainer" containerID="6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.806763 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.806815 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.806824 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.838046 4917 scope.go:117] "RemoveContainer" containerID="49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.859444 4917 scope.go:117] "RemoveContainer" containerID="8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5" Mar 18 07:07:49 crc kubenswrapper[4917]: E0318 07:07:49.859840 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": container with ID starting with 8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5 not found: ID does not exist" containerID="8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.859885 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5"} err="failed to get container status \"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": rpc error: code = NotFound desc = could not find container 
\"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": container with ID starting with 8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5 not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.859910 4917 scope.go:117] "RemoveContainer" containerID="939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472" Mar 18 07:07:49 crc kubenswrapper[4917]: E0318 07:07:49.860506 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": container with ID starting with 939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472 not found: ID does not exist" containerID="939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.860538 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472"} err="failed to get container status \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": rpc error: code = NotFound desc = could not find container \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": container with ID starting with 939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472 not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.860558 4917 scope.go:117] "RemoveContainer" containerID="6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f" Mar 18 07:07:49 crc kubenswrapper[4917]: E0318 07:07:49.860850 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": container with ID starting with 6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f not found: ID does not exist" 
containerID="6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.860874 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f"} err="failed to get container status \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": rpc error: code = NotFound desc = could not find container \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": container with ID starting with 6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.860898 4917 scope.go:117] "RemoveContainer" containerID="49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f" Mar 18 07:07:49 crc kubenswrapper[4917]: E0318 07:07:49.861247 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": container with ID starting with 49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f not found: ID does not exist" containerID="49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.861271 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f"} err="failed to get container status \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": rpc error: code = NotFound desc = could not find container \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": container with ID starting with 49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.861284 4917 scope.go:117] 
"RemoveContainer" containerID="8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.861727 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5"} err="failed to get container status \"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": rpc error: code = NotFound desc = could not find container \"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": container with ID starting with 8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5 not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.861752 4917 scope.go:117] "RemoveContainer" containerID="939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.862018 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472"} err="failed to get container status \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": rpc error: code = NotFound desc = could not find container \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": container with ID starting with 939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472 not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.862035 4917 scope.go:117] "RemoveContainer" containerID="6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.862348 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f"} err="failed to get container status \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": rpc error: code = 
NotFound desc = could not find container \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": container with ID starting with 6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.862365 4917 scope.go:117] "RemoveContainer" containerID="49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.862648 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f"} err="failed to get container status \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": rpc error: code = NotFound desc = could not find container \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": container with ID starting with 49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.862678 4917 scope.go:117] "RemoveContainer" containerID="8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.862994 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5"} err="failed to get container status \"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": rpc error: code = NotFound desc = could not find container \"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": container with ID starting with 8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5 not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.863017 4917 scope.go:117] "RemoveContainer" containerID="939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472" Mar 18 07:07:49 crc 
kubenswrapper[4917]: I0318 07:07:49.863317 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472"} err="failed to get container status \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": rpc error: code = NotFound desc = could not find container \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": container with ID starting with 939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472 not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.863355 4917 scope.go:117] "RemoveContainer" containerID="6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.863638 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f"} err="failed to get container status \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": rpc error: code = NotFound desc = could not find container \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": container with ID starting with 6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.863663 4917 scope.go:117] "RemoveContainer" containerID="49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.864073 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f"} err="failed to get container status \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": rpc error: code = NotFound desc = could not find container \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": container 
with ID starting with 49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.864094 4917 scope.go:117] "RemoveContainer" containerID="8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.864435 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5"} err="failed to get container status \"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": rpc error: code = NotFound desc = could not find container \"8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5\": container with ID starting with 8832c828f473a1ee6add7b62b731b1141de0d3d2c8da3abf60cc02555626d2a5 not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.864454 4917 scope.go:117] "RemoveContainer" containerID="939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.864819 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472"} err="failed to get container status \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": rpc error: code = NotFound desc = could not find container \"939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472\": container with ID starting with 939461f6d8eedfdc01f949b9a54d2b08a31ef7e4db1008ff697521a9237bb472 not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.864838 4917 scope.go:117] "RemoveContainer" containerID="6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.865240 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f"} err="failed to get container status \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": rpc error: code = NotFound desc = could not find container \"6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f\": container with ID starting with 6ef4d51a1f6c39f57e66409987d9db8923ff6b8949294ee4f74f92f8ef01a67f not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.865260 4917 scope.go:117] "RemoveContainer" containerID="49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.865678 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f"} err="failed to get container status \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": rpc error: code = NotFound desc = could not find container \"49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f\": container with ID starting with 49b7297f0a10c78caf2e34df6d24485a7fae1e2329e949c45fe417e6275c2c7f not found: ID does not exist" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.945884 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-config-data\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.945935 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-log-httpd\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:49 crc 
kubenswrapper[4917]: I0318 07:07:49.945954 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n9cq\" (UniqueName: \"kubernetes.io/projected/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-kube-api-access-2n9cq\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.945972 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.945996 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-run-httpd\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.946025 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-scripts\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.946076 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:49 crc kubenswrapper[4917]: I0318 07:07:49.964222 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/neutron-df967876c-494l9" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.034092 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5587f99f7b-qj9dj"] Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.034329 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5587f99f7b-qj9dj" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerName="neutron-api" containerID="cri-o://e928e0f57982db1ec1486f972001647c43daaf622da5dba2ab54070f700afb60" gracePeriod=30 Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.034414 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5587f99f7b-qj9dj" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerName="neutron-httpd" containerID="cri-o://f3989c857b08a9bc817fbf1d1c6766f330bfa4cc2c6d613a35e5eab43f105777" gracePeriod=30 Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.065070 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.065485 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-config-data\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.065530 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-log-httpd\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc 
kubenswrapper[4917]: I0318 07:07:50.065551 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n9cq\" (UniqueName: \"kubernetes.io/projected/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-kube-api-access-2n9cq\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.065572 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.065639 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-run-httpd\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.065700 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-scripts\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.066356 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-log-httpd\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.068920 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-run-httpd\") pod \"ceilometer-0\" 
(UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.077278 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.077321 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-scripts\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.078062 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-config-data\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.078308 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.095614 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n9cq\" (UniqueName: \"kubernetes.io/projected/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-kube-api-access-2n9cq\") pod \"ceilometer-0\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.124346 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.624304 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:50 crc kubenswrapper[4917]: W0318 07:07:50.632434 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ddcb8ec_31f8_4d6e_91f4_83d5537a20dc.slice/crio-9a3b6dd61bc6f9c07b9d11bd1976764e1ef627c91eb2398adf0bef287cc7ea75 WatchSource:0}: Error finding container 9a3b6dd61bc6f9c07b9d11bd1976764e1ef627c91eb2398adf0bef287cc7ea75: Status 404 returned error can't find the container with id 9a3b6dd61bc6f9c07b9d11bd1976764e1ef627c91eb2398adf0bef287cc7ea75 Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.683938 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.687404 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.710913 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerStarted","Data":"9a3b6dd61bc6f9c07b9d11bd1976764e1ef627c91eb2398adf0bef287cc7ea75"} Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.714101 4917 generic.go:334] "Generic (PLEG): container finished" podID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerID="f3989c857b08a9bc817fbf1d1c6766f330bfa4cc2c6d613a35e5eab43f105777" exitCode=0 Mar 18 07:07:50 crc kubenswrapper[4917]: I0318 07:07:50.714131 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5587f99f7b-qj9dj" event={"ID":"09ab5b75-3af1-4097-acb4-3dca0e3986c6","Type":"ContainerDied","Data":"f3989c857b08a9bc817fbf1d1c6766f330bfa4cc2c6d613a35e5eab43f105777"} Mar 18 07:07:51 crc 
kubenswrapper[4917]: I0318 07:07:51.726424 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerStarted","Data":"245ba55e672fdb718ef99d4f553f1eb46be5e64e2ad362be6d8902aa8556ed41"} Mar 18 07:07:52 crc kubenswrapper[4917]: I0318 07:07:52.737095 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerStarted","Data":"9b936e59ea438fb9339e8163868ab0d80ef3aa21be40b1d3519fc252bb7cee7c"} Mar 18 07:07:52 crc kubenswrapper[4917]: I0318 07:07:52.909525 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 07:07:52 crc kubenswrapper[4917]: I0318 07:07:52.909948 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 07:07:52 crc kubenswrapper[4917]: I0318 07:07:52.959640 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 07:07:52 crc kubenswrapper[4917]: I0318 07:07:52.966497 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 07:07:53 crc kubenswrapper[4917]: I0318 07:07:53.745320 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 07:07:53 crc kubenswrapper[4917]: I0318 07:07:53.745379 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 07:07:55 crc kubenswrapper[4917]: I0318 07:07:55.687942 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 07:07:55 crc kubenswrapper[4917]: I0318 07:07:55.692128 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Mar 18 07:07:55 crc kubenswrapper[4917]: I0318 07:07:55.793537 4917 generic.go:334] "Generic (PLEG): container finished" podID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerID="e928e0f57982db1ec1486f972001647c43daaf622da5dba2ab54070f700afb60" exitCode=0 Mar 18 07:07:55 crc kubenswrapper[4917]: I0318 07:07:55.818843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5587f99f7b-qj9dj" event={"ID":"09ab5b75-3af1-4097-acb4-3dca0e3986c6","Type":"ContainerDied","Data":"e928e0f57982db1ec1486f972001647c43daaf622da5dba2ab54070f700afb60"} Mar 18 07:07:56 crc kubenswrapper[4917]: I0318 07:07:56.518452 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.130184 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.211109 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.286604 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7fc585cd46-xw4c5"] Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.286844 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7fc585cd46-xw4c5" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerName="placement-log" containerID="cri-o://82b032445208a2c8a4834db5814b65f14434c9626d0d60b8ce4d9d30da355c1b" gracePeriod=30 Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.286972 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7fc585cd46-xw4c5" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerName="placement-api" containerID="cri-o://bec7bb8157826d333559c4ee554e54a0b97bbeb1ae1a10fa37e602246aa02d77" gracePeriod=30 
Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.554669 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.666723 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-config\") pod \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.667053 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-httpd-config\") pod \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.667124 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-ovndb-tls-certs\") pod \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.667260 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-combined-ca-bundle\") pod \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.667326 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmn5s\" (UniqueName: \"kubernetes.io/projected/09ab5b75-3af1-4097-acb4-3dca0e3986c6-kube-api-access-bmn5s\") pod \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\" (UID: \"09ab5b75-3af1-4097-acb4-3dca0e3986c6\") " Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 
07:07:59.673530 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "09ab5b75-3af1-4097-acb4-3dca0e3986c6" (UID: "09ab5b75-3af1-4097-acb4-3dca0e3986c6"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.678090 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ab5b75-3af1-4097-acb4-3dca0e3986c6-kube-api-access-bmn5s" (OuterVolumeSpecName: "kube-api-access-bmn5s") pod "09ab5b75-3af1-4097-acb4-3dca0e3986c6" (UID: "09ab5b75-3af1-4097-acb4-3dca0e3986c6"). InnerVolumeSpecName "kube-api-access-bmn5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.745320 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09ab5b75-3af1-4097-acb4-3dca0e3986c6" (UID: "09ab5b75-3af1-4097-acb4-3dca0e3986c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.755479 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-config" (OuterVolumeSpecName: "config") pod "09ab5b75-3af1-4097-acb4-3dca0e3986c6" (UID: "09ab5b75-3af1-4097-acb4-3dca0e3986c6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.777656 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.777696 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmn5s\" (UniqueName: \"kubernetes.io/projected/09ab5b75-3af1-4097-acb4-3dca0e3986c6-kube-api-access-bmn5s\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.777707 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.777717 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.788293 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "09ab5b75-3af1-4097-acb4-3dca0e3986c6" (UID: "09ab5b75-3af1-4097-acb4-3dca0e3986c6"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.859377 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5587f99f7b-qj9dj" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.859389 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5587f99f7b-qj9dj" event={"ID":"09ab5b75-3af1-4097-acb4-3dca0e3986c6","Type":"ContainerDied","Data":"e895464ae6ccbbe0b9c8d16edfd365abd8cf7141614b582bbf86320c0555a362"} Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.859518 4917 scope.go:117] "RemoveContainer" containerID="f3989c857b08a9bc817fbf1d1c6766f330bfa4cc2c6d613a35e5eab43f105777" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.862948 4917 generic.go:334] "Generic (PLEG): container finished" podID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerID="82b032445208a2c8a4834db5814b65f14434c9626d0d60b8ce4d9d30da355c1b" exitCode=143 Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.863110 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc585cd46-xw4c5" event={"ID":"c9492f8d-33b1-4ea7-9f85-0137cc2443ed","Type":"ContainerDied","Data":"82b032445208a2c8a4834db5814b65f14434c9626d0d60b8ce4d9d30da355c1b"} Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.879326 4917 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09ab5b75-3af1-4097-acb4-3dca0e3986c6-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.887568 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5587f99f7b-qj9dj"] Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.892037 4917 scope.go:117] "RemoveContainer" containerID="e928e0f57982db1ec1486f972001647c43daaf622da5dba2ab54070f700afb60" Mar 18 07:07:59 crc kubenswrapper[4917]: I0318 07:07:59.894746 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5587f99f7b-qj9dj"] Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.131989 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563628-44zvp"] Mar 18 07:08:00 crc kubenswrapper[4917]: E0318 07:08:00.132365 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerName="neutron-httpd" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.132379 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerName="neutron-httpd" Mar 18 07:08:00 crc kubenswrapper[4917]: E0318 07:08:00.132403 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerName="neutron-api" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.132413 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerName="neutron-api" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.132641 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerName="neutron-httpd" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.132665 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" containerName="neutron-api" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.133282 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563628-44zvp" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.136858 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.137057 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.137282 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.152348 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563628-44zvp"] Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.285750 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c9n6\" (UniqueName: \"kubernetes.io/projected/87c2b17b-d754-406a-98e6-466bf76cf5a0-kube-api-access-4c9n6\") pod \"auto-csr-approver-29563628-44zvp\" (UID: \"87c2b17b-d754-406a-98e6-466bf76cf5a0\") " pod="openshift-infra/auto-csr-approver-29563628-44zvp" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.387733 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c9n6\" (UniqueName: \"kubernetes.io/projected/87c2b17b-d754-406a-98e6-466bf76cf5a0-kube-api-access-4c9n6\") pod \"auto-csr-approver-29563628-44zvp\" (UID: \"87c2b17b-d754-406a-98e6-466bf76cf5a0\") " pod="openshift-infra/auto-csr-approver-29563628-44zvp" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.406286 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c9n6\" (UniqueName: \"kubernetes.io/projected/87c2b17b-d754-406a-98e6-466bf76cf5a0-kube-api-access-4c9n6\") pod \"auto-csr-approver-29563628-44zvp\" (UID: \"87c2b17b-d754-406a-98e6-466bf76cf5a0\") " 
pod="openshift-infra/auto-csr-approver-29563628-44zvp" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.453746 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563628-44zvp" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.872904 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" event={"ID":"7011ea08-beba-47f5-95e3-bb13aead931a","Type":"ContainerStarted","Data":"d0540dd64f9371f312f25eb071dea28339a516150812d0bc566670670bc1cd75"} Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.874768 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerStarted","Data":"46dba83aa670356fa4004906b5fa0253b13f834d3150c8b025c8282d92229617"} Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.891168 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" podStartSLOduration=2.788947128 podStartE2EDuration="13.891147731s" podCreationTimestamp="2026-03-18 07:07:47 +0000 UTC" firstStartedPulling="2026-03-18 07:07:48.590991251 +0000 UTC m=+1253.532145965" lastFinishedPulling="2026-03-18 07:07:59.693191854 +0000 UTC m=+1264.634346568" observedRunningTime="2026-03-18 07:08:00.887399391 +0000 UTC m=+1265.828554105" watchObservedRunningTime="2026-03-18 07:08:00.891147731 +0000 UTC m=+1265.832302445" Mar 18 07:08:00 crc kubenswrapper[4917]: I0318 07:08:00.944099 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563628-44zvp"] Mar 18 07:08:01 crc kubenswrapper[4917]: I0318 07:08:01.784803 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ab5b75-3af1-4097-acb4-3dca0e3986c6" path="/var/lib/kubelet/pods/09ab5b75-3af1-4097-acb4-3dca0e3986c6/volumes" Mar 18 07:08:01 crc kubenswrapper[4917]: I0318 07:08:01.899464 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563628-44zvp" event={"ID":"87c2b17b-d754-406a-98e6-466bf76cf5a0","Type":"ContainerStarted","Data":"4f6b0ff51da2908582e193575bc3b981e50ae42ce6f87fd26b6ecea75b696149"} Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.910043 4917 generic.go:334] "Generic (PLEG): container finished" podID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerID="bec7bb8157826d333559c4ee554e54a0b97bbeb1ae1a10fa37e602246aa02d77" exitCode=0 Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.910104 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc585cd46-xw4c5" event={"ID":"c9492f8d-33b1-4ea7-9f85-0137cc2443ed","Type":"ContainerDied","Data":"bec7bb8157826d333559c4ee554e54a0b97bbeb1ae1a10fa37e602246aa02d77"} Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.910710 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7fc585cd46-xw4c5" event={"ID":"c9492f8d-33b1-4ea7-9f85-0137cc2443ed","Type":"ContainerDied","Data":"459b3361d8db3725080c2527685e226c0066d82d93763c9b404e8a525760513a"} Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.910727 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="459b3361d8db3725080c2527685e226c0066d82d93763c9b404e8a525760513a" Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.915758 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerStarted","Data":"fac743df19fc27567027fac378bd664127918b928ea3a9d066d90306e3e5e18a"} Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.915958 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="ceilometer-central-agent" containerID="cri-o://245ba55e672fdb718ef99d4f553f1eb46be5e64e2ad362be6d8902aa8556ed41" 
gracePeriod=30 Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.915998 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.916061 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="sg-core" containerID="cri-o://46dba83aa670356fa4004906b5fa0253b13f834d3150c8b025c8282d92229617" gracePeriod=30 Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.916132 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="proxy-httpd" containerID="cri-o://fac743df19fc27567027fac378bd664127918b928ea3a9d066d90306e3e5e18a" gracePeriod=30 Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.916121 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="ceilometer-notification-agent" containerID="cri-o://9b936e59ea438fb9339e8163868ab0d80ef3aa21be40b1d3519fc252bb7cee7c" gracePeriod=30 Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.948346 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:08:02 crc kubenswrapper[4917]: I0318 07:08:02.953013 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.073851532 podStartE2EDuration="13.952989533s" podCreationTimestamp="2026-03-18 07:07:49 +0000 UTC" firstStartedPulling="2026-03-18 07:07:50.635171157 +0000 UTC m=+1255.576325881" lastFinishedPulling="2026-03-18 07:08:02.514309158 +0000 UTC m=+1267.455463882" observedRunningTime="2026-03-18 07:08:02.942737235 +0000 UTC m=+1267.883891979" watchObservedRunningTime="2026-03-18 07:08:02.952989533 +0000 UTC m=+1267.894144277" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.048935 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-internal-tls-certs\") pod \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.048973 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-scripts\") pod \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.049038 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-combined-ca-bundle\") pod \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.049088 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm7dq\" (UniqueName: 
\"kubernetes.io/projected/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-kube-api-access-xm7dq\") pod \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.049147 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-config-data\") pod \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.049167 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-public-tls-certs\") pod \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.049214 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-logs\") pod \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.049845 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-logs" (OuterVolumeSpecName: "logs") pod "c9492f8d-33b1-4ea7-9f85-0137cc2443ed" (UID: "c9492f8d-33b1-4ea7-9f85-0137cc2443ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.055429 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-kube-api-access-xm7dq" (OuterVolumeSpecName: "kube-api-access-xm7dq") pod "c9492f8d-33b1-4ea7-9f85-0137cc2443ed" (UID: "c9492f8d-33b1-4ea7-9f85-0137cc2443ed"). 
InnerVolumeSpecName "kube-api-access-xm7dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.056302 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-scripts" (OuterVolumeSpecName: "scripts") pod "c9492f8d-33b1-4ea7-9f85-0137cc2443ed" (UID: "c9492f8d-33b1-4ea7-9f85-0137cc2443ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.109345 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-config-data" (OuterVolumeSpecName: "config-data") pod "c9492f8d-33b1-4ea7-9f85-0137cc2443ed" (UID: "c9492f8d-33b1-4ea7-9f85-0137cc2443ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.111030 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9492f8d-33b1-4ea7-9f85-0137cc2443ed" (UID: "c9492f8d-33b1-4ea7-9f85-0137cc2443ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.150957 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9492f8d-33b1-4ea7-9f85-0137cc2443ed" (UID: "c9492f8d-33b1-4ea7-9f85-0137cc2443ed"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.151051 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-internal-tls-certs\") pod \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\" (UID: \"c9492f8d-33b1-4ea7-9f85-0137cc2443ed\") " Mar 18 07:08:03 crc kubenswrapper[4917]: W0318 07:08:03.151325 4917 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c9492f8d-33b1-4ea7-9f85-0137cc2443ed/volumes/kubernetes.io~secret/internal-tls-certs Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.151343 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9492f8d-33b1-4ea7-9f85-0137cc2443ed" (UID: "c9492f8d-33b1-4ea7-9f85-0137cc2443ed"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.151537 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.151556 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm7dq\" (UniqueName: \"kubernetes.io/projected/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-kube-api-access-xm7dq\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.151565 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.151576 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.151607 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.151618 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.181698 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c9492f8d-33b1-4ea7-9f85-0137cc2443ed" (UID: "c9492f8d-33b1-4ea7-9f85-0137cc2443ed"). 
InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.252929 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9492f8d-33b1-4ea7-9f85-0137cc2443ed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.933194 4917 generic.go:334] "Generic (PLEG): container finished" podID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerID="46dba83aa670356fa4004906b5fa0253b13f834d3150c8b025c8282d92229617" exitCode=2 Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.933236 4917 generic.go:334] "Generic (PLEG): container finished" podID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerID="9b936e59ea438fb9339e8163868ab0d80ef3aa21be40b1d3519fc252bb7cee7c" exitCode=0 Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.933246 4917 generic.go:334] "Generic (PLEG): container finished" podID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerID="245ba55e672fdb718ef99d4f553f1eb46be5e64e2ad362be6d8902aa8556ed41" exitCode=0 Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.933294 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerDied","Data":"46dba83aa670356fa4004906b5fa0253b13f834d3150c8b025c8282d92229617"} Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.933327 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerDied","Data":"9b936e59ea438fb9339e8163868ab0d80ef3aa21be40b1d3519fc252bb7cee7c"} Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.933339 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerDied","Data":"245ba55e672fdb718ef99d4f553f1eb46be5e64e2ad362be6d8902aa8556ed41"} Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.935473 4917 generic.go:334] "Generic (PLEG): container finished" podID="87c2b17b-d754-406a-98e6-466bf76cf5a0" containerID="5654a91cf8a897ae0481e23ff9270d8a66879056e36416c6479086df34c2176d" exitCode=0 Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.935566 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7fc585cd46-xw4c5" Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.937517 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563628-44zvp" event={"ID":"87c2b17b-d754-406a-98e6-466bf76cf5a0","Type":"ContainerDied","Data":"5654a91cf8a897ae0481e23ff9270d8a66879056e36416c6479086df34c2176d"} Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.979151 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7fc585cd46-xw4c5"] Mar 18 07:08:03 crc kubenswrapper[4917]: I0318 07:08:03.985739 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7fc585cd46-xw4c5"] Mar 18 07:08:05 crc kubenswrapper[4917]: I0318 07:08:05.302943 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563628-44zvp" Mar 18 07:08:05 crc kubenswrapper[4917]: I0318 07:08:05.490523 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c9n6\" (UniqueName: \"kubernetes.io/projected/87c2b17b-d754-406a-98e6-466bf76cf5a0-kube-api-access-4c9n6\") pod \"87c2b17b-d754-406a-98e6-466bf76cf5a0\" (UID: \"87c2b17b-d754-406a-98e6-466bf76cf5a0\") " Mar 18 07:08:05 crc kubenswrapper[4917]: I0318 07:08:05.496291 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c2b17b-d754-406a-98e6-466bf76cf5a0-kube-api-access-4c9n6" (OuterVolumeSpecName: "kube-api-access-4c9n6") pod "87c2b17b-d754-406a-98e6-466bf76cf5a0" (UID: "87c2b17b-d754-406a-98e6-466bf76cf5a0"). InnerVolumeSpecName "kube-api-access-4c9n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:05 crc kubenswrapper[4917]: I0318 07:08:05.592603 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c9n6\" (UniqueName: \"kubernetes.io/projected/87c2b17b-d754-406a-98e6-466bf76cf5a0-kube-api-access-4c9n6\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:05 crc kubenswrapper[4917]: I0318 07:08:05.794806 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" path="/var/lib/kubelet/pods/c9492f8d-33b1-4ea7-9f85-0137cc2443ed/volumes" Mar 18 07:08:05 crc kubenswrapper[4917]: E0318 07:08:05.856949 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87c2b17b_d754_406a_98e6_466bf76cf5a0.slice/crio-4f6b0ff51da2908582e193575bc3b981e50ae42ce6f87fd26b6ecea75b696149\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87c2b17b_d754_406a_98e6_466bf76cf5a0.slice\": RecentStats: unable to find data 
in memory cache]" Mar 18 07:08:05 crc kubenswrapper[4917]: I0318 07:08:05.958104 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563628-44zvp" event={"ID":"87c2b17b-d754-406a-98e6-466bf76cf5a0","Type":"ContainerDied","Data":"4f6b0ff51da2908582e193575bc3b981e50ae42ce6f87fd26b6ecea75b696149"} Mar 18 07:08:05 crc kubenswrapper[4917]: I0318 07:08:05.958165 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f6b0ff51da2908582e193575bc3b981e50ae42ce6f87fd26b6ecea75b696149" Mar 18 07:08:05 crc kubenswrapper[4917]: I0318 07:08:05.958222 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563628-44zvp" Mar 18 07:08:06 crc kubenswrapper[4917]: I0318 07:08:06.399848 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563622-qxl74"] Mar 18 07:08:06 crc kubenswrapper[4917]: I0318 07:08:06.409415 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563622-qxl74"] Mar 18 07:08:07 crc kubenswrapper[4917]: I0318 07:08:07.791024 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bdcf478-11b5-4797-8f30-ba7446642b04" path="/var/lib/kubelet/pods/1bdcf478-11b5-4797-8f30-ba7446642b04/volumes" Mar 18 07:08:09 crc kubenswrapper[4917]: I0318 07:08:09.999011 4917 generic.go:334] "Generic (PLEG): container finished" podID="7011ea08-beba-47f5-95e3-bb13aead931a" containerID="d0540dd64f9371f312f25eb071dea28339a516150812d0bc566670670bc1cd75" exitCode=0 Mar 18 07:08:09 crc kubenswrapper[4917]: I0318 07:08:09.999136 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" event={"ID":"7011ea08-beba-47f5-95e3-bb13aead931a","Type":"ContainerDied","Data":"d0540dd64f9371f312f25eb071dea28339a516150812d0bc566670670bc1cd75"} Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.385936 4917 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.413191 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-config-data\") pod \"7011ea08-beba-47f5-95e3-bb13aead931a\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.413258 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9gqp\" (UniqueName: \"kubernetes.io/projected/7011ea08-beba-47f5-95e3-bb13aead931a-kube-api-access-t9gqp\") pod \"7011ea08-beba-47f5-95e3-bb13aead931a\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.413375 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-scripts\") pod \"7011ea08-beba-47f5-95e3-bb13aead931a\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.413457 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-combined-ca-bundle\") pod \"7011ea08-beba-47f5-95e3-bb13aead931a\" (UID: \"7011ea08-beba-47f5-95e3-bb13aead931a\") " Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.422937 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-scripts" (OuterVolumeSpecName: "scripts") pod "7011ea08-beba-47f5-95e3-bb13aead931a" (UID: "7011ea08-beba-47f5-95e3-bb13aead931a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.429709 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7011ea08-beba-47f5-95e3-bb13aead931a-kube-api-access-t9gqp" (OuterVolumeSpecName: "kube-api-access-t9gqp") pod "7011ea08-beba-47f5-95e3-bb13aead931a" (UID: "7011ea08-beba-47f5-95e3-bb13aead931a"). InnerVolumeSpecName "kube-api-access-t9gqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.440774 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7011ea08-beba-47f5-95e3-bb13aead931a" (UID: "7011ea08-beba-47f5-95e3-bb13aead931a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.447792 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-config-data" (OuterVolumeSpecName: "config-data") pod "7011ea08-beba-47f5-95e3-bb13aead931a" (UID: "7011ea08-beba-47f5-95e3-bb13aead931a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.515287 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.515324 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.515338 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9gqp\" (UniqueName: \"kubernetes.io/projected/7011ea08-beba-47f5-95e3-bb13aead931a-kube-api-access-t9gqp\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.515349 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7011ea08-beba-47f5-95e3-bb13aead931a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:11 crc kubenswrapper[4917]: I0318 07:08:11.903932 4917 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3519edfd-1fd3-415a-913a-71cd289d524a"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3519edfd-1fd3-415a-913a-71cd289d524a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3519edfd_1fd3_415a_913a_71cd289d524a.slice" Mar 18 07:08:11 crc kubenswrapper[4917]: E0318 07:08:11.903989 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3519edfd-1fd3-415a-913a-71cd289d524a] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod3519edfd-1fd3-415a-913a-71cd289d524a] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3519edfd_1fd3_415a_913a_71cd289d524a.slice" 
pod="openstack/glance-default-internal-api-0" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.022014 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.022019 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.021994 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-qj2wp" event={"ID":"7011ea08-beba-47f5-95e3-bb13aead931a","Type":"ContainerDied","Data":"53a6304199a918f6a64c77d1d74bf49b67a5f010bf995f8b8cad4854f8b29978"} Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.022290 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a6304199a918f6a64c77d1d74bf49b67a5f010bf995f8b8cad4854f8b29978" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.065084 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.089186 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.159836 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:08:12 crc kubenswrapper[4917]: E0318 07:08:12.160488 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7011ea08-beba-47f5-95e3-bb13aead931a" containerName="nova-cell0-conductor-db-sync" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.160506 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7011ea08-beba-47f5-95e3-bb13aead931a" containerName="nova-cell0-conductor-db-sync" Mar 18 07:08:12 crc kubenswrapper[4917]: E0318 07:08:12.160609 4917 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerName="placement-api" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.160620 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerName="placement-api" Mar 18 07:08:12 crc kubenswrapper[4917]: E0318 07:08:12.160647 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c2b17b-d754-406a-98e6-466bf76cf5a0" containerName="oc" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.160656 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c2b17b-d754-406a-98e6-466bf76cf5a0" containerName="oc" Mar 18 07:08:12 crc kubenswrapper[4917]: E0318 07:08:12.160668 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerName="placement-log" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.160675 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerName="placement-log" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.161033 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c2b17b-d754-406a-98e6-466bf76cf5a0" containerName="oc" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.161056 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerName="placement-log" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.161073 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7011ea08-beba-47f5-95e3-bb13aead931a" containerName="nova-cell0-conductor-db-sync" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.161091 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9492f8d-33b1-4ea7-9f85-0137cc2443ed" containerName="placement-api" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.166154 4917 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.168436 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.168511 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.194821 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.208075 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.209171 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.211551 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.211883 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7kdkf" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.230597 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.230658 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.230700 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.230755 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.230830 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.230862 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.230938 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.230998 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29ln5\" (UniqueName: \"kubernetes.io/projected/71bf8cc3-5674-418a-a126-f43e3d2f092d-kube-api-access-29ln5\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.231021 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-logs\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.231043 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.231096 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlb5g\" (UniqueName: \"kubernetes.io/projected/42cfbb53-8521-4c39-98ee-2d666b5682d3-kube-api-access-wlb5g\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.232786 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332445 4917 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332785 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332810 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332843 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332877 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29ln5\" (UniqueName: \"kubernetes.io/projected/71bf8cc3-5674-418a-a126-f43e3d2f092d-kube-api-access-29ln5\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332893 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-logs\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332909 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332942 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlb5g\" (UniqueName: \"kubernetes.io/projected/42cfbb53-8521-4c39-98ee-2d666b5682d3-kube-api-access-wlb5g\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.332976 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.333000 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.333027 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.333522 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.333564 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-logs\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.333656 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.336735 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.337147 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.337814 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.338389 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.339406 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.354542 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.354835 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29ln5\" (UniqueName: \"kubernetes.io/projected/71bf8cc3-5674-418a-a126-f43e3d2f092d-kube-api-access-29ln5\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.359553 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlb5g\" (UniqueName: \"kubernetes.io/projected/42cfbb53-8521-4c39-98ee-2d666b5682d3-kube-api-access-wlb5g\") pod \"nova-cell0-conductor-0\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.365479 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.493072 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:12 crc kubenswrapper[4917]: I0318 07:08:12.528098 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:13 crc kubenswrapper[4917]: W0318 07:08:13.081149 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42cfbb53_8521_4c39_98ee_2d666b5682d3.slice/crio-a2d4cc981f3c655f89ab55a6561adaa9b77a2bed3671e51864801f4be995d8d3 WatchSource:0}: Error finding container a2d4cc981f3c655f89ab55a6561adaa9b77a2bed3671e51864801f4be995d8d3: Status 404 returned error can't find the container with id a2d4cc981f3c655f89ab55a6561adaa9b77a2bed3671e51864801f4be995d8d3 Mar 18 07:08:13 crc kubenswrapper[4917]: I0318 07:08:13.086885 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 07:08:13 crc kubenswrapper[4917]: W0318 07:08:13.165655 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71bf8cc3_5674_418a_a126_f43e3d2f092d.slice/crio-991692db54759527ea3b256f066d63b3660277c203f3b5066206bdec96242ce9 WatchSource:0}: Error finding container 991692db54759527ea3b256f066d63b3660277c203f3b5066206bdec96242ce9: Status 404 returned error can't find the container with id 991692db54759527ea3b256f066d63b3660277c203f3b5066206bdec96242ce9 Mar 18 07:08:13 crc kubenswrapper[4917]: I0318 07:08:13.171652 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:08:13 crc kubenswrapper[4917]: I0318 07:08:13.784573 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3519edfd-1fd3-415a-913a-71cd289d524a" path="/var/lib/kubelet/pods/3519edfd-1fd3-415a-913a-71cd289d524a/volumes" Mar 18 07:08:14 crc kubenswrapper[4917]: I0318 07:08:14.047142 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"71bf8cc3-5674-418a-a126-f43e3d2f092d","Type":"ContainerStarted","Data":"a57908c5b95dcdf307be2044978ccc9678a0b22f5e60956f8fbd25abaea6d4fe"} Mar 18 07:08:14 crc kubenswrapper[4917]: I0318 07:08:14.047184 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71bf8cc3-5674-418a-a126-f43e3d2f092d","Type":"ContainerStarted","Data":"991692db54759527ea3b256f066d63b3660277c203f3b5066206bdec96242ce9"} Mar 18 07:08:14 crc kubenswrapper[4917]: I0318 07:08:14.049556 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"42cfbb53-8521-4c39-98ee-2d666b5682d3","Type":"ContainerStarted","Data":"ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8"} Mar 18 07:08:14 crc kubenswrapper[4917]: I0318 07:08:14.049639 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"42cfbb53-8521-4c39-98ee-2d666b5682d3","Type":"ContainerStarted","Data":"a2d4cc981f3c655f89ab55a6561adaa9b77a2bed3671e51864801f4be995d8d3"} Mar 18 07:08:14 crc kubenswrapper[4917]: I0318 07:08:14.049743 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:14 crc kubenswrapper[4917]: I0318 07:08:14.070518 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.070498545 podStartE2EDuration="2.070498545s" podCreationTimestamp="2026-03-18 07:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:14.067548784 +0000 UTC m=+1279.008703498" watchObservedRunningTime="2026-03-18 07:08:14.070498545 +0000 UTC m=+1279.011653259" Mar 18 07:08:15 crc kubenswrapper[4917]: I0318 07:08:15.062143 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"71bf8cc3-5674-418a-a126-f43e3d2f092d","Type":"ContainerStarted","Data":"5cf727c6f8e456b706e1dae3227882a5fd58a35e0cebf945f9304e0df9d9673a"} Mar 18 07:08:15 crc kubenswrapper[4917]: I0318 07:08:15.096049 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.096025006 podStartE2EDuration="3.096025006s" podCreationTimestamp="2026-03-18 07:08:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:15.087919481 +0000 UTC m=+1280.029074225" watchObservedRunningTime="2026-03-18 07:08:15.096025006 +0000 UTC m=+1280.037179750" Mar 18 07:08:20 crc kubenswrapper[4917]: I0318 07:08:20.131797 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 18 07:08:22 crc kubenswrapper[4917]: I0318 07:08:22.494174 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:22 crc kubenswrapper[4917]: I0318 07:08:22.494551 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:22 crc kubenswrapper[4917]: I0318 07:08:22.548072 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:22 crc kubenswrapper[4917]: I0318 07:08:22.561665 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:22 crc kubenswrapper[4917]: I0318 07:08:22.594942 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.088414 4917 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-jfsv2"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.090035 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.092670 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.092922 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.100919 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jfsv2"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.158823 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.158868 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.255890 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-scripts\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.256572 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bjhz\" (UniqueName: \"kubernetes.io/projected/00252edd-28b8-4859-b52a-eccc6226bde2-kube-api-access-2bjhz\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc 
kubenswrapper[4917]: I0318 07:08:23.256745 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-config-data\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.256867 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.315457 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.317879 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.320916 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.353123 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.363757 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-config-data\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.363836 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.363872 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-scripts\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.363960 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bjhz\" (UniqueName: \"kubernetes.io/projected/00252edd-28b8-4859-b52a-eccc6226bde2-kube-api-access-2bjhz\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 
07:08:23.373919 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-config-data\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.382469 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-scripts\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.383105 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.406103 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bjhz\" (UniqueName: \"kubernetes.io/projected/00252edd-28b8-4859-b52a-eccc6226bde2-kube-api-access-2bjhz\") pod \"nova-cell0-cell-mapping-jfsv2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.426964 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.465292 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.465333 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx6zk\" (UniqueName: \"kubernetes.io/projected/971c9dbd-0838-4d6b-b802-21845680c727-kube-api-access-jx6zk\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.465413 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-config-data\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.490445 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.497070 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.511735 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.513283 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.518699 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.519387 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.523949 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.557305 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.567750 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-config-data\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.567814 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kssl\" (UniqueName: \"kubernetes.io/projected/63c02844-3a7d-44c5-bae9-3719173f4c48-kube-api-access-5kssl\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.567838 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx6zk\" (UniqueName: \"kubernetes.io/projected/971c9dbd-0838-4d6b-b802-21845680c727-kube-api-access-jx6zk\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.567860 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.567891 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63c02844-3a7d-44c5-bae9-3719173f4c48-logs\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.567955 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-config-data\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.567983 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.586343 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.589637 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.590873 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.603114 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-config-data\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.605534 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.605915 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.680914 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63c02844-3a7d-44c5-bae9-3719173f4c48-logs\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.681539 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/9f40bc95-334c-48d8-a06d-de4605fdb1c6-kube-api-access-4zw22\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.681659 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmnc6\" (UniqueName: \"kubernetes.io/projected/187c06ba-b0e4-46de-a7ec-ab9276e58265-kube-api-access-dmnc6\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.681780 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.681907 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-config-data\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.682001 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.682115 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.682211 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.682322 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/187c06ba-b0e4-46de-a7ec-ab9276e58265-logs\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.682494 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-config-data\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.682596 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kssl\" (UniqueName: \"kubernetes.io/projected/63c02844-3a7d-44c5-bae9-3719173f4c48-kube-api-access-5kssl\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.683260 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63c02844-3a7d-44c5-bae9-3719173f4c48-logs\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.697951 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-config-data\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.705924 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 
07:08:23.710241 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx6zk\" (UniqueName: \"kubernetes.io/projected/971c9dbd-0838-4d6b-b802-21845680c727-kube-api-access-jx6zk\") pod \"nova-scheduler-0\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.744596 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kssl\" (UniqueName: \"kubernetes.io/projected/63c02844-3a7d-44c5-bae9-3719173f4c48-kube-api-access-5kssl\") pod \"nova-metadata-0\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.797243 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/9f40bc95-334c-48d8-a06d-de4605fdb1c6-kube-api-access-4zw22\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.797293 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmnc6\" (UniqueName: \"kubernetes.io/projected/187c06ba-b0e4-46de-a7ec-ab9276e58265-kube-api-access-dmnc6\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.797335 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.797373 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-config-data\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.797403 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.797416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.797443 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187c06ba-b0e4-46de-a7ec-ab9276e58265-logs\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.836830 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187c06ba-b0e4-46de-a7ec-ab9276e58265-logs\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.848223 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.850179 4917 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-thr9v"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.851213 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.851409 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-thr9v"] Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.851491 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.863319 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.863728 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmnc6\" (UniqueName: \"kubernetes.io/projected/187c06ba-b0e4-46de-a7ec-ab9276e58265-kube-api-access-dmnc6\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.864196 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-config-data\") pod \"nova-api-0\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.883123 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.893809 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.896856 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/9f40bc95-334c-48d8-a06d-de4605fdb1c6-kube-api-access-4zw22\") pod \"nova-cell1-novncproxy-0\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.956742 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 07:08:23 crc kubenswrapper[4917]: I0318 07:08:23.970678 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-jfsv2"] Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.002464 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-swift-storage-0\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.002564 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-config\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 
07:08:24.002597 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-svc\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.002633 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-sb\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.002676 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-nb\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.002694 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98djh\" (UniqueName: \"kubernetes.io/projected/4fa0ca14-1497-499a-8c7d-5dd248b638a0-kube-api-access-98djh\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.005764 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.104393 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-sb\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.104910 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-nb\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.104944 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98djh\" (UniqueName: \"kubernetes.io/projected/4fa0ca14-1497-499a-8c7d-5dd248b638a0-kube-api-access-98djh\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.105008 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-swift-storage-0\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.105123 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-config\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " 
pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.105167 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-svc\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.105440 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-sb\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.106110 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-svc\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.106440 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-nb\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.106865 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-swift-storage-0\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 
07:08:24.107000 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-config\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.134250 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98djh\" (UniqueName: \"kubernetes.io/projected/4fa0ca14-1497-499a-8c7d-5dd248b638a0-kube-api-access-98djh\") pod \"dnsmasq-dns-84bb6d55fc-thr9v\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.180477 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jfsv2" event={"ID":"00252edd-28b8-4859-b52a-eccc6226bde2","Type":"ContainerStarted","Data":"f2204ae051b0108c105a6deac2f1bb6e0babe6e7602b7d7945df5b3b57a68048"} Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.221083 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.416916 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.466389 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w8q9f"] Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.467895 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.471343 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.471561 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.480551 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w8q9f"] Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.542150 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.577634 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.619080 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-scripts\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.619179 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.619245 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-config-data\") pod 
\"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.619447 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k7nf\" (UniqueName: \"kubernetes.io/projected/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-kube-api-access-7k7nf\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.632788 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.721318 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k7nf\" (UniqueName: \"kubernetes.io/projected/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-kube-api-access-7k7nf\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.721405 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-scripts\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.721494 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 
07:08:24.721572 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-config-data\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.727675 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-config-data\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.728633 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.730892 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-scripts\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.738289 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k7nf\" (UniqueName: \"kubernetes.io/projected/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-kube-api-access-7k7nf\") pod \"nova-cell1-conductor-db-sync-w8q9f\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.825460 4917 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:24 crc kubenswrapper[4917]: I0318 07:08:24.843744 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-thr9v"] Mar 18 07:08:24 crc kubenswrapper[4917]: W0318 07:08:24.859959 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fa0ca14_1497_499a_8c7d_5dd248b638a0.slice/crio-d537e77dd7f45e6d3ff37a870195702a86407bfe137358daee71166a94289af6 WatchSource:0}: Error finding container d537e77dd7f45e6d3ff37a870195702a86407bfe137358daee71166a94289af6: Status 404 returned error can't find the container with id d537e77dd7f45e6d3ff37a870195702a86407bfe137358daee71166a94289af6 Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.200837 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63c02844-3a7d-44c5-bae9-3719173f4c48","Type":"ContainerStarted","Data":"4fb0e54228036dc4b93ac35c4ed568a940fe435352a77cffbc2c4fa1e2053c08"} Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.202158 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"971c9dbd-0838-4d6b-b802-21845680c727","Type":"ContainerStarted","Data":"58152cf58464293738777542db8eb140ab02c896d0d87badab03b78feba6365b"} Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.204448 4917 generic.go:334] "Generic (PLEG): container finished" podID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" containerID="060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69" exitCode=0 Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.204516 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" event={"ID":"4fa0ca14-1497-499a-8c7d-5dd248b638a0","Type":"ContainerDied","Data":"060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69"} Mar 18 07:08:25 
crc kubenswrapper[4917]: I0318 07:08:25.204543 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" event={"ID":"4fa0ca14-1497-499a-8c7d-5dd248b638a0","Type":"ContainerStarted","Data":"d537e77dd7f45e6d3ff37a870195702a86407bfe137358daee71166a94289af6"} Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.216374 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"187c06ba-b0e4-46de-a7ec-ab9276e58265","Type":"ContainerStarted","Data":"710c586d87aa3b8cfb4742d27191b01482ba49d4b99d50088c5f6b97ab0950e7"} Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.226845 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9f40bc95-334c-48d8-a06d-de4605fdb1c6","Type":"ContainerStarted","Data":"02ecf9925ac6c137af8dc03c33d1183bf652a5e4b0a5087b431c4bf3f957c666"} Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.245607 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jfsv2" event={"ID":"00252edd-28b8-4859-b52a-eccc6226bde2","Type":"ContainerStarted","Data":"8e200a7db258824eecca5378a51a1a017254e035828c262f2cbb5e58066808d0"} Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.265518 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-jfsv2" podStartSLOduration=2.265499236 podStartE2EDuration="2.265499236s" podCreationTimestamp="2026-03-18 07:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:25.257751339 +0000 UTC m=+1290.198906053" watchObservedRunningTime="2026-03-18 07:08:25.265499236 +0000 UTC m=+1290.206653950" Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.330668 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w8q9f"] Mar 18 07:08:25 crc 
kubenswrapper[4917]: I0318 07:08:25.658494 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.658991 4917 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 18 07:08:25 crc kubenswrapper[4917]: I0318 07:08:25.743156 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 07:08:26 crc kubenswrapper[4917]: I0318 07:08:26.271547 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" event={"ID":"4fa0ca14-1497-499a-8c7d-5dd248b638a0","Type":"ContainerStarted","Data":"3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3"} Mar 18 07:08:26 crc kubenswrapper[4917]: I0318 07:08:26.271701 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:26 crc kubenswrapper[4917]: I0318 07:08:26.277969 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" event={"ID":"a8af5f47-3cc1-4d7f-a99f-0eae80c27416","Type":"ContainerStarted","Data":"2cbe01ea23a91faf50474f6bd4a4594d76a5f55fad33ee844a2f70a508be810a"} Mar 18 07:08:26 crc kubenswrapper[4917]: I0318 07:08:26.278012 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" event={"ID":"a8af5f47-3cc1-4d7f-a99f-0eae80c27416","Type":"ContainerStarted","Data":"ec22a21adfc80a69c27219cc9082eab2c4ea9beb0baf728ce2de2923b47ad8d4"} Mar 18 07:08:26 crc kubenswrapper[4917]: I0318 07:08:26.301555 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" podStartSLOduration=3.30153189 podStartE2EDuration="3.30153189s" podCreationTimestamp="2026-03-18 07:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:26.29781042 +0000 UTC m=+1291.238965134" watchObservedRunningTime="2026-03-18 07:08:26.30153189 +0000 UTC m=+1291.242686604" Mar 18 07:08:26 crc kubenswrapper[4917]: I0318 07:08:26.320357 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" podStartSLOduration=2.320340573 podStartE2EDuration="2.320340573s" podCreationTimestamp="2026-03-18 07:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:26.316852268 +0000 UTC m=+1291.258006992" watchObservedRunningTime="2026-03-18 07:08:26.320340573 +0000 UTC m=+1291.261495287" Mar 18 07:08:27 crc kubenswrapper[4917]: I0318 07:08:27.394418 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:08:27 crc kubenswrapper[4917]: I0318 07:08:27.405268 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:28 crc kubenswrapper[4917]: I0318 07:08:28.301244 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"971c9dbd-0838-4d6b-b802-21845680c727","Type":"ContainerStarted","Data":"ca6ed31aebb315b90b7a769b73bcd1538d54c0d4691d586e5092e562eca73107"} Mar 18 07:08:28 crc kubenswrapper[4917]: I0318 07:08:28.305222 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"187c06ba-b0e4-46de-a7ec-ab9276e58265","Type":"ContainerStarted","Data":"d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303"} Mar 18 07:08:28 crc kubenswrapper[4917]: I0318 07:08:28.308493 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"9f40bc95-334c-48d8-a06d-de4605fdb1c6","Type":"ContainerStarted","Data":"5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71"} Mar 18 07:08:28 crc kubenswrapper[4917]: I0318 07:08:28.308703 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9f40bc95-334c-48d8-a06d-de4605fdb1c6" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71" gracePeriod=30 Mar 18 07:08:28 crc kubenswrapper[4917]: I0318 07:08:28.311455 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63c02844-3a7d-44c5-bae9-3719173f4c48","Type":"ContainerStarted","Data":"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1"} Mar 18 07:08:28 crc kubenswrapper[4917]: I0318 07:08:28.320266 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.097556806 podStartE2EDuration="5.320248511s" podCreationTimestamp="2026-03-18 07:08:23 +0000 UTC" firstStartedPulling="2026-03-18 07:08:24.594095591 +0000 UTC m=+1289.535250305" lastFinishedPulling="2026-03-18 07:08:27.816787286 +0000 UTC m=+1292.757942010" observedRunningTime="2026-03-18 07:08:28.316526792 +0000 UTC m=+1293.257681506" watchObservedRunningTime="2026-03-18 07:08:28.320248511 +0000 UTC m=+1293.261403225" Mar 18 07:08:28 crc kubenswrapper[4917]: I0318 07:08:28.956856 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.010576 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.329071 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"187c06ba-b0e4-46de-a7ec-ab9276e58265","Type":"ContainerStarted","Data":"7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00"} Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.335950 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerName="nova-metadata-log" containerID="cri-o://533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1" gracePeriod=30 Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.336452 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63c02844-3a7d-44c5-bae9-3719173f4c48","Type":"ContainerStarted","Data":"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e"} Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.336574 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerName="nova-metadata-metadata" containerID="cri-o://9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e" gracePeriod=30 Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.367382 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.095558734 podStartE2EDuration="6.367349772s" podCreationTimestamp="2026-03-18 07:08:23 +0000 UTC" firstStartedPulling="2026-03-18 07:08:24.549851884 +0000 UTC m=+1289.491006588" lastFinishedPulling="2026-03-18 07:08:27.821642892 +0000 UTC m=+1292.762797626" observedRunningTime="2026-03-18 07:08:29.359240506 +0000 UTC m=+1294.300395220" watchObservedRunningTime="2026-03-18 07:08:29.367349772 +0000 UTC m=+1294.308504546" Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.378851 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.242182578 
podStartE2EDuration="6.378829569s" podCreationTimestamp="2026-03-18 07:08:23 +0000 UTC" firstStartedPulling="2026-03-18 07:08:24.674502769 +0000 UTC m=+1289.615657483" lastFinishedPulling="2026-03-18 07:08:27.81114975 +0000 UTC m=+1292.752304474" observedRunningTime="2026-03-18 07:08:28.338272196 +0000 UTC m=+1293.279426910" watchObservedRunningTime="2026-03-18 07:08:29.378829569 +0000 UTC m=+1294.319984313" Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.391310 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.008926185 podStartE2EDuration="6.391288189s" podCreationTimestamp="2026-03-18 07:08:23 +0000 UTC" firstStartedPulling="2026-03-18 07:08:24.42772172 +0000 UTC m=+1289.368876434" lastFinishedPulling="2026-03-18 07:08:27.810083724 +0000 UTC m=+1292.751238438" observedRunningTime="2026-03-18 07:08:29.391112545 +0000 UTC m=+1294.332267269" watchObservedRunningTime="2026-03-18 07:08:29.391288189 +0000 UTC m=+1294.332442913" Mar 18 07:08:29 crc kubenswrapper[4917]: I0318 07:08:29.921008 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.053854 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-combined-ca-bundle\") pod \"63c02844-3a7d-44c5-bae9-3719173f4c48\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.053990 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-config-data\") pod \"63c02844-3a7d-44c5-bae9-3719173f4c48\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.054016 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kssl\" (UniqueName: \"kubernetes.io/projected/63c02844-3a7d-44c5-bae9-3719173f4c48-kube-api-access-5kssl\") pod \"63c02844-3a7d-44c5-bae9-3719173f4c48\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.054077 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63c02844-3a7d-44c5-bae9-3719173f4c48-logs\") pod \"63c02844-3a7d-44c5-bae9-3719173f4c48\" (UID: \"63c02844-3a7d-44c5-bae9-3719173f4c48\") " Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.054686 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63c02844-3a7d-44c5-bae9-3719173f4c48-logs" (OuterVolumeSpecName: "logs") pod "63c02844-3a7d-44c5-bae9-3719173f4c48" (UID: "63c02844-3a7d-44c5-bae9-3719173f4c48"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.063185 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c02844-3a7d-44c5-bae9-3719173f4c48-kube-api-access-5kssl" (OuterVolumeSpecName: "kube-api-access-5kssl") pod "63c02844-3a7d-44c5-bae9-3719173f4c48" (UID: "63c02844-3a7d-44c5-bae9-3719173f4c48"). InnerVolumeSpecName "kube-api-access-5kssl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.088999 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63c02844-3a7d-44c5-bae9-3719173f4c48" (UID: "63c02844-3a7d-44c5-bae9-3719173f4c48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.094807 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-config-data" (OuterVolumeSpecName: "config-data") pod "63c02844-3a7d-44c5-bae9-3719173f4c48" (UID: "63c02844-3a7d-44c5-bae9-3719173f4c48"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.156615 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.156892 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63c02844-3a7d-44c5-bae9-3719173f4c48-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.157082 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kssl\" (UniqueName: \"kubernetes.io/projected/63c02844-3a7d-44c5-bae9-3719173f4c48-kube-api-access-5kssl\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.157219 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63c02844-3a7d-44c5-bae9-3719173f4c48-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.346082 4917 generic.go:334] "Generic (PLEG): container finished" podID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerID="9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e" exitCode=0 Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.346122 4917 generic.go:334] "Generic (PLEG): container finished" podID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerID="533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1" exitCode=143 Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.346124 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63c02844-3a7d-44c5-bae9-3719173f4c48","Type":"ContainerDied","Data":"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e"} Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.346161 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63c02844-3a7d-44c5-bae9-3719173f4c48","Type":"ContainerDied","Data":"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1"} Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.346176 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"63c02844-3a7d-44c5-bae9-3719173f4c48","Type":"ContainerDied","Data":"4fb0e54228036dc4b93ac35c4ed568a940fe435352a77cffbc2c4fa1e2053c08"} Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.346194 4917 scope.go:117] "RemoveContainer" containerID="9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.347504 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.375985 4917 scope.go:117] "RemoveContainer" containerID="533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.425712 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.428344 4917 scope.go:117] "RemoveContainer" containerID="9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e" Mar 18 07:08:30 crc kubenswrapper[4917]: E0318 07:08:30.431110 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e\": container with ID starting with 9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e not found: ID does not exist" containerID="9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.431188 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e"} err="failed to get container status \"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e\": rpc error: code = NotFound desc = could not find container \"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e\": container with ID starting with 9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e not found: ID does not exist" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.431240 4917 scope.go:117] "RemoveContainer" containerID="533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1" Mar 18 07:08:30 crc kubenswrapper[4917]: E0318 07:08:30.434079 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1\": container with ID starting with 533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1 not found: ID does not exist" containerID="533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.434148 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1"} err="failed to get container status \"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1\": rpc error: code = NotFound desc = could not find container \"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1\": container with ID starting with 533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1 not found: ID does not exist" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.434190 4917 scope.go:117] "RemoveContainer" containerID="9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.440064 4917 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e"} err="failed to get container status \"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e\": rpc error: code = NotFound desc = could not find container \"9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e\": container with ID starting with 9b99630903ee85418db3d873952ae9c3963bf8a3428243b3ae0d524c4f4e792e not found: ID does not exist" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.440120 4917 scope.go:117] "RemoveContainer" containerID="533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.440614 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1"} err="failed to get container status \"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1\": rpc error: code = NotFound desc = could not find container \"533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1\": container with ID starting with 533271a820652e5f97c10a368954c30a26b397ee8b2554a4cb955de2721a7dc1 not found: ID does not exist" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.448303 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.464894 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:30 crc kubenswrapper[4917]: E0318 07:08:30.465294 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerName="nova-metadata-log" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.465308 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerName="nova-metadata-log" Mar 18 07:08:30 crc kubenswrapper[4917]: E0318 
07:08:30.465340 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerName="nova-metadata-metadata" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.465346 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerName="nova-metadata-metadata" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.465525 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerName="nova-metadata-metadata" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.465533 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" containerName="nova-metadata-log" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.466475 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.475345 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.486558 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.486560 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.564700 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4929ca54-e5ca-49c0-9254-2709a9d17782-logs\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.564849 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-config-data\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.564927 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.565311 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmg78\" (UniqueName: \"kubernetes.io/projected/4929ca54-e5ca-49c0-9254-2709a9d17782-kube-api-access-nmg78\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.565379 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.666778 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmg78\" (UniqueName: \"kubernetes.io/projected/4929ca54-e5ca-49c0-9254-2709a9d17782-kube-api-access-nmg78\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.666841 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.666933 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4929ca54-e5ca-49c0-9254-2709a9d17782-logs\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.666984 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-config-data\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.667026 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.667527 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4929ca54-e5ca-49c0-9254-2709a9d17782-logs\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.673358 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc 
kubenswrapper[4917]: I0318 07:08:30.675243 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-config-data\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.676835 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.687280 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmg78\" (UniqueName: \"kubernetes.io/projected/4929ca54-e5ca-49c0-9254-2709a9d17782-kube-api-access-nmg78\") pod \"nova-metadata-0\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " pod="openstack/nova-metadata-0" Mar 18 07:08:30 crc kubenswrapper[4917]: I0318 07:08:30.821433 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:31 crc kubenswrapper[4917]: I0318 07:08:31.290450 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:31 crc kubenswrapper[4917]: I0318 07:08:31.357016 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4929ca54-e5ca-49c0-9254-2709a9d17782","Type":"ContainerStarted","Data":"b7cf1d42c0ca26a1070194006a63f8bb18f392a97c69926faa5ec44f545a992b"} Mar 18 07:08:31 crc kubenswrapper[4917]: I0318 07:08:31.785685 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c02844-3a7d-44c5-bae9-3719173f4c48" path="/var/lib/kubelet/pods/63c02844-3a7d-44c5-bae9-3719173f4c48/volumes" Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.370229 4917 generic.go:334] "Generic (PLEG): container finished" podID="00252edd-28b8-4859-b52a-eccc6226bde2" containerID="8e200a7db258824eecca5378a51a1a017254e035828c262f2cbb5e58066808d0" exitCode=0 Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.370303 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jfsv2" event={"ID":"00252edd-28b8-4859-b52a-eccc6226bde2","Type":"ContainerDied","Data":"8e200a7db258824eecca5378a51a1a017254e035828c262f2cbb5e58066808d0"} Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.372771 4917 generic.go:334] "Generic (PLEG): container finished" podID="a8af5f47-3cc1-4d7f-a99f-0eae80c27416" containerID="2cbe01ea23a91faf50474f6bd4a4594d76a5f55fad33ee844a2f70a508be810a" exitCode=0 Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.372896 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" event={"ID":"a8af5f47-3cc1-4d7f-a99f-0eae80c27416","Type":"ContainerDied","Data":"2cbe01ea23a91faf50474f6bd4a4594d76a5f55fad33ee844a2f70a508be810a"} Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.376514 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4929ca54-e5ca-49c0-9254-2709a9d17782","Type":"ContainerStarted","Data":"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605"} Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.376660 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4929ca54-e5ca-49c0-9254-2709a9d17782","Type":"ContainerStarted","Data":"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b"} Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.454842 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.454821157 podStartE2EDuration="2.454821157s" podCreationTimestamp="2026-03-18 07:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:32.454025738 +0000 UTC m=+1297.395180462" watchObservedRunningTime="2026-03-18 07:08:32.454821157 +0000 UTC m=+1297.395975881" Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.929863 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:08:32 crc kubenswrapper[4917]: I0318 07:08:32.930289 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.389870 4917 generic.go:334] "Generic (PLEG): container finished" podID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" 
containerID="fac743df19fc27567027fac378bd664127918b928ea3a9d066d90306e3e5e18a" exitCode=137 Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.389938 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerDied","Data":"fac743df19fc27567027fac378bd664127918b928ea3a9d066d90306e3e5e18a"} Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.389992 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc","Type":"ContainerDied","Data":"9a3b6dd61bc6f9c07b9d11bd1976764e1ef627c91eb2398adf0bef287cc7ea75"} Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.390006 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a3b6dd61bc6f9c07b9d11bd1976764e1ef627c91eb2398adf0bef287cc7ea75" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.402664 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.525980 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-config-data\") pod \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.526058 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-sg-core-conf-yaml\") pod \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.526150 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-combined-ca-bundle\") pod \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.526192 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-log-httpd\") pod \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.526250 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n9cq\" (UniqueName: \"kubernetes.io/projected/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-kube-api-access-2n9cq\") pod \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.526308 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-scripts\") pod \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.526367 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-run-httpd\") pod \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\" (UID: \"2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.528153 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" (UID: "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.528363 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" (UID: "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.534178 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-scripts" (OuterVolumeSpecName: "scripts") pod "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" (UID: "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.538142 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-kube-api-access-2n9cq" (OuterVolumeSpecName: "kube-api-access-2n9cq") pod "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" (UID: "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc"). InnerVolumeSpecName "kube-api-access-2n9cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.579230 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" (UID: "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.630038 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.630121 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.630136 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.630148 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n9cq\" (UniqueName: \"kubernetes.io/projected/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-kube-api-access-2n9cq\") on 
node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.630159 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.633748 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" (UID: "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.648785 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-config-data" (OuterVolumeSpecName: "config-data") pod "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" (UID: "2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.735007 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.735049 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.749070 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.809047 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.836564 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-combined-ca-bundle\") pod \"00252edd-28b8-4859-b52a-eccc6226bde2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.836870 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-config-data\") pod \"00252edd-28b8-4859-b52a-eccc6226bde2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.837064 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bjhz\" (UniqueName: \"kubernetes.io/projected/00252edd-28b8-4859-b52a-eccc6226bde2-kube-api-access-2bjhz\") pod \"00252edd-28b8-4859-b52a-eccc6226bde2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.837214 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-scripts\") pod \"00252edd-28b8-4859-b52a-eccc6226bde2\" (UID: \"00252edd-28b8-4859-b52a-eccc6226bde2\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.841032 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-scripts" (OuterVolumeSpecName: "scripts") pod "00252edd-28b8-4859-b52a-eccc6226bde2" (UID: "00252edd-28b8-4859-b52a-eccc6226bde2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.841234 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00252edd-28b8-4859-b52a-eccc6226bde2-kube-api-access-2bjhz" (OuterVolumeSpecName: "kube-api-access-2bjhz") pod "00252edd-28b8-4859-b52a-eccc6226bde2" (UID: "00252edd-28b8-4859-b52a-eccc6226bde2"). InnerVolumeSpecName "kube-api-access-2bjhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.864776 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-config-data" (OuterVolumeSpecName: "config-data") pod "00252edd-28b8-4859-b52a-eccc6226bde2" (UID: "00252edd-28b8-4859-b52a-eccc6226bde2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.867539 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00252edd-28b8-4859-b52a-eccc6226bde2" (UID: "00252edd-28b8-4859-b52a-eccc6226bde2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.884835 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.884887 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.938771 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-config-data\") pod \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.938893 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-scripts\") pod \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.938934 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k7nf\" (UniqueName: \"kubernetes.io/projected/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-kube-api-access-7k7nf\") pod \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.939069 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-combined-ca-bundle\") pod \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\" (UID: \"a8af5f47-3cc1-4d7f-a99f-0eae80c27416\") " Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.939550 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bjhz\" (UniqueName: 
\"kubernetes.io/projected/00252edd-28b8-4859-b52a-eccc6226bde2-kube-api-access-2bjhz\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.939572 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.939600 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.939612 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00252edd-28b8-4859-b52a-eccc6226bde2-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.941881 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-kube-api-access-7k7nf" (OuterVolumeSpecName: "kube-api-access-7k7nf") pod "a8af5f47-3cc1-4d7f-a99f-0eae80c27416" (UID: "a8af5f47-3cc1-4d7f-a99f-0eae80c27416"). InnerVolumeSpecName "kube-api-access-7k7nf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.943133 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-scripts" (OuterVolumeSpecName: "scripts") pod "a8af5f47-3cc1-4d7f-a99f-0eae80c27416" (UID: "a8af5f47-3cc1-4d7f-a99f-0eae80c27416"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.957411 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.968004 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8af5f47-3cc1-4d7f-a99f-0eae80c27416" (UID: "a8af5f47-3cc1-4d7f-a99f-0eae80c27416"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:33 crc kubenswrapper[4917]: I0318 07:08:33.993444 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-config-data" (OuterVolumeSpecName: "config-data") pod "a8af5f47-3cc1-4d7f-a99f-0eae80c27416" (UID: "a8af5f47-3cc1-4d7f-a99f-0eae80c27416"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.010769 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.041481 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.041878 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.042044 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k7nf\" (UniqueName: \"kubernetes.io/projected/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-kube-api-access-7k7nf\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.042192 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8af5f47-3cc1-4d7f-a99f-0eae80c27416-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.222820 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.300083 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-cjnd8"] Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.300320 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" podUID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" containerName="dnsmasq-dns" containerID="cri-o://01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe" gracePeriod=10 Mar 18 
07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.406827 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" event={"ID":"a8af5f47-3cc1-4d7f-a99f-0eae80c27416","Type":"ContainerDied","Data":"ec22a21adfc80a69c27219cc9082eab2c4ea9beb0baf728ce2de2923b47ad8d4"} Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.406896 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec22a21adfc80a69c27219cc9082eab2c4ea9beb0baf728ce2de2923b47ad8d4" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.406992 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w8q9f" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.412821 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.414375 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-jfsv2" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.422696 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-jfsv2" event={"ID":"00252edd-28b8-4859-b52a-eccc6226bde2","Type":"ContainerDied","Data":"f2204ae051b0108c105a6deac2f1bb6e0babe6e7602b7d7945df5b3b57a68048"} Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.422745 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2204ae051b0108c105a6deac2f1bb6e0babe6e7602b7d7945df5b3b57a68048" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.486941 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.588643 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.608992 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.625515 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 07:08:34 crc kubenswrapper[4917]: E0318 07:08:34.626479 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="ceilometer-notification-agent" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626507 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="ceilometer-notification-agent" Mar 18 07:08:34 crc kubenswrapper[4917]: E0318 07:08:34.626532 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00252edd-28b8-4859-b52a-eccc6226bde2" containerName="nova-manage" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626541 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="00252edd-28b8-4859-b52a-eccc6226bde2" containerName="nova-manage" Mar 18 07:08:34 crc kubenswrapper[4917]: E0318 07:08:34.626555 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8af5f47-3cc1-4d7f-a99f-0eae80c27416" containerName="nova-cell1-conductor-db-sync" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626561 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8af5f47-3cc1-4d7f-a99f-0eae80c27416" containerName="nova-cell1-conductor-db-sync" Mar 18 07:08:34 crc kubenswrapper[4917]: E0318 07:08:34.626646 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="ceilometer-central-agent" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626658 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="ceilometer-central-agent" Mar 18 07:08:34 crc kubenswrapper[4917]: E0318 07:08:34.626667 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="proxy-httpd" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626675 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="proxy-httpd" Mar 18 07:08:34 crc kubenswrapper[4917]: E0318 07:08:34.626686 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="sg-core" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626693 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="sg-core" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626917 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="00252edd-28b8-4859-b52a-eccc6226bde2" containerName="nova-manage" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626929 4917 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="ceilometer-notification-agent" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626945 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="ceilometer-central-agent" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626965 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="proxy-httpd" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626975 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" containerName="sg-core" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.626989 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8af5f47-3cc1-4d7f-a99f-0eae80c27416" containerName="nova-cell1-conductor-db-sync" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.627997 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.631376 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.636433 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.639411 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.643036 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.643458 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.666264 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.676336 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.686750 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w64nt\" (UniqueName: \"kubernetes.io/projected/dd4ec623-4dba-48e7-89f5-6cd3eadce847-kube-api-access-w64nt\") pod \"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.686838 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.687034 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.750474 4917 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.751168 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-log" containerID="cri-o://d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303" gracePeriod=30 Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.751872 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-api" containerID="cri-o://7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00" gracePeriod=30 Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.768836 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": EOF" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.768975 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": EOF" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.780104 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.789488 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-log-httpd\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.789576 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w64nt\" 
(UniqueName: \"kubernetes.io/projected/dd4ec623-4dba-48e7-89f5-6cd3eadce847-kube-api-access-w64nt\") pod \"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.789928 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-scripts\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.789949 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.789986 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.790012 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.790078 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.790129 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-config-data\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.790144 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-run-httpd\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.790160 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkrz6\" (UniqueName: \"kubernetes.io/projected/5a6619bc-1eae-40c1-93a8-baf26424051b-kube-api-access-pkrz6\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.790811 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerName="nova-metadata-log" containerID="cri-o://f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b" gracePeriod=30 Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.791264 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerName="nova-metadata-metadata" containerID="cri-o://23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605" gracePeriod=30 Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 
07:08:34.798389 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.802201 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.820190 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w64nt\" (UniqueName: \"kubernetes.io/projected/dd4ec623-4dba-48e7-89f5-6cd3eadce847-kube-api-access-w64nt\") pod \"nova-cell1-conductor-0\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.892078 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-scripts\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.892174 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.892231 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.892336 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-run-httpd\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.892358 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-config-data\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.892378 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkrz6\" (UniqueName: \"kubernetes.io/projected/5a6619bc-1eae-40c1-93a8-baf26424051b-kube-api-access-pkrz6\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.892419 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-log-httpd\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.893310 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-run-httpd\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.896218 
4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-log-httpd\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.903212 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.903863 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.911482 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-config-data\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.914436 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-scripts\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.916745 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkrz6\" (UniqueName: \"kubernetes.io/projected/5a6619bc-1eae-40c1-93a8-baf26424051b-kube-api-access-pkrz6\") pod \"ceilometer-0\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " 
pod="openstack/ceilometer-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.947551 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.959434 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:34 crc kubenswrapper[4917]: I0318 07:08:34.966804 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.071674 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.094604 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t6md\" (UniqueName: \"kubernetes.io/projected/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-kube-api-access-9t6md\") pod \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.094677 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-config\") pod \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.094730 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-svc\") pod \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.094790 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-sb\") pod \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.094823 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-swift-storage-0\") pod \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.094859 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-nb\") pod \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\" (UID: \"61ec9fd8-58fa-4279-a404-61d4ef4f4c32\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.107046 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-kube-api-access-9t6md" (OuterVolumeSpecName: "kube-api-access-9t6md") pod "61ec9fd8-58fa-4279-a404-61d4ef4f4c32" (UID: "61ec9fd8-58fa-4279-a404-61d4ef4f4c32"). InnerVolumeSpecName "kube-api-access-9t6md". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.168156 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "61ec9fd8-58fa-4279-a404-61d4ef4f4c32" (UID: "61ec9fd8-58fa-4279-a404-61d4ef4f4c32"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.175807 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61ec9fd8-58fa-4279-a404-61d4ef4f4c32" (UID: "61ec9fd8-58fa-4279-a404-61d4ef4f4c32"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.177121 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61ec9fd8-58fa-4279-a404-61d4ef4f4c32" (UID: "61ec9fd8-58fa-4279-a404-61d4ef4f4c32"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.197471 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.197691 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t6md\" (UniqueName: \"kubernetes.io/projected/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-kube-api-access-9t6md\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.197779 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.197837 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.198227 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "61ec9fd8-58fa-4279-a404-61d4ef4f4c32" (UID: "61ec9fd8-58fa-4279-a404-61d4ef4f4c32"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.206386 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-config" (OuterVolumeSpecName: "config") pod "61ec9fd8-58fa-4279-a404-61d4ef4f4c32" (UID: "61ec9fd8-58fa-4279-a404-61d4ef4f4c32"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.299695 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.299728 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/61ec9fd8-58fa-4279-a404-61d4ef4f4c32-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.392238 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.446735 4917 generic.go:334] "Generic (PLEG): container finished" podID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerID="23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605" exitCode=0 Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.447135 4917 generic.go:334] "Generic (PLEG): container finished" podID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerID="f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b" exitCode=143 Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.446803 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.446823 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4929ca54-e5ca-49c0-9254-2709a9d17782","Type":"ContainerDied","Data":"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605"} Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.447320 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4929ca54-e5ca-49c0-9254-2709a9d17782","Type":"ContainerDied","Data":"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b"} Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.447336 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4929ca54-e5ca-49c0-9254-2709a9d17782","Type":"ContainerDied","Data":"b7cf1d42c0ca26a1070194006a63f8bb18f392a97c69926faa5ec44f545a992b"} Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.447353 4917 scope.go:117] "RemoveContainer" containerID="23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.451676 4917 generic.go:334] "Generic (PLEG): container finished" podID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" 
containerID="01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe" exitCode=0 Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.451720 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" event={"ID":"61ec9fd8-58fa-4279-a404-61d4ef4f4c32","Type":"ContainerDied","Data":"01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe"} Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.451736 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" event={"ID":"61ec9fd8-58fa-4279-a404-61d4ef4f4c32","Type":"ContainerDied","Data":"68605f20477f94fa82fb849e360c08c87fae23b68adfdefa0e303982cfb1f7f9"} Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.451781 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-94c999df7-cjnd8" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.465094 4917 generic.go:334] "Generic (PLEG): container finished" podID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerID="d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303" exitCode=143 Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.465797 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"187c06ba-b0e4-46de-a7ec-ab9276e58265","Type":"ContainerDied","Data":"d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303"} Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.474915 4917 scope.go:117] "RemoveContainer" containerID="f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.496951 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-cjnd8"] Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.503166 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-nova-metadata-tls-certs\") pod \"4929ca54-e5ca-49c0-9254-2709a9d17782\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.503252 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-combined-ca-bundle\") pod \"4929ca54-e5ca-49c0-9254-2709a9d17782\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.503323 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4929ca54-e5ca-49c0-9254-2709a9d17782-logs\") pod \"4929ca54-e5ca-49c0-9254-2709a9d17782\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.503468 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-config-data\") pod \"4929ca54-e5ca-49c0-9254-2709a9d17782\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.503535 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmg78\" (UniqueName: \"kubernetes.io/projected/4929ca54-e5ca-49c0-9254-2709a9d17782-kube-api-access-nmg78\") pod \"4929ca54-e5ca-49c0-9254-2709a9d17782\" (UID: \"4929ca54-e5ca-49c0-9254-2709a9d17782\") " Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.504916 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-94c999df7-cjnd8"] Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.505243 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4929ca54-e5ca-49c0-9254-2709a9d17782-logs" (OuterVolumeSpecName: "logs") pod 
"4929ca54-e5ca-49c0-9254-2709a9d17782" (UID: "4929ca54-e5ca-49c0-9254-2709a9d17782"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.508727 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4929ca54-e5ca-49c0-9254-2709a9d17782-kube-api-access-nmg78" (OuterVolumeSpecName: "kube-api-access-nmg78") pod "4929ca54-e5ca-49c0-9254-2709a9d17782" (UID: "4929ca54-e5ca-49c0-9254-2709a9d17782"). InnerVolumeSpecName "kube-api-access-nmg78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.514432 4917 scope.go:117] "RemoveContainer" containerID="23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605" Mar 18 07:08:35 crc kubenswrapper[4917]: E0318 07:08:35.515047 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605\": container with ID starting with 23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605 not found: ID does not exist" containerID="23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.515079 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605"} err="failed to get container status \"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605\": rpc error: code = NotFound desc = could not find container \"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605\": container with ID starting with 23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605 not found: ID does not exist" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.515103 4917 scope.go:117] "RemoveContainer" 
containerID="f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b" Mar 18 07:08:35 crc kubenswrapper[4917]: E0318 07:08:35.515542 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b\": container with ID starting with f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b not found: ID does not exist" containerID="f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.515608 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b"} err="failed to get container status \"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b\": rpc error: code = NotFound desc = could not find container \"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b\": container with ID starting with f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b not found: ID does not exist" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.515638 4917 scope.go:117] "RemoveContainer" containerID="23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.515933 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605"} err="failed to get container status \"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605\": rpc error: code = NotFound desc = could not find container \"23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605\": container with ID starting with 23f3b15de4d791985b3d60aac05af81f49f9b8d35077040d6131f86b6b6a6605 not found: ID does not exist" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.515952 4917 scope.go:117] 
"RemoveContainer" containerID="f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.516187 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b"} err="failed to get container status \"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b\": rpc error: code = NotFound desc = could not find container \"f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b\": container with ID starting with f2bcb7b4306bea1a886a970cbfbde9f8d3f22627f18d7e06ca9d883176e9033b not found: ID does not exist" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.516218 4917 scope.go:117] "RemoveContainer" containerID="01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.576675 4917 scope.go:117] "RemoveContainer" containerID="3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.586727 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4929ca54-e5ca-49c0-9254-2709a9d17782" (UID: "4929ca54-e5ca-49c0-9254-2709a9d17782"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.587778 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "4929ca54-e5ca-49c0-9254-2709a9d17782" (UID: "4929ca54-e5ca-49c0-9254-2709a9d17782"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.617405 4917 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.617433 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.617441 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4929ca54-e5ca-49c0-9254-2709a9d17782-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.617451 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmg78\" (UniqueName: \"kubernetes.io/projected/4929ca54-e5ca-49c0-9254-2709a9d17782-kube-api-access-nmg78\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.620818 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-config-data" (OuterVolumeSpecName: "config-data") pod "4929ca54-e5ca-49c0-9254-2709a9d17782" (UID: "4929ca54-e5ca-49c0-9254-2709a9d17782"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.654814 4917 scope.go:117] "RemoveContainer" containerID="01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe" Mar 18 07:08:35 crc kubenswrapper[4917]: E0318 07:08:35.661720 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe\": container with ID starting with 01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe not found: ID does not exist" containerID="01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.661759 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe"} err="failed to get container status \"01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe\": rpc error: code = NotFound desc = could not find container \"01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe\": container with ID starting with 01ba56ad3b8e781229fbd1441f3b737274cc0bdfc9134f9099dd54b54d1022fe not found: ID does not exist" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.661786 4917 scope.go:117] "RemoveContainer" containerID="3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da" Mar 18 07:08:35 crc kubenswrapper[4917]: E0318 07:08:35.686208 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da\": container with ID starting with 3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da not found: ID does not exist" containerID="3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.686249 
4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da"} err="failed to get container status \"3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da\": rpc error: code = NotFound desc = could not find container \"3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da\": container with ID starting with 3ba12ec56b7c7ce02305efd4ce9870fa51ac68fde4e5e2dafe81c6343d8167da not found: ID does not exist" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.719303 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4929ca54-e5ca-49c0-9254-2709a9d17782-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.726575 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.755095 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.816820 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc" path="/var/lib/kubelet/pods/2ddcb8ec-31f8-4d6e-91f4-83d5537a20dc/volumes" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.817484 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" path="/var/lib/kubelet/pods/61ec9fd8-58fa-4279-a404-61d4ef4f4c32/volumes" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.889214 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.906649 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.918705 4917 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:35 crc kubenswrapper[4917]: E0318 07:08:35.919144 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerName="nova-metadata-metadata" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.919161 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerName="nova-metadata-metadata" Mar 18 07:08:35 crc kubenswrapper[4917]: E0318 07:08:35.919173 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" containerName="dnsmasq-dns" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.919179 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" containerName="dnsmasq-dns" Mar 18 07:08:35 crc kubenswrapper[4917]: E0318 07:08:35.919196 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerName="nova-metadata-log" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.919203 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerName="nova-metadata-log" Mar 18 07:08:35 crc kubenswrapper[4917]: E0318 07:08:35.919216 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" containerName="init" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.919222 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" containerName="init" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.919386 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="61ec9fd8-58fa-4279-a404-61d4ef4f4c32" containerName="dnsmasq-dns" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.919408 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" 
containerName="nova-metadata-metadata" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.919416 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" containerName="nova-metadata-log" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.920315 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.922631 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.924449 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 07:08:35 crc kubenswrapper[4917]: I0318 07:08:35.945505 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.023695 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.024008 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsfgz\" (UniqueName: \"kubernetes.io/projected/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-kube-api-access-rsfgz\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.024083 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.024134 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-logs\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.024154 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-config-data\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.126317 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.126441 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsfgz\" (UniqueName: \"kubernetes.io/projected/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-kube-api-access-rsfgz\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.126622 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc 
kubenswrapper[4917]: I0318 07:08:36.126748 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-logs\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.126801 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-config-data\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.127400 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-logs\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.133558 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.134274 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-config-data\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.136276 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.152260 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsfgz\" (UniqueName: \"kubernetes.io/projected/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-kube-api-access-rsfgz\") pod \"nova-metadata-0\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.236232 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.500879 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerStarted","Data":"62eebe1f433dcc2f6fd6d6a0b2197c3784817958a8c2a14a337710f0b6e445ff"} Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.501234 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerStarted","Data":"660ad12e0cc45f0b2a4c71fb5a42530ab85b3330cc7a89195093c62277986952"} Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.504780 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="971c9dbd-0838-4d6b-b802-21845680c727" containerName="nova-scheduler-scheduler" containerID="cri-o://ca6ed31aebb315b90b7a769b73bcd1538d54c0d4691d586e5092e562eca73107" gracePeriod=30 Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.505159 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd4ec623-4dba-48e7-89f5-6cd3eadce847","Type":"ContainerStarted","Data":"d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89"} Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.505175 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd4ec623-4dba-48e7-89f5-6cd3eadce847","Type":"ContainerStarted","Data":"747956bbe9b21ee96566757b3eb20cac4ee3bc212347d4a35bed675a9a4e4d1f"} Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.505237 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.536182 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.536160395 podStartE2EDuration="2.536160395s" podCreationTimestamp="2026-03-18 07:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:36.524308141 +0000 UTC m=+1301.465462855" watchObservedRunningTime="2026-03-18 07:08:36.536160395 +0000 UTC m=+1301.477315109" Mar 18 07:08:36 crc kubenswrapper[4917]: I0318 07:08:36.752970 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:08:37 crc kubenswrapper[4917]: I0318 07:08:37.521983 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb","Type":"ContainerStarted","Data":"599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2"} Mar 18 07:08:37 crc kubenswrapper[4917]: I0318 07:08:37.522503 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb","Type":"ContainerStarted","Data":"61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da"} Mar 18 07:08:37 crc kubenswrapper[4917]: I0318 07:08:37.522523 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb","Type":"ContainerStarted","Data":"a6fe1cf3727db5260225c36cd064ad240e1ce963bcbc50062b8f3057a1c47dad"} Mar 18 07:08:37 crc kubenswrapper[4917]: I0318 07:08:37.535762 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerStarted","Data":"8a6bb71f2dfe24c78e6a2bf3af4e48e7741e16bc216ef60059b6cef779e885c4"} Mar 18 07:08:37 crc kubenswrapper[4917]: I0318 07:08:37.567451 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.567431176 podStartE2EDuration="2.567431176s" podCreationTimestamp="2026-03-18 07:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:37.548291656 +0000 UTC m=+1302.489446390" watchObservedRunningTime="2026-03-18 07:08:37.567431176 +0000 UTC m=+1302.508585890" Mar 18 07:08:37 crc kubenswrapper[4917]: I0318 07:08:37.784089 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4929ca54-e5ca-49c0-9254-2709a9d17782" path="/var/lib/kubelet/pods/4929ca54-e5ca-49c0-9254-2709a9d17782/volumes" Mar 18 07:08:38 crc kubenswrapper[4917]: I0318 07:08:38.569559 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerStarted","Data":"8cba398ec1ccafa32da019112452a99b4d1037f2f616d5b1a6f4a5261f704db5"} Mar 18 07:08:38 crc kubenswrapper[4917]: E0318 07:08:38.959272 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca6ed31aebb315b90b7a769b73bcd1538d54c0d4691d586e5092e562eca73107" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 07:08:38 crc kubenswrapper[4917]: E0318 
07:08:38.961595 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca6ed31aebb315b90b7a769b73bcd1538d54c0d4691d586e5092e562eca73107" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 07:08:38 crc kubenswrapper[4917]: E0318 07:08:38.963627 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ca6ed31aebb315b90b7a769b73bcd1538d54c0d4691d586e5092e562eca73107" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 07:08:38 crc kubenswrapper[4917]: E0318 07:08:38.963664 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="971c9dbd-0838-4d6b-b802-21845680c727" containerName="nova-scheduler-scheduler" Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.581524 4917 generic.go:334] "Generic (PLEG): container finished" podID="971c9dbd-0838-4d6b-b802-21845680c727" containerID="ca6ed31aebb315b90b7a769b73bcd1538d54c0d4691d586e5092e562eca73107" exitCode=0 Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.581657 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"971c9dbd-0838-4d6b-b802-21845680c727","Type":"ContainerDied","Data":"ca6ed31aebb315b90b7a769b73bcd1538d54c0d4691d586e5092e562eca73107"} Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.879754 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.901106 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-combined-ca-bundle\") pod \"971c9dbd-0838-4d6b-b802-21845680c727\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.901179 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-config-data\") pod \"971c9dbd-0838-4d6b-b802-21845680c727\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.901493 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx6zk\" (UniqueName: \"kubernetes.io/projected/971c9dbd-0838-4d6b-b802-21845680c727-kube-api-access-jx6zk\") pod \"971c9dbd-0838-4d6b-b802-21845680c727\" (UID: \"971c9dbd-0838-4d6b-b802-21845680c727\") " Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.909692 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/971c9dbd-0838-4d6b-b802-21845680c727-kube-api-access-jx6zk" (OuterVolumeSpecName: "kube-api-access-jx6zk") pod "971c9dbd-0838-4d6b-b802-21845680c727" (UID: "971c9dbd-0838-4d6b-b802-21845680c727"). InnerVolumeSpecName "kube-api-access-jx6zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.929257 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "971c9dbd-0838-4d6b-b802-21845680c727" (UID: "971c9dbd-0838-4d6b-b802-21845680c727"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:39 crc kubenswrapper[4917]: I0318 07:08:39.933750 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-config-data" (OuterVolumeSpecName: "config-data") pod "971c9dbd-0838-4d6b-b802-21845680c727" (UID: "971c9dbd-0838-4d6b-b802-21845680c727"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.007341 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx6zk\" (UniqueName: \"kubernetes.io/projected/971c9dbd-0838-4d6b-b802-21845680c727-kube-api-access-jx6zk\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.007373 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.007382 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/971c9dbd-0838-4d6b-b802-21845680c727-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.469289 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.515907 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187c06ba-b0e4-46de-a7ec-ab9276e58265-logs\") pod \"187c06ba-b0e4-46de-a7ec-ab9276e58265\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.516062 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-config-data\") pod \"187c06ba-b0e4-46de-a7ec-ab9276e58265\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.516089 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-combined-ca-bundle\") pod \"187c06ba-b0e4-46de-a7ec-ab9276e58265\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.516187 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmnc6\" (UniqueName: \"kubernetes.io/projected/187c06ba-b0e4-46de-a7ec-ab9276e58265-kube-api-access-dmnc6\") pod \"187c06ba-b0e4-46de-a7ec-ab9276e58265\" (UID: \"187c06ba-b0e4-46de-a7ec-ab9276e58265\") " Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.523024 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187c06ba-b0e4-46de-a7ec-ab9276e58265-logs" (OuterVolumeSpecName: "logs") pod "187c06ba-b0e4-46de-a7ec-ab9276e58265" (UID: "187c06ba-b0e4-46de-a7ec-ab9276e58265"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.524780 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187c06ba-b0e4-46de-a7ec-ab9276e58265-kube-api-access-dmnc6" (OuterVolumeSpecName: "kube-api-access-dmnc6") pod "187c06ba-b0e4-46de-a7ec-ab9276e58265" (UID: "187c06ba-b0e4-46de-a7ec-ab9276e58265"). InnerVolumeSpecName "kube-api-access-dmnc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.554095 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-config-data" (OuterVolumeSpecName: "config-data") pod "187c06ba-b0e4-46de-a7ec-ab9276e58265" (UID: "187c06ba-b0e4-46de-a7ec-ab9276e58265"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.554158 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "187c06ba-b0e4-46de-a7ec-ab9276e58265" (UID: "187c06ba-b0e4-46de-a7ec-ab9276e58265"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.596549 4917 generic.go:334] "Generic (PLEG): container finished" podID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerID="7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00" exitCode=0 Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.596630 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"187c06ba-b0e4-46de-a7ec-ab9276e58265","Type":"ContainerDied","Data":"7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00"} Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.596655 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"187c06ba-b0e4-46de-a7ec-ab9276e58265","Type":"ContainerDied","Data":"710c586d87aa3b8cfb4742d27191b01482ba49d4b99d50088c5f6b97ab0950e7"} Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.596671 4917 scope.go:117] "RemoveContainer" containerID="7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.596788 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.604431 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerStarted","Data":"011327977cfb25ec7ef2c2bdef1ab1802509fa4211e327ff2dd452f13eb9f106"} Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.604741 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.610174 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"971c9dbd-0838-4d6b-b802-21845680c727","Type":"ContainerDied","Data":"58152cf58464293738777542db8eb140ab02c896d0d87badab03b78feba6365b"} Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.610255 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.619307 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.619339 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187c06ba-b0e4-46de-a7ec-ab9276e58265-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.619350 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmnc6\" (UniqueName: \"kubernetes.io/projected/187c06ba-b0e4-46de-a7ec-ab9276e58265-kube-api-access-dmnc6\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.619359 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/187c06ba-b0e4-46de-a7ec-ab9276e58265-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.653834 4917 scope.go:117] "RemoveContainer" containerID="d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.655648 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.077676847 podStartE2EDuration="6.655628884s" podCreationTimestamp="2026-03-18 07:08:34 +0000 UTC" firstStartedPulling="2026-03-18 07:08:35.745695206 +0000 UTC m=+1300.686849920" lastFinishedPulling="2026-03-18 07:08:40.323647243 +0000 UTC m=+1305.264801957" observedRunningTime="2026-03-18 07:08:40.630753297 +0000 UTC m=+1305.571908001" watchObservedRunningTime="2026-03-18 07:08:40.655628884 +0000 UTC m=+1305.596783598" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.662311 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.682526 4917 scope.go:117] "RemoveContainer" containerID="7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.687042 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:40 crc kubenswrapper[4917]: E0318 07:08:40.688635 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00\": container with ID starting with 7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00 not found: ID does not exist" containerID="7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.688678 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00"} err="failed to get container status \"7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00\": rpc error: code = NotFound desc = could not find container \"7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00\": container with ID starting with 7dce881344feeb611eaa753377218a6903c838b15f8eb62b45ae4b24857b8d00 not found: ID does not exist" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.688700 4917 scope.go:117] "RemoveContainer" containerID="d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303" Mar 18 07:08:40 crc kubenswrapper[4917]: E0318 07:08:40.689317 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303\": container with ID starting with d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303 not found: ID does not exist" containerID="d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.689370 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303"} err="failed to get container status \"d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303\": rpc error: code = NotFound desc = could not find container \"d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303\": container with ID starting with d9c524ab02948a7943409a0348aa1750e6cb16eb6f40f54c099c58a2f8bdc303 not found: ID does not exist" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.689400 4917 scope.go:117] "RemoveContainer" containerID="ca6ed31aebb315b90b7a769b73bcd1538d54c0d4691d586e5092e562eca73107" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.694680 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.716041 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:40 crc kubenswrapper[4917]: E0318 07:08:40.716404 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-api" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.716421 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-api" Mar 18 07:08:40 crc kubenswrapper[4917]: E0318 07:08:40.716438 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="971c9dbd-0838-4d6b-b802-21845680c727" containerName="nova-scheduler-scheduler" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.716444 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="971c9dbd-0838-4d6b-b802-21845680c727" containerName="nova-scheduler-scheduler" Mar 18 07:08:40 crc kubenswrapper[4917]: E0318 07:08:40.716458 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-log" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.716464 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-log" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.716631 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-log" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.716649 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" containerName="nova-api-api" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.716659 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="971c9dbd-0838-4d6b-b802-21845680c727" containerName="nova-scheduler-scheduler" Mar 18 
07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.717544 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.725834 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.738359 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.746994 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.758529 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.759909 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.761878 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.779338 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.821615 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.821680 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-logs\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 
07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.821737 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjvrz\" (UniqueName: \"kubernetes.io/projected/32929e66-f413-418e-a69a-ff9fc6ce3ea9-kube-api-access-rjvrz\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.821781 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.821872 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpt4w\" (UniqueName: \"kubernetes.io/projected/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-kube-api-access-bpt4w\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.821952 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-config-data\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.821976 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.922829 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.923143 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpt4w\" (UniqueName: \"kubernetes.io/projected/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-kube-api-access-bpt4w\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.923193 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-config-data\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.923209 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.923253 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.923273 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-logs\") pod \"nova-api-0\" (UID: 
\"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.923294 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjvrz\" (UniqueName: \"kubernetes.io/projected/32929e66-f413-418e-a69a-ff9fc6ce3ea9-kube-api-access-rjvrz\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.925452 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-logs\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.927756 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.936795 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-config-data\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.938753 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpt4w\" (UniqueName: \"kubernetes.io/projected/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-kube-api-access-bpt4w\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.939788 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-config-data\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.940166 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " pod="openstack/nova-api-0" Mar 18 07:08:40 crc kubenswrapper[4917]: I0318 07:08:40.940175 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjvrz\" (UniqueName: \"kubernetes.io/projected/32929e66-f413-418e-a69a-ff9fc6ce3ea9-kube-api-access-rjvrz\") pod \"nova-scheduler-0\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") " pod="openstack/nova-scheduler-0" Mar 18 07:08:41 crc kubenswrapper[4917]: I0318 07:08:41.039454 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:08:41 crc kubenswrapper[4917]: I0318 07:08:41.080988 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 07:08:41 crc kubenswrapper[4917]: I0318 07:08:41.544118 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:08:41 crc kubenswrapper[4917]: I0318 07:08:41.630348 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:08:41 crc kubenswrapper[4917]: I0318 07:08:41.634054 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c","Type":"ContainerStarted","Data":"f5e56a42f9af7c3364b4e0973584ccbe5219da3424f600021638dfc3b4b8e6ef"} Mar 18 07:08:41 crc kubenswrapper[4917]: I0318 07:08:41.785809 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187c06ba-b0e4-46de-a7ec-ab9276e58265" path="/var/lib/kubelet/pods/187c06ba-b0e4-46de-a7ec-ab9276e58265/volumes" Mar 18 07:08:41 crc kubenswrapper[4917]: I0318 07:08:41.786648 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="971c9dbd-0838-4d6b-b802-21845680c727" path="/var/lib/kubelet/pods/971c9dbd-0838-4d6b-b802-21845680c727/volumes" Mar 18 07:08:42 crc kubenswrapper[4917]: I0318 07:08:42.649781 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c","Type":"ContainerStarted","Data":"ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc"} Mar 18 07:08:42 crc kubenswrapper[4917]: I0318 07:08:42.650041 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c","Type":"ContainerStarted","Data":"e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c"} Mar 18 07:08:42 crc kubenswrapper[4917]: I0318 07:08:42.651378 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"32929e66-f413-418e-a69a-ff9fc6ce3ea9","Type":"ContainerStarted","Data":"137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154"} Mar 18 07:08:42 crc kubenswrapper[4917]: I0318 07:08:42.651398 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32929e66-f413-418e-a69a-ff9fc6ce3ea9","Type":"ContainerStarted","Data":"247eb60b1d2d670e38d57a7d522f9456358ea803f327902dfe143e513cd3c00b"} Mar 18 07:08:42 crc kubenswrapper[4917]: I0318 07:08:42.684516 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.684498557 podStartE2EDuration="2.684498557s" podCreationTimestamp="2026-03-18 07:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:42.669516038 +0000 UTC m=+1307.610670752" watchObservedRunningTime="2026-03-18 07:08:42.684498557 +0000 UTC m=+1307.625653271" Mar 18 07:08:42 crc kubenswrapper[4917]: I0318 07:08:42.713385 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.713366921 podStartE2EDuration="2.713366921s" podCreationTimestamp="2026-03-18 07:08:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:08:42.687979321 +0000 UTC m=+1307.629134045" watchObservedRunningTime="2026-03-18 07:08:42.713366921 +0000 UTC m=+1307.654521635" Mar 18 07:08:44 crc kubenswrapper[4917]: I0318 07:08:44.994174 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 07:08:46 crc kubenswrapper[4917]: I0318 07:08:46.082451 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 07:08:46 crc kubenswrapper[4917]: I0318 07:08:46.237091 4917 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 07:08:46 crc kubenswrapper[4917]: I0318 07:08:46.237158 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 07:08:47 crc kubenswrapper[4917]: I0318 07:08:47.258862 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 07:08:47 crc kubenswrapper[4917]: I0318 07:08:47.258880 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 07:08:51 crc kubenswrapper[4917]: I0318 07:08:51.039990 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 07:08:51 crc kubenswrapper[4917]: I0318 07:08:51.040695 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 07:08:51 crc kubenswrapper[4917]: I0318 07:08:51.087149 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 07:08:51 crc kubenswrapper[4917]: I0318 07:08:51.299550 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 07:08:51 crc kubenswrapper[4917]: I0318 07:08:51.795847 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 07:08:52 crc kubenswrapper[4917]: I0318 07:08:52.122852 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 07:08:52 crc kubenswrapper[4917]: I0318 07:08:52.122937 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 07:08:54 crc kubenswrapper[4917]: I0318 07:08:54.236827 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 07:08:54 crc kubenswrapper[4917]: I0318 07:08:54.237299 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 07:08:56 crc kubenswrapper[4917]: I0318 07:08:56.247432 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 07:08:56 crc kubenswrapper[4917]: I0318 07:08:56.249554 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 07:08:56 crc kubenswrapper[4917]: I0318 07:08:56.261133 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 07:08:56 crc kubenswrapper[4917]: I0318 07:08:56.806647 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.780820 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.824341 4917 generic.go:334] "Generic (PLEG): container finished" podID="9f40bc95-334c-48d8-a06d-de4605fdb1c6" containerID="5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71" exitCode=137 Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.824403 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.824417 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9f40bc95-334c-48d8-a06d-de4605fdb1c6","Type":"ContainerDied","Data":"5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71"} Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.824461 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9f40bc95-334c-48d8-a06d-de4605fdb1c6","Type":"ContainerDied","Data":"02ecf9925ac6c137af8dc03c33d1183bf652a5e4b0a5087b431c4bf3f957c666"} Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.824478 4917 scope.go:117] "RemoveContainer" containerID="5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71" Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.852759 4917 scope.go:117] "RemoveContainer" containerID="5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71" Mar 18 07:08:58 crc kubenswrapper[4917]: E0318 07:08:58.853276 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71\": container with ID starting with 5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71 not found: ID does not exist" containerID="5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71" Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 
07:08:58.853320 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71"} err="failed to get container status \"5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71\": rpc error: code = NotFound desc = could not find container \"5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71\": container with ID starting with 5b86491dec4f36972c0d9672b6ec32071cdbfe2531bb6a95c2a6dc26994fdb71 not found: ID does not exist" Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.902699 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-config-data\") pod \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.902777 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-combined-ca-bundle\") pod \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.902846 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/9f40bc95-334c-48d8-a06d-de4605fdb1c6-kube-api-access-4zw22\") pod \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\" (UID: \"9f40bc95-334c-48d8-a06d-de4605fdb1c6\") " Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.910881 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f40bc95-334c-48d8-a06d-de4605fdb1c6-kube-api-access-4zw22" (OuterVolumeSpecName: "kube-api-access-4zw22") pod "9f40bc95-334c-48d8-a06d-de4605fdb1c6" (UID: "9f40bc95-334c-48d8-a06d-de4605fdb1c6"). 
InnerVolumeSpecName "kube-api-access-4zw22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.950977 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f40bc95-334c-48d8-a06d-de4605fdb1c6" (UID: "9f40bc95-334c-48d8-a06d-de4605fdb1c6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:58 crc kubenswrapper[4917]: I0318 07:08:58.951447 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-config-data" (OuterVolumeSpecName: "config-data") pod "9f40bc95-334c-48d8-a06d-de4605fdb1c6" (UID: "9f40bc95-334c-48d8-a06d-de4605fdb1c6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.005565 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.005623 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f40bc95-334c-48d8-a06d-de4605fdb1c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.005643 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zw22\" (UniqueName: \"kubernetes.io/projected/9f40bc95-334c-48d8-a06d-de4605fdb1c6-kube-api-access-4zw22\") on node \"crc\" DevicePath \"\"" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.040421 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 
07:08:59.040497 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.175694 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.190218 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.217699 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:08:59 crc kubenswrapper[4917]: E0318 07:08:59.218287 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f40bc95-334c-48d8-a06d-de4605fdb1c6" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.218317 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f40bc95-334c-48d8-a06d-de4605fdb1c6" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.218635 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f40bc95-334c-48d8-a06d-de4605fdb1c6" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.219578 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.234703 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.249936 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.250719 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.250750 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.315535 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.315652 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.315677 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: 
I0318 07:08:59.315716 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bsq\" (UniqueName: \"kubernetes.io/projected/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-kube-api-access-g9bsq\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.315753 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.416809 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.416887 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bsq\" (UniqueName: \"kubernetes.io/projected/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-kube-api-access-g9bsq\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.416933 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc 
kubenswrapper[4917]: I0318 07:08:59.416973 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.417054 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.421308 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.422135 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.422368 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.440242 4917 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.442566 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bsq\" (UniqueName: \"kubernetes.io/projected/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-kube-api-access-g9bsq\") pod \"nova-cell1-novncproxy-0\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.565014 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.799621 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f40bc95-334c-48d8-a06d-de4605fdb1c6" path="/var/lib/kubelet/pods/9f40bc95-334c-48d8-a06d-de4605fdb1c6/volumes" Mar 18 07:08:59 crc kubenswrapper[4917]: I0318 07:08:59.837360 4917 scope.go:117] "RemoveContainer" containerID="4250ccdf8ecf8d41da280ff13cad4625a476403d650a3122ccab601098fea96e" Mar 18 07:09:00 crc kubenswrapper[4917]: I0318 07:09:00.031159 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:09:00 crc kubenswrapper[4917]: W0318 07:09:00.043856 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4affeee_7968_4e70_b6dd_d8d0f17cfa92.slice/crio-8f801b94c21d98ddc1316b5073c91c34de6be89ce63b0c7026607977fac37c69 WatchSource:0}: Error finding container 8f801b94c21d98ddc1316b5073c91c34de6be89ce63b0c7026607977fac37c69: Status 404 returned error can't find the container with id 8f801b94c21d98ddc1316b5073c91c34de6be89ce63b0c7026607977fac37c69 Mar 18 07:09:00 crc kubenswrapper[4917]: I0318 
07:09:00.848995 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e4affeee-7968-4e70-b6dd-d8d0f17cfa92","Type":"ContainerStarted","Data":"b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db"} Mar 18 07:09:00 crc kubenswrapper[4917]: I0318 07:09:00.849350 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e4affeee-7968-4e70-b6dd-d8d0f17cfa92","Type":"ContainerStarted","Data":"8f801b94c21d98ddc1316b5073c91c34de6be89ce63b0c7026607977fac37c69"} Mar 18 07:09:00 crc kubenswrapper[4917]: I0318 07:09:00.881219 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.881185054 podStartE2EDuration="1.881185054s" podCreationTimestamp="2026-03-18 07:08:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:09:00.875658981 +0000 UTC m=+1325.816813735" watchObservedRunningTime="2026-03-18 07:09:00.881185054 +0000 UTC m=+1325.822339788" Mar 18 07:09:01 crc kubenswrapper[4917]: I0318 07:09:01.045464 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 07:09:01 crc kubenswrapper[4917]: I0318 07:09:01.048662 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 07:09:01 crc kubenswrapper[4917]: I0318 07:09:01.055860 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 07:09:01 crc kubenswrapper[4917]: I0318 07:09:01.864526 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.061957 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-hkm5m"] Mar 18 07:09:02 crc kubenswrapper[4917]: 
I0318 07:09:02.063392 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.084962 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-hkm5m"] Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.182409 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.182480 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-svc\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.182499 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.182719 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q5jp\" (UniqueName: \"kubernetes.io/projected/fae3a765-1a72-4207-8066-d3f8926e8641-kube-api-access-7q5jp\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 
07:09:02.182802 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-swift-storage-0\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.182987 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-config\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.284366 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-config\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.284534 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.284670 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-svc\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.284696 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.285519 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-nb\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.285536 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-config\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.285743 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q5jp\" (UniqueName: \"kubernetes.io/projected/fae3a765-1a72-4207-8066-d3f8926e8641-kube-api-access-7q5jp\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.285818 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-swift-storage-0\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.286010 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-svc\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.286046 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-sb\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.286461 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-swift-storage-0\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.309844 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q5jp\" (UniqueName: \"kubernetes.io/projected/fae3a765-1a72-4207-8066-d3f8926e8641-kube-api-access-7q5jp\") pod \"dnsmasq-dns-7b6d8fd79c-hkm5m\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.435467 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.929066 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.929417 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:09:02 crc kubenswrapper[4917]: I0318 07:09:02.960000 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-hkm5m"] Mar 18 07:09:02 crc kubenswrapper[4917]: W0318 07:09:02.962728 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfae3a765_1a72_4207_8066_d3f8926e8641.slice/crio-63f2ce8a5446b6ec29e34554521410cd73e22fec6b8cea1118b0712182c091c2 WatchSource:0}: Error finding container 63f2ce8a5446b6ec29e34554521410cd73e22fec6b8cea1118b0712182c091c2: Status 404 returned error can't find the container with id 63f2ce8a5446b6ec29e34554521410cd73e22fec6b8cea1118b0712182c091c2 Mar 18 07:09:03 crc kubenswrapper[4917]: I0318 07:09:03.887617 4917 generic.go:334] "Generic (PLEG): container finished" podID="fae3a765-1a72-4207-8066-d3f8926e8641" containerID="1c56939532f2bc0f9729b861c994e25e5f036cb964bc64d8c59110624f5b3f66" exitCode=0 Mar 18 07:09:03 crc kubenswrapper[4917]: I0318 07:09:03.887670 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" 
event={"ID":"fae3a765-1a72-4207-8066-d3f8926e8641","Type":"ContainerDied","Data":"1c56939532f2bc0f9729b861c994e25e5f036cb964bc64d8c59110624f5b3f66"} Mar 18 07:09:03 crc kubenswrapper[4917]: I0318 07:09:03.887867 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" event={"ID":"fae3a765-1a72-4207-8066-d3f8926e8641","Type":"ContainerStarted","Data":"63f2ce8a5446b6ec29e34554521410cd73e22fec6b8cea1118b0712182c091c2"} Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.213396 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.215054 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="ceilometer-central-agent" containerID="cri-o://62eebe1f433dcc2f6fd6d6a0b2197c3784817958a8c2a14a337710f0b6e445ff" gracePeriod=30 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.215823 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="proxy-httpd" containerID="cri-o://011327977cfb25ec7ef2c2bdef1ab1802509fa4211e327ff2dd452f13eb9f106" gracePeriod=30 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.215910 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="ceilometer-notification-agent" containerID="cri-o://8a6bb71f2dfe24c78e6a2bf3af4e48e7741e16bc216ef60059b6cef779e885c4" gracePeriod=30 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.216048 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="sg-core" containerID="cri-o://8cba398ec1ccafa32da019112452a99b4d1037f2f616d5b1a6f4a5261f704db5" 
gracePeriod=30 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.231283 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.283113 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.565633 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.899650 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" event={"ID":"fae3a765-1a72-4207-8066-d3f8926e8641","Type":"ContainerStarted","Data":"dff717b59d7480b68ce1be64dd75f3257e481d3d0ab97f70359760ba880d621c"} Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.899884 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905264 4917 generic.go:334] "Generic (PLEG): container finished" podID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerID="011327977cfb25ec7ef2c2bdef1ab1802509fa4211e327ff2dd452f13eb9f106" exitCode=0 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905298 4917 generic.go:334] "Generic (PLEG): container finished" podID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerID="8cba398ec1ccafa32da019112452a99b4d1037f2f616d5b1a6f4a5261f704db5" exitCode=2 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905308 4917 generic.go:334] "Generic (PLEG): container finished" podID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerID="8a6bb71f2dfe24c78e6a2bf3af4e48e7741e16bc216ef60059b6cef779e885c4" exitCode=0 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905318 4917 generic.go:334] "Generic (PLEG): container 
finished" podID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerID="62eebe1f433dcc2f6fd6d6a0b2197c3784817958a8c2a14a337710f0b6e445ff" exitCode=0 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905311 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerDied","Data":"011327977cfb25ec7ef2c2bdef1ab1802509fa4211e327ff2dd452f13eb9f106"} Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905364 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerDied","Data":"8cba398ec1ccafa32da019112452a99b4d1037f2f616d5b1a6f4a5261f704db5"} Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905379 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerDied","Data":"8a6bb71f2dfe24c78e6a2bf3af4e48e7741e16bc216ef60059b6cef779e885c4"} Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905391 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerDied","Data":"62eebe1f433dcc2f6fd6d6a0b2197c3784817958a8c2a14a337710f0b6e445ff"} Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905546 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-log" containerID="cri-o://e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c" gracePeriod=30 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.905695 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-api" containerID="cri-o://ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc" 
gracePeriod=30 Mar 18 07:09:04 crc kubenswrapper[4917]: I0318 07:09:04.922980 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" podStartSLOduration=2.9229622170000003 podStartE2EDuration="2.922962217s" podCreationTimestamp="2026-03-18 07:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:09:04.918762076 +0000 UTC m=+1329.859916800" watchObservedRunningTime="2026-03-18 07:09:04.922962217 +0000 UTC m=+1329.864116921" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.161394 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.247561 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-combined-ca-bundle\") pod \"5a6619bc-1eae-40c1-93a8-baf26424051b\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.247645 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-log-httpd\") pod \"5a6619bc-1eae-40c1-93a8-baf26424051b\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.247751 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-scripts\") pod \"5a6619bc-1eae-40c1-93a8-baf26424051b\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.247776 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkrz6\" (UniqueName: 
\"kubernetes.io/projected/5a6619bc-1eae-40c1-93a8-baf26424051b-kube-api-access-pkrz6\") pod \"5a6619bc-1eae-40c1-93a8-baf26424051b\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.247842 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-sg-core-conf-yaml\") pod \"5a6619bc-1eae-40c1-93a8-baf26424051b\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.247874 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-config-data\") pod \"5a6619bc-1eae-40c1-93a8-baf26424051b\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.247888 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-run-httpd\") pod \"5a6619bc-1eae-40c1-93a8-baf26424051b\" (UID: \"5a6619bc-1eae-40c1-93a8-baf26424051b\") " Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.248487 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5a6619bc-1eae-40c1-93a8-baf26424051b" (UID: "5a6619bc-1eae-40c1-93a8-baf26424051b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.248636 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5a6619bc-1eae-40c1-93a8-baf26424051b" (UID: "5a6619bc-1eae-40c1-93a8-baf26424051b"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.253770 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-scripts" (OuterVolumeSpecName: "scripts") pod "5a6619bc-1eae-40c1-93a8-baf26424051b" (UID: "5a6619bc-1eae-40c1-93a8-baf26424051b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.254092 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a6619bc-1eae-40c1-93a8-baf26424051b-kube-api-access-pkrz6" (OuterVolumeSpecName: "kube-api-access-pkrz6") pod "5a6619bc-1eae-40c1-93a8-baf26424051b" (UID: "5a6619bc-1eae-40c1-93a8-baf26424051b"). InnerVolumeSpecName "kube-api-access-pkrz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.276052 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5a6619bc-1eae-40c1-93a8-baf26424051b" (UID: "5a6619bc-1eae-40c1-93a8-baf26424051b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.335845 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a6619bc-1eae-40c1-93a8-baf26424051b" (UID: "5a6619bc-1eae-40c1-93a8-baf26424051b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.351691 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.352038 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.352164 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.352321 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5a6619bc-1eae-40c1-93a8-baf26424051b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.352443 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.352530 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkrz6\" (UniqueName: \"kubernetes.io/projected/5a6619bc-1eae-40c1-93a8-baf26424051b-kube-api-access-pkrz6\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.368816 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-config-data" (OuterVolumeSpecName: "config-data") pod "5a6619bc-1eae-40c1-93a8-baf26424051b" (UID: "5a6619bc-1eae-40c1-93a8-baf26424051b"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.454125 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a6619bc-1eae-40c1-93a8-baf26424051b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.916510 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5a6619bc-1eae-40c1-93a8-baf26424051b","Type":"ContainerDied","Data":"660ad12e0cc45f0b2a4c71fb5a42530ab85b3330cc7a89195093c62277986952"} Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.916567 4917 scope.go:117] "RemoveContainer" containerID="011327977cfb25ec7ef2c2bdef1ab1802509fa4211e327ff2dd452f13eb9f106" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.916734 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.923857 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c","Type":"ContainerDied","Data":"e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c"} Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.923808 4917 generic.go:334] "Generic (PLEG): container finished" podID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerID="e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c" exitCode=143 Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.939987 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.948829 4917 scope.go:117] "RemoveContainer" containerID="8cba398ec1ccafa32da019112452a99b4d1037f2f616d5b1a6f4a5261f704db5" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.956868 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.974928 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:05 crc kubenswrapper[4917]: E0318 07:09:05.975284 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="ceilometer-central-agent" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.975299 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="ceilometer-central-agent" Mar 18 07:09:05 crc kubenswrapper[4917]: E0318 07:09:05.975320 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="sg-core" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.975326 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="sg-core" Mar 18 07:09:05 crc kubenswrapper[4917]: E0318 07:09:05.975339 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="ceilometer-notification-agent" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.975345 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="ceilometer-notification-agent" Mar 18 07:09:05 crc kubenswrapper[4917]: E0318 07:09:05.975359 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="proxy-httpd" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.975365 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="proxy-httpd" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.975529 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="proxy-httpd" Mar 18 
07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.975548 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="ceilometer-notification-agent" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.975561 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="sg-core" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.975571 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="ceilometer-central-agent" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.977043 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.978793 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 07:09:05 crc kubenswrapper[4917]: I0318 07:09:05.980767 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:05.984870 4917 scope.go:117] "RemoveContainer" containerID="8a6bb71f2dfe24c78e6a2bf3af4e48e7741e16bc216ef60059b6cef779e885c4" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:05.987962 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.029911 4917 scope.go:117] "RemoveContainer" containerID="62eebe1f433dcc2f6fd6d6a0b2197c3784817958a8c2a14a337710f0b6e445ff" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.064025 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt7qm\" (UniqueName: \"kubernetes.io/projected/dec8f661-50c4-4051-9c1c-cd3c19a63c91-kube-api-access-vt7qm\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " 
pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.064152 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.064239 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-run-httpd\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.064273 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-log-httpd\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.064298 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-scripts\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.064321 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.064343 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-config-data\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.166011 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-run-httpd\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.166060 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-log-httpd\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.166081 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-scripts\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.166098 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.166117 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-config-data\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " 
pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.166166 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt7qm\" (UniqueName: \"kubernetes.io/projected/dec8f661-50c4-4051-9c1c-cd3c19a63c91-kube-api-access-vt7qm\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.166228 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.167017 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-run-httpd\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.167077 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-log-httpd\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.171506 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-config-data\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.171668 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-scripts\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.173174 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.185157 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.204311 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt7qm\" (UniqueName: \"kubernetes.io/projected/dec8f661-50c4-4051-9c1c-cd3c19a63c91-kube-api-access-vt7qm\") pod \"ceilometer-0\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.325689 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.351433 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.761532 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:06 crc kubenswrapper[4917]: I0318 07:09:06.932981 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerStarted","Data":"dd1b2ac8bab41047cb4941efc91653b57d455fd40f908db6302dc22821d44eb1"} Mar 18 07:09:07 crc kubenswrapper[4917]: I0318 07:09:07.785602 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" path="/var/lib/kubelet/pods/5a6619bc-1eae-40c1-93a8-baf26424051b/volumes" Mar 18 07:09:07 crc kubenswrapper[4917]: I0318 07:09:07.948191 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerStarted","Data":"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a"} Mar 18 07:09:08 crc kubenswrapper[4917]: E0318 07:09:08.334049 4917 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.543309 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.608486 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-logs\") pod \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.608688 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-combined-ca-bundle\") pod \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.608737 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-config-data\") pod \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.608820 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpt4w\" (UniqueName: \"kubernetes.io/projected/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-kube-api-access-bpt4w\") pod \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\" (UID: \"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c\") " Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.609069 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-logs" (OuterVolumeSpecName: "logs") pod "620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" (UID: "620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.609358 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.619144 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-kube-api-access-bpt4w" (OuterVolumeSpecName: "kube-api-access-bpt4w") pod "620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" (UID: "620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c"). InnerVolumeSpecName "kube-api-access-bpt4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.638670 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" (UID: "620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.645698 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-config-data" (OuterVolumeSpecName: "config-data") pod "620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" (UID: "620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.711273 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.711307 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.711319 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpt4w\" (UniqueName: \"kubernetes.io/projected/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c-kube-api-access-bpt4w\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.958926 4917 generic.go:334] "Generic (PLEG): container finished" podID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerID="ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc" exitCode=0 Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.958986 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.959036 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c","Type":"ContainerDied","Data":"ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc"} Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.959090 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c","Type":"ContainerDied","Data":"f5e56a42f9af7c3364b4e0973584ccbe5219da3424f600021638dfc3b4b8e6ef"} Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.959108 4917 scope.go:117] "RemoveContainer" containerID="ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc" Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.961262 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerStarted","Data":"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106"} Mar 18 07:09:08 crc kubenswrapper[4917]: I0318 07:09:08.990456 4917 scope.go:117] "RemoveContainer" containerID="e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.014555 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.024059 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.032323 4917 scope.go:117] "RemoveContainer" containerID="ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc" Mar 18 07:09:09 crc kubenswrapper[4917]: E0318 07:09:09.032772 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc\": container with ID starting with ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc not found: ID does not exist" containerID="ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.032868 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc"} err="failed to get container status \"ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc\": rpc error: code = NotFound desc = could not find container \"ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc\": container with ID starting with ddaabd1db31601822e31e3482bac691e4493b3836b347eb1ada26193e71ec8cc not found: ID does not exist" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.032946 4917 scope.go:117] "RemoveContainer" containerID="e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c" Mar 18 07:09:09 crc kubenswrapper[4917]: E0318 07:09:09.033172 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c\": container with ID starting with e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c not found: ID does not exist" containerID="e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.033255 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c"} err="failed to get container status \"e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c\": rpc error: code = NotFound desc = could not find container \"e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c\": container with ID 
starting with e2961ab0b99127ead1277fd29f494b50c2d59807c31ab96568aaf4fc1d71fd1c not found: ID does not exist" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.034678 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 07:09:09 crc kubenswrapper[4917]: E0318 07:09:09.035226 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-log" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.035256 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-log" Mar 18 07:09:09 crc kubenswrapper[4917]: E0318 07:09:09.035272 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-api" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.035281 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-api" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.035521 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-log" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.035560 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" containerName="nova-api-api" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.037093 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.040095 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.040308 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.041918 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.055435 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.117227 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxhtn\" (UniqueName: \"kubernetes.io/projected/594b8523-a82f-4df5-b5c8-f5a213154f7b-kube-api-access-jxhtn\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.117353 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.117468 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-config-data\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.117487 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/594b8523-a82f-4df5-b5c8-f5a213154f7b-logs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.117514 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-public-tls-certs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.117548 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.219023 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxhtn\" (UniqueName: \"kubernetes.io/projected/594b8523-a82f-4df5-b5c8-f5a213154f7b-kube-api-access-jxhtn\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.219286 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.219444 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-config-data\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" 
Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.220080 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594b8523-a82f-4df5-b5c8-f5a213154f7b-logs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.220191 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-public-tls-certs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.220282 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.220809 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594b8523-a82f-4df5-b5c8-f5a213154f7b-logs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.225272 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.225655 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-config-data\") pod \"nova-api-0\" (UID: 
\"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.225806 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-public-tls-certs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.227998 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.240125 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxhtn\" (UniqueName: \"kubernetes.io/projected/594b8523-a82f-4df5-b5c8-f5a213154f7b-kube-api-access-jxhtn\") pod \"nova-api-0\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") " pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.357733 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.567996 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.586293 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.784403 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c" path="/var/lib/kubelet/pods/620f159b-ff7e-4ca9-ae5f-0a44c9cc8d8c/volumes" Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.856621 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:09:09 crc kubenswrapper[4917]: I0318 07:09:09.996919 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerStarted","Data":"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84"} Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.036505 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"594b8523-a82f-4df5-b5c8-f5a213154f7b","Type":"ContainerStarted","Data":"a3bf1c0388e646b91cccbb965059ee3486cc1bdcb64df013b849b013f9676296"} Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.051787 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.282348 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-82pcg"] Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.283664 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.286785 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.286925 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.295637 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-82pcg"] Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.345177 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-config-data\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.345553 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzd9l\" (UniqueName: \"kubernetes.io/projected/a8d30d15-8621-4653-b678-2693b301b35f-kube-api-access-dzd9l\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.345605 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.345638 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-scripts\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.447209 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-config-data\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.447279 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzd9l\" (UniqueName: \"kubernetes.io/projected/a8d30d15-8621-4653-b678-2693b301b35f-kube-api-access-dzd9l\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.447309 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.447352 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-scripts\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.452240 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-config-data\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.452293 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.452420 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-scripts\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.462016 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzd9l\" (UniqueName: \"kubernetes.io/projected/a8d30d15-8621-4653-b678-2693b301b35f-kube-api-access-dzd9l\") pod \"nova-cell1-cell-mapping-82pcg\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") " pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:10 crc kubenswrapper[4917]: I0318 07:09:10.614621 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-82pcg" Mar 18 07:09:11 crc kubenswrapper[4917]: I0318 07:09:11.051520 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"594b8523-a82f-4df5-b5c8-f5a213154f7b","Type":"ContainerStarted","Data":"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343"} Mar 18 07:09:11 crc kubenswrapper[4917]: I0318 07:09:11.051916 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"594b8523-a82f-4df5-b5c8-f5a213154f7b","Type":"ContainerStarted","Data":"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412"} Mar 18 07:09:11 crc kubenswrapper[4917]: I0318 07:09:11.063355 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-82pcg"] Mar 18 07:09:11 crc kubenswrapper[4917]: W0318 07:09:11.068656 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8d30d15_8621_4653_b678_2693b301b35f.slice/crio-16d468caf135aef2a05496a5c5604bf5c11dcfe2444fadec40a9c867c31a3dad WatchSource:0}: Error finding container 16d468caf135aef2a05496a5c5604bf5c11dcfe2444fadec40a9c867c31a3dad: Status 404 returned error can't find the container with id 16d468caf135aef2a05496a5c5604bf5c11dcfe2444fadec40a9c867c31a3dad Mar 18 07:09:11 crc kubenswrapper[4917]: I0318 07:09:11.082215 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.082196932 podStartE2EDuration="3.082196932s" podCreationTimestamp="2026-03-18 07:09:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:09:11.073216686 +0000 UTC m=+1336.014371410" watchObservedRunningTime="2026-03-18 07:09:11.082196932 +0000 UTC m=+1336.023351646" Mar 18 07:09:12 crc kubenswrapper[4917]: I0318 07:09:12.062259 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-82pcg" event={"ID":"a8d30d15-8621-4653-b678-2693b301b35f","Type":"ContainerStarted","Data":"ea146ca86a61e4a78f1a0bb0d88a01ef5621fff4b0ee2505b6b7913116f8dc17"} Mar 18 07:09:12 crc kubenswrapper[4917]: I0318 07:09:12.062677 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-82pcg" event={"ID":"a8d30d15-8621-4653-b678-2693b301b35f","Type":"ContainerStarted","Data":"16d468caf135aef2a05496a5c5604bf5c11dcfe2444fadec40a9c867c31a3dad"} Mar 18 07:09:12 crc kubenswrapper[4917]: I0318 07:09:12.095806 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-82pcg" podStartSLOduration=2.095791568 podStartE2EDuration="2.095791568s" podCreationTimestamp="2026-03-18 07:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:09:12.090953992 +0000 UTC m=+1337.032108706" watchObservedRunningTime="2026-03-18 07:09:12.095791568 +0000 UTC m=+1337.036946282" Mar 18 07:09:12 crc kubenswrapper[4917]: I0318 07:09:12.437482 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:09:12 crc kubenswrapper[4917]: I0318 07:09:12.529425 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-thr9v"] Mar 18 07:09:12 crc kubenswrapper[4917]: I0318 07:09:12.530024 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" podUID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" containerName="dnsmasq-dns" containerID="cri-o://3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3" gracePeriod=10 Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.043967 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.071842 4917 generic.go:334] "Generic (PLEG): container finished" podID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" containerID="3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3" exitCode=0 Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.071903 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.071893 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" event={"ID":"4fa0ca14-1497-499a-8c7d-5dd248b638a0","Type":"ContainerDied","Data":"3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3"} Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.071968 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb6d55fc-thr9v" event={"ID":"4fa0ca14-1497-499a-8c7d-5dd248b638a0","Type":"ContainerDied","Data":"d537e77dd7f45e6d3ff37a870195702a86407bfe137358daee71166a94289af6"} Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.071989 4917 scope.go:117] "RemoveContainer" containerID="3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.098933 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-config\") pod \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.099000 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-sb\") pod \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\" (UID: 
\"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.099051 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-nb\") pod \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.099119 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-svc\") pod \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.099180 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98djh\" (UniqueName: \"kubernetes.io/projected/4fa0ca14-1497-499a-8c7d-5dd248b638a0-kube-api-access-98djh\") pod \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.099201 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-swift-storage-0\") pod \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\" (UID: \"4fa0ca14-1497-499a-8c7d-5dd248b638a0\") " Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.113960 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fa0ca14-1497-499a-8c7d-5dd248b638a0-kube-api-access-98djh" (OuterVolumeSpecName: "kube-api-access-98djh") pod "4fa0ca14-1497-499a-8c7d-5dd248b638a0" (UID: "4fa0ca14-1497-499a-8c7d-5dd248b638a0"). InnerVolumeSpecName "kube-api-access-98djh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.123813 4917 scope.go:117] "RemoveContainer" containerID="060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.176402 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4fa0ca14-1497-499a-8c7d-5dd248b638a0" (UID: "4fa0ca14-1497-499a-8c7d-5dd248b638a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.190777 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4fa0ca14-1497-499a-8c7d-5dd248b638a0" (UID: "4fa0ca14-1497-499a-8c7d-5dd248b638a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.194361 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4fa0ca14-1497-499a-8c7d-5dd248b638a0" (UID: "4fa0ca14-1497-499a-8c7d-5dd248b638a0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.198178 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-config" (OuterVolumeSpecName: "config") pod "4fa0ca14-1497-499a-8c7d-5dd248b638a0" (UID: "4fa0ca14-1497-499a-8c7d-5dd248b638a0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.201575 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.201663 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.201677 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.201690 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98djh\" (UniqueName: \"kubernetes.io/projected/4fa0ca14-1497-499a-8c7d-5dd248b638a0-kube-api-access-98djh\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.201703 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.219285 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4fa0ca14-1497-499a-8c7d-5dd248b638a0" (UID: "4fa0ca14-1497-499a-8c7d-5dd248b638a0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.250259 4917 scope.go:117] "RemoveContainer" containerID="3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3" Mar 18 07:09:13 crc kubenswrapper[4917]: E0318 07:09:13.251169 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3\": container with ID starting with 3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3 not found: ID does not exist" containerID="3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.251206 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3"} err="failed to get container status \"3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3\": rpc error: code = NotFound desc = could not find container \"3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3\": container with ID starting with 3a4eb7484ac2273c06183efd2fd2080018cfe2032d2b6fde5ad984632a311ae3 not found: ID does not exist" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.251232 4917 scope.go:117] "RemoveContainer" containerID="060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69" Mar 18 07:09:13 crc kubenswrapper[4917]: E0318 07:09:13.251543 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69\": container with ID starting with 060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69 not found: ID does not exist" containerID="060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.251594 
4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69"} err="failed to get container status \"060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69\": rpc error: code = NotFound desc = could not find container \"060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69\": container with ID starting with 060010d98e2feb2ff3ad0a368ca86bda65f8d6d2028d2cf4ef79403a4da66a69 not found: ID does not exist" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.303551 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4fa0ca14-1497-499a-8c7d-5dd248b638a0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.410750 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-thr9v"] Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.419927 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb6d55fc-thr9v"] Mar 18 07:09:13 crc kubenswrapper[4917]: I0318 07:09:13.783032 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" path="/var/lib/kubelet/pods/4fa0ca14-1497-499a-8c7d-5dd248b638a0/volumes" Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.081362 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerStarted","Data":"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58"} Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.081896 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="ceilometer-central-agent" 
containerID="cri-o://6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a" gracePeriod=30 Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.082127 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.082561 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="proxy-httpd" containerID="cri-o://2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58" gracePeriod=30 Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.082644 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="sg-core" containerID="cri-o://6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84" gracePeriod=30 Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.082690 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="ceilometer-notification-agent" containerID="cri-o://9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106" gracePeriod=30 Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.108731 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.266146267 podStartE2EDuration="9.108712159s" podCreationTimestamp="2026-03-18 07:09:05 +0000 UTC" firstStartedPulling="2026-03-18 07:09:06.76690062 +0000 UTC m=+1331.708055334" lastFinishedPulling="2026-03-18 07:09:13.609466512 +0000 UTC m=+1338.550621226" observedRunningTime="2026-03-18 07:09:14.101268 +0000 UTC m=+1339.042422724" watchObservedRunningTime="2026-03-18 07:09:14.108712159 +0000 UTC m=+1339.049866863" Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.910813 4917 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.935049 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-run-httpd\") pod \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.935148 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-combined-ca-bundle\") pod \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.935249 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-sg-core-conf-yaml\") pod \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.935329 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-config-data\") pod \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.935394 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-scripts\") pod \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.935522 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vt7qm\" (UniqueName: \"kubernetes.io/projected/dec8f661-50c4-4051-9c1c-cd3c19a63c91-kube-api-access-vt7qm\") pod \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.935658 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-log-httpd\") pod \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\" (UID: \"dec8f661-50c4-4051-9c1c-cd3c19a63c91\") " Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.935388 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dec8f661-50c4-4051-9c1c-cd3c19a63c91" (UID: "dec8f661-50c4-4051-9c1c-cd3c19a63c91"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:09:14 crc kubenswrapper[4917]: I0318 07:09:14.936295 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:14.936705 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dec8f661-50c4-4051-9c1c-cd3c19a63c91" (UID: "dec8f661-50c4-4051-9c1c-cd3c19a63c91"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:14.942279 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec8f661-50c4-4051-9c1c-cd3c19a63c91-kube-api-access-vt7qm" (OuterVolumeSpecName: "kube-api-access-vt7qm") pod "dec8f661-50c4-4051-9c1c-cd3c19a63c91" (UID: "dec8f661-50c4-4051-9c1c-cd3c19a63c91"). InnerVolumeSpecName "kube-api-access-vt7qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:14.960386 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-scripts" (OuterVolumeSpecName: "scripts") pod "dec8f661-50c4-4051-9c1c-cd3c19a63c91" (UID: "dec8f661-50c4-4051-9c1c-cd3c19a63c91"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.000488 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dec8f661-50c4-4051-9c1c-cd3c19a63c91" (UID: "dec8f661-50c4-4051-9c1c-cd3c19a63c91"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.038011 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.038047 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.038065 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt7qm\" (UniqueName: \"kubernetes.io/projected/dec8f661-50c4-4051-9c1c-cd3c19a63c91-kube-api-access-vt7qm\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.038082 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dec8f661-50c4-4051-9c1c-cd3c19a63c91-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.041242 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dec8f661-50c4-4051-9c1c-cd3c19a63c91" (UID: "dec8f661-50c4-4051-9c1c-cd3c19a63c91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.076375 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-config-data" (OuterVolumeSpecName: "config-data") pod "dec8f661-50c4-4051-9c1c-cd3c19a63c91" (UID: "dec8f661-50c4-4051-9c1c-cd3c19a63c91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.099934 4917 generic.go:334] "Generic (PLEG): container finished" podID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerID="2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58" exitCode=0 Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.099995 4917 generic.go:334] "Generic (PLEG): container finished" podID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerID="6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84" exitCode=2 Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100005 4917 generic.go:334] "Generic (PLEG): container finished" podID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerID="9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106" exitCode=0 Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100013 4917 generic.go:334] "Generic (PLEG): container finished" podID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerID="6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a" exitCode=0 Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100048 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100060 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerDied","Data":"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58"} Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100090 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerDied","Data":"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84"} Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100103 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerDied","Data":"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106"} Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100144 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerDied","Data":"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a"} Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100156 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dec8f661-50c4-4051-9c1c-cd3c19a63c91","Type":"ContainerDied","Data":"dd1b2ac8bab41047cb4941efc91653b57d455fd40f908db6302dc22821d44eb1"} Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.100174 4917 scope.go:117] "RemoveContainer" containerID="2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.121198 4917 scope.go:117] "RemoveContainer" containerID="6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.139805 4917 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.139838 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec8f661-50c4-4051-9c1c-cd3c19a63c91-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.148612 4917 scope.go:117] "RemoveContainer" containerID="9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.164776 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.180615 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196057 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196171 4917 scope.go:117] "RemoveContainer" containerID="6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.196485 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="ceilometer-notification-agent" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196502 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="ceilometer-notification-agent" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.196520 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" containerName="dnsmasq-dns" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196528 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" containerName="dnsmasq-dns" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.196560 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="ceilometer-central-agent" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196568 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="ceilometer-central-agent" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.196602 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" containerName="init" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196610 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" containerName="init" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.196630 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="sg-core" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196640 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="sg-core" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.196654 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="proxy-httpd" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196659 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="proxy-httpd" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196922 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="proxy-httpd" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196936 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="sg-core" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196948 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fa0ca14-1497-499a-8c7d-5dd248b638a0" containerName="dnsmasq-dns" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196963 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="ceilometer-notification-agent" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.196972 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" containerName="ceilometer-central-agent" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.199612 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.202817 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.203542 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.206696 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.240822 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-scripts\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.240879 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvg2h\" (UniqueName: \"kubernetes.io/projected/06f1a177-9873-437e-9571-0e4731a15ee5-kube-api-access-bvg2h\") pod 
\"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.240935 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.240980 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-run-httpd\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.241020 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-config-data\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.241038 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.241062 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-log-httpd\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: 
I0318 07:09:15.241253 4917 scope.go:117] "RemoveContainer" containerID="2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.242271 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": container with ID starting with 2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58 not found: ID does not exist" containerID="2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.242310 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58"} err="failed to get container status \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": rpc error: code = NotFound desc = could not find container \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": container with ID starting with 2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.242336 4917 scope.go:117] "RemoveContainer" containerID="6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.242576 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": container with ID starting with 6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84 not found: ID does not exist" containerID="6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.242617 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84"} err="failed to get container status \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": rpc error: code = NotFound desc = could not find container \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": container with ID starting with 6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.242638 4917 scope.go:117] "RemoveContainer" containerID="9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.242826 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": container with ID starting with 9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106 not found: ID does not exist" containerID="9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.242842 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106"} err="failed to get container status \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": rpc error: code = NotFound desc = could not find container \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": container with ID starting with 9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.242855 4917 scope.go:117] "RemoveContainer" containerID="6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a" Mar 18 07:09:15 crc kubenswrapper[4917]: E0318 07:09:15.243138 4917 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": container with ID starting with 6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a not found: ID does not exist" containerID="6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.243155 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a"} err="failed to get container status \"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": rpc error: code = NotFound desc = could not find container \"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": container with ID starting with 6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.243194 4917 scope.go:117] "RemoveContainer" containerID="2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.243403 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58"} err="failed to get container status \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": rpc error: code = NotFound desc = could not find container \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": container with ID starting with 2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.243439 4917 scope.go:117] "RemoveContainer" containerID="6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.243620 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84"} err="failed to get container status \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": rpc error: code = NotFound desc = could not find container \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": container with ID starting with 6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.243647 4917 scope.go:117] "RemoveContainer" containerID="9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.243856 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106"} err="failed to get container status \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": rpc error: code = NotFound desc = could not find container \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": container with ID starting with 9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.243872 4917 scope.go:117] "RemoveContainer" containerID="6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244057 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a"} err="failed to get container status \"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": rpc error: code = NotFound desc = could not find container \"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": container with ID starting with 
6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244072 4917 scope.go:117] "RemoveContainer" containerID="2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244219 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58"} err="failed to get container status \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": rpc error: code = NotFound desc = could not find container \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": container with ID starting with 2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244233 4917 scope.go:117] "RemoveContainer" containerID="6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244401 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84"} err="failed to get container status \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": rpc error: code = NotFound desc = could not find container \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": container with ID starting with 6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244415 4917 scope.go:117] "RemoveContainer" containerID="9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244619 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106"} err="failed to get container status \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": rpc error: code = NotFound desc = could not find container \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": container with ID starting with 9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244634 4917 scope.go:117] "RemoveContainer" containerID="6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244883 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a"} err="failed to get container status \"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": rpc error: code = NotFound desc = could not find container \"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": container with ID starting with 6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.244896 4917 scope.go:117] "RemoveContainer" containerID="2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.245078 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58"} err="failed to get container status \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": rpc error: code = NotFound desc = could not find container \"2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58\": container with ID starting with 2bc9667442c2b898afa2f27be233c2e498e988645dd28e2699ffcc60c0e4aa58 not found: ID does not 
exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.245093 4917 scope.go:117] "RemoveContainer" containerID="6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.245264 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84"} err="failed to get container status \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": rpc error: code = NotFound desc = could not find container \"6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84\": container with ID starting with 6eabb757eb0d928c22ed6cff25d37b138e3d43b4db9be7503c3e1330523e1d84 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.245277 4917 scope.go:117] "RemoveContainer" containerID="9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.245472 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106"} err="failed to get container status \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": rpc error: code = NotFound desc = could not find container \"9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106\": container with ID starting with 9cd62331180da2de7d08ea82a6c83201c66f1bce6308d185b8a925b10efe7106 not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.245484 4917 scope.go:117] "RemoveContainer" containerID="6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.245660 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a"} err="failed to get container status 
\"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": rpc error: code = NotFound desc = could not find container \"6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a\": container with ID starting with 6fa4196a2a9851f6104ae4d9da2076cae655a07896f5296a651574672bffe65a not found: ID does not exist" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.346871 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.348086 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-run-httpd\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.348225 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-config-data\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.348260 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.348321 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-log-httpd\") pod 
\"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.348434 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-scripts\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.348514 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvg2h\" (UniqueName: \"kubernetes.io/projected/06f1a177-9873-437e-9571-0e4731a15ee5-kube-api-access-bvg2h\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.349105 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-run-httpd\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.349697 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-log-httpd\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.351794 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.352358 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.352936 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-config-data\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.353066 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-scripts\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.365981 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvg2h\" (UniqueName: \"kubernetes.io/projected/06f1a177-9873-437e-9571-0e4731a15ee5-kube-api-access-bvg2h\") pod \"ceilometer-0\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.533552 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:15 crc kubenswrapper[4917]: I0318 07:09:15.783605 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec8f661-50c4-4051-9c1c-cd3c19a63c91" path="/var/lib/kubelet/pods/dec8f661-50c4-4051-9c1c-cd3c19a63c91/volumes" Mar 18 07:09:16 crc kubenswrapper[4917]: I0318 07:09:16.008098 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:16 crc kubenswrapper[4917]: I0318 07:09:16.111059 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerStarted","Data":"44ba4613efad0732f7a3530c6ef096602bfaa8a21a10dadd596015375cbbc767"} Mar 18 07:09:16 crc kubenswrapper[4917]: I0318 07:09:16.112369 4917 generic.go:334] "Generic (PLEG): container finished" podID="a8d30d15-8621-4653-b678-2693b301b35f" containerID="ea146ca86a61e4a78f1a0bb0d88a01ef5621fff4b0ee2505b6b7913116f8dc17" exitCode=0 Mar 18 07:09:16 crc kubenswrapper[4917]: I0318 07:09:16.112434 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-82pcg" event={"ID":"a8d30d15-8621-4653-b678-2693b301b35f","Type":"ContainerDied","Data":"ea146ca86a61e4a78f1a0bb0d88a01ef5621fff4b0ee2505b6b7913116f8dc17"} Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.132335 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerStarted","Data":"83146bf66612f0506b464c43f6ab639d2980a1b82b64ea0903f2ff02f0f017f4"} Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.557891 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-82pcg"
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.608801 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzd9l\" (UniqueName: \"kubernetes.io/projected/a8d30d15-8621-4653-b678-2693b301b35f-kube-api-access-dzd9l\") pod \"a8d30d15-8621-4653-b678-2693b301b35f\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") "
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.608882 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-combined-ca-bundle\") pod \"a8d30d15-8621-4653-b678-2693b301b35f\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") "
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.608932 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-config-data\") pod \"a8d30d15-8621-4653-b678-2693b301b35f\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") "
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.609015 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-scripts\") pod \"a8d30d15-8621-4653-b678-2693b301b35f\" (UID: \"a8d30d15-8621-4653-b678-2693b301b35f\") "
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.613410 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d30d15-8621-4653-b678-2693b301b35f-kube-api-access-dzd9l" (OuterVolumeSpecName: "kube-api-access-dzd9l") pod "a8d30d15-8621-4653-b678-2693b301b35f" (UID: "a8d30d15-8621-4653-b678-2693b301b35f"). InnerVolumeSpecName "kube-api-access-dzd9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.617143 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-scripts" (OuterVolumeSpecName: "scripts") pod "a8d30d15-8621-4653-b678-2693b301b35f" (UID: "a8d30d15-8621-4653-b678-2693b301b35f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.635559 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a8d30d15-8621-4653-b678-2693b301b35f" (UID: "a8d30d15-8621-4653-b678-2693b301b35f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.646574 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-config-data" (OuterVolumeSpecName: "config-data") pod "a8d30d15-8621-4653-b678-2693b301b35f" (UID: "a8d30d15-8621-4653-b678-2693b301b35f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.710996 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.711226 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzd9l\" (UniqueName: \"kubernetes.io/projected/a8d30d15-8621-4653-b678-2693b301b35f-kube-api-access-dzd9l\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.711236 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:17 crc kubenswrapper[4917]: I0318 07:09:17.711245 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d30d15-8621-4653-b678-2693b301b35f-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.143135 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerStarted","Data":"6d42e0bf6a4bfb6e17bd2132ad5ec5bf615baf7df9835a838be4f53930f8691c"}
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.146345 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-82pcg" event={"ID":"a8d30d15-8621-4653-b678-2693b301b35f","Type":"ContainerDied","Data":"16d468caf135aef2a05496a5c5604bf5c11dcfe2444fadec40a9c867c31a3dad"}
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.146390 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16d468caf135aef2a05496a5c5604bf5c11dcfe2444fadec40a9c867c31a3dad"
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.146403 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-82pcg"
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.319797 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.320469 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerName="nova-api-api" containerID="cri-o://61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343" gracePeriod=30
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.320644 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerName="nova-api-log" containerID="cri-o://3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412" gracePeriod=30
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.341169 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.341362 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="32929e66-f413-418e-a69a-ff9fc6ce3ea9" containerName="nova-scheduler-scheduler" containerID="cri-o://137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154" gracePeriod=30
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.369529 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.369802 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-log" containerID="cri-o://61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da" gracePeriod=30
Mar 18 07:09:18 crc kubenswrapper[4917]: I0318 07:09:18.369879 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-metadata" containerID="cri-o://599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2" gracePeriod=30
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.109967 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.157161 4917 generic.go:334] "Generic (PLEG): container finished" podID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerID="61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da" exitCode=143
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.157220 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb","Type":"ContainerDied","Data":"61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da"}
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.158994 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerStarted","Data":"5dfa69dc464494691f8d7d8acd362ecd6b1d01ec38eaf690fc56d60692ffe0a2"}
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.161242 4917 generic.go:334] "Generic (PLEG): container finished" podID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerID="61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343" exitCode=0
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.161388 4917 generic.go:334] "Generic (PLEG): container finished" podID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerID="3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412" exitCode=143
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.161340 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"594b8523-a82f-4df5-b5c8-f5a213154f7b","Type":"ContainerDied","Data":"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343"}
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.161338 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.161684 4917 scope.go:117] "RemoveContainer" containerID="61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.161691 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"594b8523-a82f-4df5-b5c8-f5a213154f7b","Type":"ContainerDied","Data":"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412"}
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.161837 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"594b8523-a82f-4df5-b5c8-f5a213154f7b","Type":"ContainerDied","Data":"a3bf1c0388e646b91cccbb965059ee3486cc1bdcb64df013b849b013f9676296"}
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.195812 4917 scope.go:117] "RemoveContainer" containerID="3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.238291 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594b8523-a82f-4df5-b5c8-f5a213154f7b-logs\") pod \"594b8523-a82f-4df5-b5c8-f5a213154f7b\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") "
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.238344 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-public-tls-certs\") pod \"594b8523-a82f-4df5-b5c8-f5a213154f7b\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") "
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.238451 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxhtn\" (UniqueName: \"kubernetes.io/projected/594b8523-a82f-4df5-b5c8-f5a213154f7b-kube-api-access-jxhtn\") pod \"594b8523-a82f-4df5-b5c8-f5a213154f7b\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") "
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.238500 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-internal-tls-certs\") pod \"594b8523-a82f-4df5-b5c8-f5a213154f7b\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") "
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.238527 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-config-data\") pod \"594b8523-a82f-4df5-b5c8-f5a213154f7b\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") "
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.238605 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-combined-ca-bundle\") pod \"594b8523-a82f-4df5-b5c8-f5a213154f7b\" (UID: \"594b8523-a82f-4df5-b5c8-f5a213154f7b\") "
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.238803 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/594b8523-a82f-4df5-b5c8-f5a213154f7b-logs" (OuterVolumeSpecName: "logs") pod "594b8523-a82f-4df5-b5c8-f5a213154f7b" (UID: "594b8523-a82f-4df5-b5c8-f5a213154f7b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.239216 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/594b8523-a82f-4df5-b5c8-f5a213154f7b-logs\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.248771 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/594b8523-a82f-4df5-b5c8-f5a213154f7b-kube-api-access-jxhtn" (OuterVolumeSpecName: "kube-api-access-jxhtn") pod "594b8523-a82f-4df5-b5c8-f5a213154f7b" (UID: "594b8523-a82f-4df5-b5c8-f5a213154f7b"). InnerVolumeSpecName "kube-api-access-jxhtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.262626 4917 scope.go:117] "RemoveContainer" containerID="61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343"
Mar 18 07:09:19 crc kubenswrapper[4917]: E0318 07:09:19.264767 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343\": container with ID starting with 61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343 not found: ID does not exist" containerID="61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.264810 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343"} err="failed to get container status \"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343\": rpc error: code = NotFound desc = could not find container \"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343\": container with ID starting with 61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343 not found: ID does not exist"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.264855 4917 scope.go:117] "RemoveContainer" containerID="3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412"
Mar 18 07:09:19 crc kubenswrapper[4917]: E0318 07:09:19.265293 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412\": container with ID starting with 3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412 not found: ID does not exist" containerID="3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.265315 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412"} err="failed to get container status \"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412\": rpc error: code = NotFound desc = could not find container \"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412\": container with ID starting with 3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412 not found: ID does not exist"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.265329 4917 scope.go:117] "RemoveContainer" containerID="61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.265698 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343"} err="failed to get container status \"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343\": rpc error: code = NotFound desc = could not find container \"61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343\": container with ID starting with 61f674c8446e6c57f2407e2ed8c60bf1b7d4a7a2ab279feaeb7f148380a53343 not found: ID does not exist"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.265713 4917 scope.go:117] "RemoveContainer" containerID="3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.266554 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412"} err="failed to get container status \"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412\": rpc error: code = NotFound desc = could not find container \"3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412\": container with ID starting with 3d7c94f13794780ee0154b1a05975b220c7f88a3ad35b16caff1e1ed7272e412 not found: ID does not exist"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.274076 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "594b8523-a82f-4df5-b5c8-f5a213154f7b" (UID: "594b8523-a82f-4df5-b5c8-f5a213154f7b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.277142 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-config-data" (OuterVolumeSpecName: "config-data") pod "594b8523-a82f-4df5-b5c8-f5a213154f7b" (UID: "594b8523-a82f-4df5-b5c8-f5a213154f7b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.311767 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "594b8523-a82f-4df5-b5c8-f5a213154f7b" (UID: "594b8523-a82f-4df5-b5c8-f5a213154f7b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.318565 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "594b8523-a82f-4df5-b5c8-f5a213154f7b" (UID: "594b8523-a82f-4df5-b5c8-f5a213154f7b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.340930 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.340972 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxhtn\" (UniqueName: \"kubernetes.io/projected/594b8523-a82f-4df5-b5c8-f5a213154f7b-kube-api-access-jxhtn\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.340983 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.340996 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.341004 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594b8523-a82f-4df5-b5c8-f5a213154f7b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.518857 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.545550 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.555809 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 18 07:09:19 crc kubenswrapper[4917]: E0318 07:09:19.556574 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerName="nova-api-api"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.556606 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerName="nova-api-api"
Mar 18 07:09:19 crc kubenswrapper[4917]: E0318 07:09:19.556661 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerName="nova-api-log"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.556668 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerName="nova-api-log"
Mar 18 07:09:19 crc kubenswrapper[4917]: E0318 07:09:19.556688 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d30d15-8621-4653-b678-2693b301b35f" containerName="nova-manage"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.556695 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d30d15-8621-4653-b678-2693b301b35f" containerName="nova-manage"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.557010 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d30d15-8621-4653-b678-2693b301b35f" containerName="nova-manage"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.557041 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerName="nova-api-api"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.557056 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" containerName="nova-api-log"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.560804 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.573305 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.574221 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.574399 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.585665 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.648735 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.648823 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-config-data\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.648871 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.648884 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-public-tls-certs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.648927 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-logs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.648945 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2wt6\" (UniqueName: \"kubernetes.io/projected/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-kube-api-access-c2wt6\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.750737 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-logs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.750785 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2wt6\" (UniqueName: \"kubernetes.io/projected/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-kube-api-access-c2wt6\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.750833 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.750908 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-config-data\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.750957 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.750971 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-public-tls-certs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.751216 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-logs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.755597 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-config-data\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.756277 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.757105 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.757843 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-public-tls-certs\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.768566 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2wt6\" (UniqueName: \"kubernetes.io/projected/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-kube-api-access-c2wt6\") pod \"nova-api-0\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " pod="openstack/nova-api-0"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.791488 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="594b8523-a82f-4df5-b5c8-f5a213154f7b" path="/var/lib/kubelet/pods/594b8523-a82f-4df5-b5c8-f5a213154f7b/volumes"
Mar 18 07:09:19 crc kubenswrapper[4917]: I0318 07:09:19.889423 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.043268 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.158431 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-config-data\") pod \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") "
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.158662 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-combined-ca-bundle\") pod \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") "
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.158703 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjvrz\" (UniqueName: \"kubernetes.io/projected/32929e66-f413-418e-a69a-ff9fc6ce3ea9-kube-api-access-rjvrz\") pod \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\" (UID: \"32929e66-f413-418e-a69a-ff9fc6ce3ea9\") "
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.166805 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32929e66-f413-418e-a69a-ff9fc6ce3ea9-kube-api-access-rjvrz" (OuterVolumeSpecName: "kube-api-access-rjvrz") pod "32929e66-f413-418e-a69a-ff9fc6ce3ea9" (UID: "32929e66-f413-418e-a69a-ff9fc6ce3ea9"). InnerVolumeSpecName "kube-api-access-rjvrz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.182246 4917 generic.go:334] "Generic (PLEG): container finished" podID="32929e66-f413-418e-a69a-ff9fc6ce3ea9" containerID="137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154" exitCode=0
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.182310 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32929e66-f413-418e-a69a-ff9fc6ce3ea9","Type":"ContainerDied","Data":"137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154"}
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.182338 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32929e66-f413-418e-a69a-ff9fc6ce3ea9","Type":"ContainerDied","Data":"247eb60b1d2d670e38d57a7d522f9456358ea803f327902dfe143e513cd3c00b"}
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.182357 4917 scope.go:117] "RemoveContainer" containerID="137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.182479 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.206630 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32929e66-f413-418e-a69a-ff9fc6ce3ea9" (UID: "32929e66-f413-418e-a69a-ff9fc6ce3ea9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.214386 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-config-data" (OuterVolumeSpecName: "config-data") pod "32929e66-f413-418e-a69a-ff9fc6ce3ea9" (UID: "32929e66-f413-418e-a69a-ff9fc6ce3ea9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.260480 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.260520 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjvrz\" (UniqueName: \"kubernetes.io/projected/32929e66-f413-418e-a69a-ff9fc6ce3ea9-kube-api-access-rjvrz\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.260535 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32929e66-f413-418e-a69a-ff9fc6ce3ea9-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.286866 4917 scope.go:117] "RemoveContainer" containerID="137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154"
Mar 18 07:09:20 crc kubenswrapper[4917]: E0318 07:09:20.287247 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154\": container with ID starting with 137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154 not found: ID does not exist" containerID="137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.287273 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154"} err="failed to get container status \"137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154\": rpc error: code = NotFound desc = could not find container \"137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154\": container with ID starting with 137201873f218e24b2f7193d2f92c47458c20ae6d7e327f252008b4cf0f23154 not found: ID does not exist"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.393168 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 18 07:09:20 crc kubenswrapper[4917]: W0318 07:09:20.395974 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52bd1e19_7e65_4ea9_94bd_ad7edee4be11.slice/crio-7d1947684feeb1d3d961873e89111d53eee000dbdde84ca653c38b0793e88787 WatchSource:0}: Error finding container 7d1947684feeb1d3d961873e89111d53eee000dbdde84ca653c38b0793e88787: Status 404 returned error can't find the container with id 7d1947684feeb1d3d961873e89111d53eee000dbdde84ca653c38b0793e88787
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.544614 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.558023 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.569503 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 07:09:20 crc kubenswrapper[4917]: E0318 07:09:20.570095 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32929e66-f413-418e-a69a-ff9fc6ce3ea9" containerName="nova-scheduler-scheduler"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.570121 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="32929e66-f413-418e-a69a-ff9fc6ce3ea9" containerName="nova-scheduler-scheduler"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.570359 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="32929e66-f413-418e-a69a-ff9fc6ce3ea9" containerName="nova-scheduler-scheduler"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.571118 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.573072 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.581516 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.667480 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.667672 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-config-data\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0"
Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.667722 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr5mq\" (UniqueName: \"kubernetes.io/projected/a62df143-348a-4ec7-b331-04db3857e847-kube-api-access-xr5mq\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0"
Mar 18 07:09:20 crc
kubenswrapper[4917]: I0318 07:09:20.769134 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr5mq\" (UniqueName: \"kubernetes.io/projected/a62df143-348a-4ec7-b331-04db3857e847-kube-api-access-xr5mq\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0" Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.769205 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0" Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.769311 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-config-data\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0" Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.774648 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-config-data\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0" Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.775091 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0" Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.787091 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr5mq\" (UniqueName: 
\"kubernetes.io/projected/a62df143-348a-4ec7-b331-04db3857e847-kube-api-access-xr5mq\") pod \"nova-scheduler-0\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") " pod="openstack/nova-scheduler-0" Mar 18 07:09:20 crc kubenswrapper[4917]: I0318 07:09:20.887774 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.203005 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52bd1e19-7e65-4ea9-94bd-ad7edee4be11","Type":"ContainerStarted","Data":"528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2"} Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.203309 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52bd1e19-7e65-4ea9-94bd-ad7edee4be11","Type":"ContainerStarted","Data":"34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904"} Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.203321 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52bd1e19-7e65-4ea9-94bd-ad7edee4be11","Type":"ContainerStarted","Data":"7d1947684feeb1d3d961873e89111d53eee000dbdde84ca653c38b0793e88787"} Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.205244 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerStarted","Data":"f7912fdb78f56776725e59f0bed55a0e943287457315979d9dabb6fe22ef4567"} Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.205796 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.233519 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.233499858 podStartE2EDuration="2.233499858s" podCreationTimestamp="2026-03-18 07:09:19 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:09:21.22442931 +0000 UTC m=+1346.165584044" watchObservedRunningTime="2026-03-18 07:09:21.233499858 +0000 UTC m=+1346.174654572" Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.246694 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.167197805 podStartE2EDuration="6.246677284s" podCreationTimestamp="2026-03-18 07:09:15 +0000 UTC" firstStartedPulling="2026-03-18 07:09:16.064053718 +0000 UTC m=+1341.005208432" lastFinishedPulling="2026-03-18 07:09:20.143533197 +0000 UTC m=+1345.084687911" observedRunningTime="2026-03-18 07:09:21.244896312 +0000 UTC m=+1346.186051046" watchObservedRunningTime="2026-03-18 07:09:21.246677284 +0000 UTC m=+1346.187831998" Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.381095 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:09:21 crc kubenswrapper[4917]: W0318 07:09:21.390699 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda62df143_348a_4ec7_b331_04db3857e847.slice/crio-7371c9479f2ed60de873f794bb37ce7e955d10f29c11a42f847ae5ea1c5ea9c5 WatchSource:0}: Error finding container 7371c9479f2ed60de873f794bb37ce7e955d10f29c11a42f847ae5ea1c5ea9c5: Status 404 returned error can't find the container with id 7371c9479f2ed60de873f794bb37ce7e955d10f29c11a42f847ae5ea1c5ea9c5 Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.784261 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32929e66-f413-418e-a69a-ff9fc6ce3ea9" path="/var/lib/kubelet/pods/32929e66-f413-418e-a69a-ff9fc6ce3ea9/volumes" Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.900515 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.991574 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-logs\") pod \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.991697 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-nova-metadata-tls-certs\") pod \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.991738 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsfgz\" (UniqueName: \"kubernetes.io/projected/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-kube-api-access-rsfgz\") pod \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.991807 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-combined-ca-bundle\") pod \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.991998 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-config-data\") pod \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\" (UID: \"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb\") " Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.994298 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-logs" (OuterVolumeSpecName: "logs") pod "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" (UID: "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:09:21 crc kubenswrapper[4917]: I0318 07:09:21.998402 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-kube-api-access-rsfgz" (OuterVolumeSpecName: "kube-api-access-rsfgz") pod "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" (UID: "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb"). InnerVolumeSpecName "kube-api-access-rsfgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.026186 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-config-data" (OuterVolumeSpecName: "config-data") pod "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" (UID: "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.026785 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" (UID: "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.049503 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" (UID: "bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.096354 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.096393 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.096405 4917 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.096420 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsfgz\" (UniqueName: \"kubernetes.io/projected/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-kube-api-access-rsfgz\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.096432 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.215884 4917 generic.go:334] "Generic (PLEG): container finished" podID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerID="599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2" exitCode=0 Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.215965 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb","Type":"ContainerDied","Data":"599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2"} 
Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.215992 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb","Type":"ContainerDied","Data":"a6fe1cf3727db5260225c36cd064ad240e1ce963bcbc50062b8f3057a1c47dad"} Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.216010 4917 scope.go:117] "RemoveContainer" containerID="599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.216033 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.218881 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a62df143-348a-4ec7-b331-04db3857e847","Type":"ContainerStarted","Data":"3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567"} Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.218910 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a62df143-348a-4ec7-b331-04db3857e847","Type":"ContainerStarted","Data":"7371c9479f2ed60de873f794bb37ce7e955d10f29c11a42f847ae5ea1c5ea9c5"} Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.266318 4917 scope.go:117] "RemoveContainer" containerID="61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.272541 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.272523145 podStartE2EDuration="2.272523145s" podCreationTimestamp="2026-03-18 07:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:09:22.250096727 +0000 UTC m=+1347.191251471" watchObservedRunningTime="2026-03-18 07:09:22.272523145 +0000 UTC m=+1347.213677859" 
Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.303373 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.305647 4917 scope.go:117] "RemoveContainer" containerID="599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2" Mar 18 07:09:22 crc kubenswrapper[4917]: E0318 07:09:22.306187 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2\": container with ID starting with 599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2 not found: ID does not exist" containerID="599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.306227 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2"} err="failed to get container status \"599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2\": rpc error: code = NotFound desc = could not find container \"599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2\": container with ID starting with 599643e12e2c771e72a24f54c87225268820a5dbd7ada8ab7d2ee17ed3f3b8f2 not found: ID does not exist" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.306255 4917 scope.go:117] "RemoveContainer" containerID="61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da" Mar 18 07:09:22 crc kubenswrapper[4917]: E0318 07:09:22.307726 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da\": container with ID starting with 61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da not found: ID does not exist" 
containerID="61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.308661 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da"} err="failed to get container status \"61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da\": rpc error: code = NotFound desc = could not find container \"61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da\": container with ID starting with 61c3940b7812962bc7eb376ddfd8c3d64aa02829847531e250e56f4aad0022da not found: ID does not exist" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.315418 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.323438 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:09:22 crc kubenswrapper[4917]: E0318 07:09:22.323917 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-metadata" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.323930 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-metadata" Mar 18 07:09:22 crc kubenswrapper[4917]: E0318 07:09:22.323956 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-log" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.323962 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-log" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.324133 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-log" Mar 18 
07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.324150 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" containerName="nova-metadata-metadata" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.325131 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.327915 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.329878 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.330540 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.406828 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hfq\" (UniqueName: \"kubernetes.io/projected/06578cec-8f13-4daf-966e-89f743a134fe-kube-api-access-r2hfq\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.407134 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06578cec-8f13-4daf-966e-89f743a134fe-logs\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.407177 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " 
pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.407202 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-config-data\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.407229 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.508795 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hfq\" (UniqueName: \"kubernetes.io/projected/06578cec-8f13-4daf-966e-89f743a134fe-kube-api-access-r2hfq\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.508882 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06578cec-8f13-4daf-966e-89f743a134fe-logs\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.508967 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.509023 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-config-data\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.509075 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.509504 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06578cec-8f13-4daf-966e-89f743a134fe-logs\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.524435 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.527001 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-config-data\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.535649 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hfq\" (UniqueName: \"kubernetes.io/projected/06578cec-8f13-4daf-966e-89f743a134fe-kube-api-access-r2hfq\") pod \"nova-metadata-0\" (UID: 
\"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.540445 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.652800 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:09:22 crc kubenswrapper[4917]: I0318 07:09:22.986284 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:09:23 crc kubenswrapper[4917]: I0318 07:09:23.228291 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06578cec-8f13-4daf-966e-89f743a134fe","Type":"ContainerStarted","Data":"0c9345749605256155e36788fa5aacbab82152b73c33088f71c542c3f012582d"} Mar 18 07:09:23 crc kubenswrapper[4917]: I0318 07:09:23.228345 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06578cec-8f13-4daf-966e-89f743a134fe","Type":"ContainerStarted","Data":"f1772958fffce7a652deb17c726739295a79c9470b96d3caddcdfc923ec8b81e"} Mar 18 07:09:23 crc kubenswrapper[4917]: I0318 07:09:23.785623 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb" path="/var/lib/kubelet/pods/bf95dcf5-f8ed-4e34-a098-bfd02ce09bdb/volumes" Mar 18 07:09:24 crc kubenswrapper[4917]: I0318 07:09:24.240939 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06578cec-8f13-4daf-966e-89f743a134fe","Type":"ContainerStarted","Data":"a8adb26cc0c60d07e23f6ba71eefaac96183dc1c6cfa0982a9403686a18c8986"} Mar 18 07:09:24 crc kubenswrapper[4917]: I0318 07:09:24.270018 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.269993144 podStartE2EDuration="2.269993144s" podCreationTimestamp="2026-03-18 07:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 07:09:24.265239461 +0000 UTC m=+1349.206394185" watchObservedRunningTime="2026-03-18 07:09:24.269993144 +0000 UTC m=+1349.211147888" Mar 18 07:09:25 crc kubenswrapper[4917]: I0318 07:09:25.888173 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 07:09:29 crc kubenswrapper[4917]: I0318 07:09:29.890161 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 07:09:29 crc kubenswrapper[4917]: I0318 07:09:29.890539 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 07:09:30 crc kubenswrapper[4917]: I0318 07:09:30.888916 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 07:09:30 crc kubenswrapper[4917]: I0318 07:09:30.903780 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 07:09:30 crc kubenswrapper[4917]: I0318 07:09:30.904114 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 07:09:30 crc kubenswrapper[4917]: I0318 07:09:30.943460 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 07:09:31 crc kubenswrapper[4917]: I0318 07:09:31.379532 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 07:09:32 crc kubenswrapper[4917]: I0318 07:09:32.653962 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 07:09:32 crc kubenswrapper[4917]: I0318 07:09:32.654007 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 07:09:32 crc kubenswrapper[4917]: I0318 07:09:32.929230 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:09:32 crc kubenswrapper[4917]: I0318 07:09:32.929306 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:09:32 crc kubenswrapper[4917]: I0318 07:09:32.929365 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:09:32 crc kubenswrapper[4917]: I0318 07:09:32.930414 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed205158cfd88c24a2618c8398681343eec6e1ff531ca763ff821abed75c51f1"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:09:32 crc 
kubenswrapper[4917]: I0318 07:09:32.930530 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://ed205158cfd88c24a2618c8398681343eec6e1ff531ca763ff821abed75c51f1" gracePeriod=600 Mar 18 07:09:33 crc kubenswrapper[4917]: I0318 07:09:33.374604 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="ed205158cfd88c24a2618c8398681343eec6e1ff531ca763ff821abed75c51f1" exitCode=0 Mar 18 07:09:33 crc kubenswrapper[4917]: I0318 07:09:33.374653 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"ed205158cfd88c24a2618c8398681343eec6e1ff531ca763ff821abed75c51f1"} Mar 18 07:09:33 crc kubenswrapper[4917]: I0318 07:09:33.374939 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"dcff2e5f0e4be43f74115b69504e568cfe4f21b6f21979f45d6bebc1faf21caf"} Mar 18 07:09:33 crc kubenswrapper[4917]: I0318 07:09:33.374958 4917 scope.go:117] "RemoveContainer" containerID="d0854ed049fe9b7cfdc3675b0efe75d5c58d0e2de88a3782f41bc4ebb18b9f74" Mar 18 07:09:33 crc kubenswrapper[4917]: I0318 07:09:33.668739 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 07:09:33 crc kubenswrapper[4917]: I0318 07:09:33.668759 4917 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 07:09:34 crc kubenswrapper[4917]: I0318 07:09:34.967514 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5a6619bc-1eae-40c1-93a8-baf26424051b" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.198:3000/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 07:09:37 crc kubenswrapper[4917]: I0318 07:09:37.890440 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 07:09:37 crc kubenswrapper[4917]: I0318 07:09:37.891037 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 07:09:39 crc kubenswrapper[4917]: I0318 07:09:39.903408 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 07:09:39 crc kubenswrapper[4917]: I0318 07:09:39.905355 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 07:09:39 crc kubenswrapper[4917]: I0318 07:09:39.917732 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 07:09:39 crc kubenswrapper[4917]: I0318 07:09:39.918017 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 07:09:40 crc kubenswrapper[4917]: I0318 07:09:40.654440 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 07:09:40 crc kubenswrapper[4917]: I0318 07:09:40.655110 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 07:09:42 crc kubenswrapper[4917]: I0318 07:09:42.661346 4917 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 07:09:42 crc kubenswrapper[4917]: I0318 07:09:42.672789 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 07:09:42 crc kubenswrapper[4917]: I0318 07:09:42.672880 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 07:09:43 crc kubenswrapper[4917]: I0318 07:09:43.535828 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 07:09:45 crc kubenswrapper[4917]: I0318 07:09:45.549957 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.298752 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.299531 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9" containerName="kube-state-metrics" containerID="cri-o://c0ced854d56d471c5e1933ff8991e4dd562acc177e136ffc7eb0c0cb1963647e" gracePeriod=30 Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.621561 4917 generic.go:334] "Generic (PLEG): container finished" podID="d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9" containerID="c0ced854d56d471c5e1933ff8991e4dd562acc177e136ffc7eb0c0cb1963647e" exitCode=2 Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.621762 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9","Type":"ContainerDied","Data":"c0ced854d56d471c5e1933ff8991e4dd562acc177e136ffc7eb0c0cb1963647e"} Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.804939 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.873823 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2tpj\" (UniqueName: \"kubernetes.io/projected/d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9-kube-api-access-k2tpj\") pod \"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9\" (UID: \"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9\") " Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.880807 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9-kube-api-access-k2tpj" (OuterVolumeSpecName: "kube-api-access-k2tpj") pod "d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9" (UID: "d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9"). InnerVolumeSpecName "kube-api-access-k2tpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.961195 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tstgx"] Mar 18 07:09:49 crc kubenswrapper[4917]: E0318 07:09:49.961738 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9" containerName="kube-state-metrics" Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.961758 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9" containerName="kube-state-metrics" Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.962002 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9" containerName="kube-state-metrics" Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.963483 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.972396 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tstgx"] Mar 18 07:09:49 crc kubenswrapper[4917]: I0318 07:09:49.976456 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2tpj\" (UniqueName: \"kubernetes.io/projected/d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9-kube-api-access-k2tpj\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.078230 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5rzb\" (UniqueName: \"kubernetes.io/projected/8e52450f-029a-42eb-aa2d-33f00609d0d5-kube-api-access-z5rzb\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.078432 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-catalog-content\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.078451 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-utilities\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.180957 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-catalog-content\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.181518 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-catalog-content\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.181677 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-utilities\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.181992 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-utilities\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.182069 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5rzb\" (UniqueName: \"kubernetes.io/projected/8e52450f-029a-42eb-aa2d-33f00609d0d5-kube-api-access-z5rzb\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.209806 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5rzb\" (UniqueName: 
\"kubernetes.io/projected/8e52450f-029a-42eb-aa2d-33f00609d0d5-kube-api-access-z5rzb\") pod \"certified-operators-tstgx\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.283334 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.642334 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9","Type":"ContainerDied","Data":"0fd4500833042da0f4dec5b497927133b09d9854806eb8f23781610a96b872d1"} Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.642652 4917 scope.go:117] "RemoveContainer" containerID="c0ced854d56d471c5e1933ff8991e4dd562acc177e136ffc7eb0c0cb1963647e" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.642425 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.711814 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.723987 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.741496 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.743154 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.746370 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.746543 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.763093 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.797167 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v4rg\" (UniqueName: \"kubernetes.io/projected/9f997531-b81c-41be-96aa-5f20fe185369-kube-api-access-8v4rg\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.797240 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.797295 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.797326 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.831405 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tstgx"] Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.898739 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v4rg\" (UniqueName: \"kubernetes.io/projected/9f997531-b81c-41be-96aa-5f20fe185369-kube-api-access-8v4rg\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.898820 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.898872 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.898900 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 
07:09:50.909520 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.918440 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.924007 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:50 crc kubenswrapper[4917]: I0318 07:09:50.931952 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v4rg\" (UniqueName: \"kubernetes.io/projected/9f997531-b81c-41be-96aa-5f20fe185369-kube-api-access-8v4rg\") pod \"kube-state-metrics-0\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " pod="openstack/kube-state-metrics-0" Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.068500 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 07:09:51 crc kubenswrapper[4917]: W0318 07:09:51.582957 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f997531_b81c_41be_96aa_5f20fe185369.slice/crio-17e9d91fb1d676e3afead599c257a6b07d3b1d575de0085c26816acf488d8a55 WatchSource:0}: Error finding container 17e9d91fb1d676e3afead599c257a6b07d3b1d575de0085c26816acf488d8a55: Status 404 returned error can't find the container with id 17e9d91fb1d676e3afead599c257a6b07d3b1d575de0085c26816acf488d8a55 Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.588746 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.651316 4917 generic.go:334] "Generic (PLEG): container finished" podID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerID="3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653" exitCode=0 Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.651390 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tstgx" event={"ID":"8e52450f-029a-42eb-aa2d-33f00609d0d5","Type":"ContainerDied","Data":"3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653"} Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.652538 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tstgx" event={"ID":"8e52450f-029a-42eb-aa2d-33f00609d0d5","Type":"ContainerStarted","Data":"eab889217f8a8622e9761504b1353a6b80c9676213f0383ff76c6be4decdaa69"} Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.658293 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f997531-b81c-41be-96aa-5f20fe185369","Type":"ContainerStarted","Data":"17e9d91fb1d676e3afead599c257a6b07d3b1d575de0085c26816acf488d8a55"} Mar 18 07:09:51 crc 
kubenswrapper[4917]: I0318 07:09:51.737634 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.738047 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="proxy-httpd" containerID="cri-o://f7912fdb78f56776725e59f0bed55a0e943287457315979d9dabb6fe22ef4567" gracePeriod=30 Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.738157 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="ceilometer-notification-agent" containerID="cri-o://6d42e0bf6a4bfb6e17bd2132ad5ec5bf615baf7df9835a838be4f53930f8691c" gracePeriod=30 Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.738209 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="sg-core" containerID="cri-o://5dfa69dc464494691f8d7d8acd362ecd6b1d01ec38eaf690fc56d60692ffe0a2" gracePeriod=30 Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.738040 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="ceilometer-central-agent" containerID="cri-o://83146bf66612f0506b464c43f6ab639d2980a1b82b64ea0903f2ff02f0f017f4" gracePeriod=30 Mar 18 07:09:51 crc kubenswrapper[4917]: I0318 07:09:51.782266 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9" path="/var/lib/kubelet/pods/d9fe9ead-8bad-4e5d-8719-ba6e4a94dae9/volumes" Mar 18 07:09:52 crc kubenswrapper[4917]: I0318 07:09:52.671524 4917 generic.go:334] "Generic (PLEG): container finished" podID="06f1a177-9873-437e-9571-0e4731a15ee5" 
containerID="f7912fdb78f56776725e59f0bed55a0e943287457315979d9dabb6fe22ef4567" exitCode=0 Mar 18 07:09:52 crc kubenswrapper[4917]: I0318 07:09:52.671867 4917 generic.go:334] "Generic (PLEG): container finished" podID="06f1a177-9873-437e-9571-0e4731a15ee5" containerID="5dfa69dc464494691f8d7d8acd362ecd6b1d01ec38eaf690fc56d60692ffe0a2" exitCode=2 Mar 18 07:09:52 crc kubenswrapper[4917]: I0318 07:09:52.671877 4917 generic.go:334] "Generic (PLEG): container finished" podID="06f1a177-9873-437e-9571-0e4731a15ee5" containerID="83146bf66612f0506b464c43f6ab639d2980a1b82b64ea0903f2ff02f0f017f4" exitCode=0 Mar 18 07:09:52 crc kubenswrapper[4917]: I0318 07:09:52.671622 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerDied","Data":"f7912fdb78f56776725e59f0bed55a0e943287457315979d9dabb6fe22ef4567"} Mar 18 07:09:52 crc kubenswrapper[4917]: I0318 07:09:52.671914 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerDied","Data":"5dfa69dc464494691f8d7d8acd362ecd6b1d01ec38eaf690fc56d60692ffe0a2"} Mar 18 07:09:52 crc kubenswrapper[4917]: I0318 07:09:52.671929 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerDied","Data":"83146bf66612f0506b464c43f6ab639d2980a1b82b64ea0903f2ff02f0f017f4"} Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.684682 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f997531-b81c-41be-96aa-5f20fe185369","Type":"ContainerStarted","Data":"2ae6bc03fa06e459c8435a83079a258845af95b3fb669f441d6f1382b2e5ce57"} Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.685089 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 07:09:53 crc 
kubenswrapper[4917]: I0318 07:09:53.689793 4917 generic.go:334] "Generic (PLEG): container finished" podID="06f1a177-9873-437e-9571-0e4731a15ee5" containerID="6d42e0bf6a4bfb6e17bd2132ad5ec5bf615baf7df9835a838be4f53930f8691c" exitCode=0 Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.689856 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerDied","Data":"6d42e0bf6a4bfb6e17bd2132ad5ec5bf615baf7df9835a838be4f53930f8691c"} Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.691675 4917 generic.go:334] "Generic (PLEG): container finished" podID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerID="4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8" exitCode=0 Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.691698 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tstgx" event={"ID":"8e52450f-029a-42eb-aa2d-33f00609d0d5","Type":"ContainerDied","Data":"4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8"} Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.720141 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.9608800889999998 podStartE2EDuration="3.720110274s" podCreationTimestamp="2026-03-18 07:09:50 +0000 UTC" firstStartedPulling="2026-03-18 07:09:51.585786501 +0000 UTC m=+1376.526941215" lastFinishedPulling="2026-03-18 07:09:52.345016686 +0000 UTC m=+1377.286171400" observedRunningTime="2026-03-18 07:09:53.705781774 +0000 UTC m=+1378.646936518" watchObservedRunningTime="2026-03-18 07:09:53.720110274 +0000 UTC m=+1378.661265068" Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.920328 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.958547 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-log-httpd\") pod \"06f1a177-9873-437e-9571-0e4731a15ee5\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.958696 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-sg-core-conf-yaml\") pod \"06f1a177-9873-437e-9571-0e4731a15ee5\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.959047 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-combined-ca-bundle\") pod \"06f1a177-9873-437e-9571-0e4731a15ee5\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.959155 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-run-httpd\") pod \"06f1a177-9873-437e-9571-0e4731a15ee5\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.959197 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-scripts\") pod \"06f1a177-9873-437e-9571-0e4731a15ee5\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.959279 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-config-data\") pod \"06f1a177-9873-437e-9571-0e4731a15ee5\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.959352 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvg2h\" (UniqueName: \"kubernetes.io/projected/06f1a177-9873-437e-9571-0e4731a15ee5-kube-api-access-bvg2h\") pod \"06f1a177-9873-437e-9571-0e4731a15ee5\" (UID: \"06f1a177-9873-437e-9571-0e4731a15ee5\") " Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.960727 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "06f1a177-9873-437e-9571-0e4731a15ee5" (UID: "06f1a177-9873-437e-9571-0e4731a15ee5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.961668 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "06f1a177-9873-437e-9571-0e4731a15ee5" (UID: "06f1a177-9873-437e-9571-0e4731a15ee5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.966244 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-scripts" (OuterVolumeSpecName: "scripts") pod "06f1a177-9873-437e-9571-0e4731a15ee5" (UID: "06f1a177-9873-437e-9571-0e4731a15ee5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.970504 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06f1a177-9873-437e-9571-0e4731a15ee5-kube-api-access-bvg2h" (OuterVolumeSpecName: "kube-api-access-bvg2h") pod "06f1a177-9873-437e-9571-0e4731a15ee5" (UID: "06f1a177-9873-437e-9571-0e4731a15ee5"). InnerVolumeSpecName "kube-api-access-bvg2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:09:53 crc kubenswrapper[4917]: I0318 07:09:53.995930 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "06f1a177-9873-437e-9571-0e4731a15ee5" (UID: "06f1a177-9873-437e-9571-0e4731a15ee5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.061694 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.061725 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.061735 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvg2h\" (UniqueName: \"kubernetes.io/projected/06f1a177-9873-437e-9571-0e4731a15ee5-kube-api-access-bvg2h\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.061748 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06f1a177-9873-437e-9571-0e4731a15ee5-log-httpd\") on node \"crc\" 
DevicePath \"\"" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.061756 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.075437 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06f1a177-9873-437e-9571-0e4731a15ee5" (UID: "06f1a177-9873-437e-9571-0e4731a15ee5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.085825 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-config-data" (OuterVolumeSpecName: "config-data") pod "06f1a177-9873-437e-9571-0e4731a15ee5" (UID: "06f1a177-9873-437e-9571-0e4731a15ee5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.163561 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.163631 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06f1a177-9873-437e-9571-0e4731a15ee5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.704883 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06f1a177-9873-437e-9571-0e4731a15ee5","Type":"ContainerDied","Data":"44ba4613efad0732f7a3530c6ef096602bfaa8a21a10dadd596015375cbbc767"} Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.705172 4917 scope.go:117] "RemoveContainer" containerID="f7912fdb78f56776725e59f0bed55a0e943287457315979d9dabb6fe22ef4567" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.704905 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.726872 4917 scope.go:117] "RemoveContainer" containerID="5dfa69dc464494691f8d7d8acd362ecd6b1d01ec38eaf690fc56d60692ffe0a2" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.755297 4917 scope.go:117] "RemoveContainer" containerID="6d42e0bf6a4bfb6e17bd2132ad5ec5bf615baf7df9835a838be4f53930f8691c" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.756648 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.768494 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.781907 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:54 crc kubenswrapper[4917]: E0318 07:09:54.782310 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="sg-core" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.782326 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="sg-core" Mar 18 07:09:54 crc kubenswrapper[4917]: E0318 07:09:54.782339 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="proxy-httpd" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.782346 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="proxy-httpd" Mar 18 07:09:54 crc kubenswrapper[4917]: E0318 07:09:54.782362 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="ceilometer-central-agent" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.782369 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" 
containerName="ceilometer-central-agent" Mar 18 07:09:54 crc kubenswrapper[4917]: E0318 07:09:54.782389 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="ceilometer-notification-agent" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.782395 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="ceilometer-notification-agent" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.782556 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="ceilometer-central-agent" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.782566 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="sg-core" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.782603 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="proxy-httpd" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.782615 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" containerName="ceilometer-notification-agent" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.785100 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.791085 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.791949 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.792867 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.805632 4917 scope.go:117] "RemoveContainer" containerID="83146bf66612f0506b464c43f6ab639d2980a1b82b64ea0903f2ff02f0f017f4" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.815912 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.875041 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-scripts\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.875086 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcf5m\" (UniqueName: \"kubernetes.io/projected/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-kube-api-access-rcf5m\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.875265 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " 
pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.875391 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.875600 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-run-httpd\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.875623 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.875666 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-config-data\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.875761 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-log-httpd\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.977253 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.977342 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.977423 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-run-httpd\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.977444 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.977477 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-config-data\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.977526 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-log-httpd\") pod \"ceilometer-0\" (UID: 
\"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.977552 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-scripts\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.977574 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcf5m\" (UniqueName: \"kubernetes.io/projected/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-kube-api-access-rcf5m\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.978278 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-log-httpd\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.978423 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-run-httpd\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.982184 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.983144 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.983691 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.991927 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-scripts\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:54 crc kubenswrapper[4917]: I0318 07:09:54.991956 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-config-data\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:55 crc kubenswrapper[4917]: I0318 07:09:55.001126 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcf5m\" (UniqueName: \"kubernetes.io/projected/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-kube-api-access-rcf5m\") pod \"ceilometer-0\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " pod="openstack/ceilometer-0" Mar 18 07:09:55 crc kubenswrapper[4917]: I0318 07:09:55.122615 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:09:55 crc kubenswrapper[4917]: I0318 07:09:55.627531 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:09:55 crc kubenswrapper[4917]: W0318 07:09:55.633983 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a4114f4_5182_4b59_b9be_72a6f4ed11fb.slice/crio-8a099c58c4530a1fbe2c5919bfbe64dfd649755c41781983253ff0ccfab89991 WatchSource:0}: Error finding container 8a099c58c4530a1fbe2c5919bfbe64dfd649755c41781983253ff0ccfab89991: Status 404 returned error can't find the container with id 8a099c58c4530a1fbe2c5919bfbe64dfd649755c41781983253ff0ccfab89991 Mar 18 07:09:55 crc kubenswrapper[4917]: I0318 07:09:55.715844 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tstgx" event={"ID":"8e52450f-029a-42eb-aa2d-33f00609d0d5","Type":"ContainerStarted","Data":"ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6"} Mar 18 07:09:55 crc kubenswrapper[4917]: I0318 07:09:55.717197 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerStarted","Data":"8a099c58c4530a1fbe2c5919bfbe64dfd649755c41781983253ff0ccfab89991"} Mar 18 07:09:55 crc kubenswrapper[4917]: I0318 07:09:55.743906 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tstgx" podStartSLOduration=3.9363123780000002 podStartE2EDuration="6.743886126s" podCreationTimestamp="2026-03-18 07:09:49 +0000 UTC" firstStartedPulling="2026-03-18 07:09:51.653683719 +0000 UTC m=+1376.594838433" lastFinishedPulling="2026-03-18 07:09:54.461257457 +0000 UTC m=+1379.402412181" observedRunningTime="2026-03-18 07:09:55.735218944 +0000 UTC m=+1380.676373678" watchObservedRunningTime="2026-03-18 07:09:55.743886126 +0000 UTC 
m=+1380.685040840" Mar 18 07:09:55 crc kubenswrapper[4917]: I0318 07:09:55.785238 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06f1a177-9873-437e-9571-0e4731a15ee5" path="/var/lib/kubelet/pods/06f1a177-9873-437e-9571-0e4731a15ee5/volumes" Mar 18 07:09:56 crc kubenswrapper[4917]: I0318 07:09:56.731964 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerStarted","Data":"4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c"} Mar 18 07:09:57 crc kubenswrapper[4917]: I0318 07:09:57.752393 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerStarted","Data":"7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc"} Mar 18 07:09:57 crc kubenswrapper[4917]: I0318 07:09:57.752765 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerStarted","Data":"eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45"} Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.146481 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563630-lm6sk"] Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.148747 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.150284 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.151713 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.155462 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563630-lm6sk"] Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.156190 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.284129 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.284397 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.313878 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-274q4\" (UniqueName: \"kubernetes.io/projected/50ce1300-1bea-4223-84b2-e2ca681962a6-kube-api-access-274q4\") pod \"auto-csr-approver-29563630-lm6sk\" (UID: \"50ce1300-1bea-4223-84b2-e2ca681962a6\") " pod="openshift-infra/auto-csr-approver-29563630-lm6sk" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.329829 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.415596 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-274q4\" (UniqueName: 
\"kubernetes.io/projected/50ce1300-1bea-4223-84b2-e2ca681962a6-kube-api-access-274q4\") pod \"auto-csr-approver-29563630-lm6sk\" (UID: \"50ce1300-1bea-4223-84b2-e2ca681962a6\") " pod="openshift-infra/auto-csr-approver-29563630-lm6sk" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.433671 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-274q4\" (UniqueName: \"kubernetes.io/projected/50ce1300-1bea-4223-84b2-e2ca681962a6-kube-api-access-274q4\") pod \"auto-csr-approver-29563630-lm6sk\" (UID: \"50ce1300-1bea-4223-84b2-e2ca681962a6\") " pod="openshift-infra/auto-csr-approver-29563630-lm6sk" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.476483 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.849287 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.893964 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tstgx"] Mar 18 07:10:00 crc kubenswrapper[4917]: I0318 07:10:00.988431 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563630-lm6sk"] Mar 18 07:10:01 crc kubenswrapper[4917]: I0318 07:10:01.000281 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:10:01 crc kubenswrapper[4917]: I0318 07:10:01.095832 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 07:10:01 crc kubenswrapper[4917]: I0318 07:10:01.787867 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerStarted","Data":"4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10"} Mar 18 07:10:01 crc kubenswrapper[4917]: I0318 07:10:01.788360 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 07:10:01 crc kubenswrapper[4917]: I0318 07:10:01.792897 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" event={"ID":"50ce1300-1bea-4223-84b2-e2ca681962a6","Type":"ContainerStarted","Data":"b5f28e42f903edd593469c0773997bcf992d19c863f00003dd43fbb80ba4cdc8"} Mar 18 07:10:01 crc kubenswrapper[4917]: I0318 07:10:01.808344 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.68431386 podStartE2EDuration="7.808322383s" podCreationTimestamp="2026-03-18 07:09:54 +0000 UTC" firstStartedPulling="2026-03-18 07:09:55.637996218 +0000 UTC m=+1380.579150932" lastFinishedPulling="2026-03-18 07:10:00.762004741 +0000 UTC m=+1385.703159455" observedRunningTime="2026-03-18 07:10:01.806132449 +0000 UTC m=+1386.747287193" watchObservedRunningTime="2026-03-18 07:10:01.808322383 +0000 UTC m=+1386.749477107" Mar 18 07:10:02 crc kubenswrapper[4917]: I0318 07:10:02.831655 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" event={"ID":"50ce1300-1bea-4223-84b2-e2ca681962a6","Type":"ContainerStarted","Data":"c962aa87c069585684597428a945085dba9c26dd06ff54afb7f098c0ddd1c7b9"} Mar 18 07:10:02 crc kubenswrapper[4917]: I0318 07:10:02.832636 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tstgx" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerName="registry-server" containerID="cri-o://ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6" gracePeriod=2 Mar 18 07:10:02 crc kubenswrapper[4917]: I0318 
07:10:02.864133 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" podStartSLOduration=1.441229882 podStartE2EDuration="2.864113987s" podCreationTimestamp="2026-03-18 07:10:00 +0000 UTC" firstStartedPulling="2026-03-18 07:10:01.00008137 +0000 UTC m=+1385.941236094" lastFinishedPulling="2026-03-18 07:10:02.422965445 +0000 UTC m=+1387.364120199" observedRunningTime="2026-03-18 07:10:02.85973223 +0000 UTC m=+1387.800886944" watchObservedRunningTime="2026-03-18 07:10:02.864113987 +0000 UTC m=+1387.805268701" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.311095 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.476565 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-catalog-content\") pod \"8e52450f-029a-42eb-aa2d-33f00609d0d5\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.476728 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-utilities\") pod \"8e52450f-029a-42eb-aa2d-33f00609d0d5\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.476813 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5rzb\" (UniqueName: \"kubernetes.io/projected/8e52450f-029a-42eb-aa2d-33f00609d0d5-kube-api-access-z5rzb\") pod \"8e52450f-029a-42eb-aa2d-33f00609d0d5\" (UID: \"8e52450f-029a-42eb-aa2d-33f00609d0d5\") " Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.478169 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-utilities" (OuterVolumeSpecName: "utilities") pod "8e52450f-029a-42eb-aa2d-33f00609d0d5" (UID: "8e52450f-029a-42eb-aa2d-33f00609d0d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.496883 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e52450f-029a-42eb-aa2d-33f00609d0d5-kube-api-access-z5rzb" (OuterVolumeSpecName: "kube-api-access-z5rzb") pod "8e52450f-029a-42eb-aa2d-33f00609d0d5" (UID: "8e52450f-029a-42eb-aa2d-33f00609d0d5"). InnerVolumeSpecName "kube-api-access-z5rzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.520876 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e52450f-029a-42eb-aa2d-33f00609d0d5" (UID: "8e52450f-029a-42eb-aa2d-33f00609d0d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.578852 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.578895 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5rzb\" (UniqueName: \"kubernetes.io/projected/8e52450f-029a-42eb-aa2d-33f00609d0d5-kube-api-access-z5rzb\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.578911 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e52450f-029a-42eb-aa2d-33f00609d0d5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.840977 4917 generic.go:334] "Generic (PLEG): container finished" podID="50ce1300-1bea-4223-84b2-e2ca681962a6" containerID="c962aa87c069585684597428a945085dba9c26dd06ff54afb7f098c0ddd1c7b9" exitCode=0 Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.841032 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" event={"ID":"50ce1300-1bea-4223-84b2-e2ca681962a6","Type":"ContainerDied","Data":"c962aa87c069585684597428a945085dba9c26dd06ff54afb7f098c0ddd1c7b9"} Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.843207 4917 generic.go:334] "Generic (PLEG): container finished" podID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerID="ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6" exitCode=0 Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.843286 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tstgx" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.843273 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tstgx" event={"ID":"8e52450f-029a-42eb-aa2d-33f00609d0d5","Type":"ContainerDied","Data":"ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6"} Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.843447 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tstgx" event={"ID":"8e52450f-029a-42eb-aa2d-33f00609d0d5","Type":"ContainerDied","Data":"eab889217f8a8622e9761504b1353a6b80c9676213f0383ff76c6be4decdaa69"} Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.843475 4917 scope.go:117] "RemoveContainer" containerID="ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.875516 4917 scope.go:117] "RemoveContainer" containerID="4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.895598 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tstgx"] Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.901394 4917 scope.go:117] "RemoveContainer" containerID="3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.904760 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tstgx"] Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.953819 4917 scope.go:117] "RemoveContainer" containerID="ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6" Mar 18 07:10:03 crc kubenswrapper[4917]: E0318 07:10:03.954238 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6\": container with ID starting with ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6 not found: ID does not exist" containerID="ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.954273 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6"} err="failed to get container status \"ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6\": rpc error: code = NotFound desc = could not find container \"ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6\": container with ID starting with ce808948504eb320413c0cedcd6a058e5b4d08240f379243f9b56f7424564af6 not found: ID does not exist" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.954302 4917 scope.go:117] "RemoveContainer" containerID="4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8" Mar 18 07:10:03 crc kubenswrapper[4917]: E0318 07:10:03.954598 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8\": container with ID starting with 4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8 not found: ID does not exist" containerID="4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.954626 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8"} err="failed to get container status \"4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8\": rpc error: code = NotFound desc = could not find container \"4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8\": container with ID 
starting with 4e2821fa6973b38abce59021feba1adf0c3aa29aaf91dcabf0c320bd65c55eb8 not found: ID does not exist" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.954642 4917 scope.go:117] "RemoveContainer" containerID="3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653" Mar 18 07:10:03 crc kubenswrapper[4917]: E0318 07:10:03.954937 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653\": container with ID starting with 3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653 not found: ID does not exist" containerID="3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653" Mar 18 07:10:03 crc kubenswrapper[4917]: I0318 07:10:03.954963 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653"} err="failed to get container status \"3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653\": rpc error: code = NotFound desc = could not find container \"3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653\": container with ID starting with 3a7b438059888da694415bd1058260294416c9a77b804d368cb9a35766501653 not found: ID does not exist" Mar 18 07:10:05 crc kubenswrapper[4917]: I0318 07:10:05.232607 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" Mar 18 07:10:05 crc kubenswrapper[4917]: I0318 07:10:05.315496 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-274q4\" (UniqueName: \"kubernetes.io/projected/50ce1300-1bea-4223-84b2-e2ca681962a6-kube-api-access-274q4\") pod \"50ce1300-1bea-4223-84b2-e2ca681962a6\" (UID: \"50ce1300-1bea-4223-84b2-e2ca681962a6\") " Mar 18 07:10:05 crc kubenswrapper[4917]: I0318 07:10:05.320682 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50ce1300-1bea-4223-84b2-e2ca681962a6-kube-api-access-274q4" (OuterVolumeSpecName: "kube-api-access-274q4") pod "50ce1300-1bea-4223-84b2-e2ca681962a6" (UID: "50ce1300-1bea-4223-84b2-e2ca681962a6"). InnerVolumeSpecName "kube-api-access-274q4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:05 crc kubenswrapper[4917]: I0318 07:10:05.417567 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-274q4\" (UniqueName: \"kubernetes.io/projected/50ce1300-1bea-4223-84b2-e2ca681962a6-kube-api-access-274q4\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:05 crc kubenswrapper[4917]: I0318 07:10:05.782756 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" path="/var/lib/kubelet/pods/8e52450f-029a-42eb-aa2d-33f00609d0d5/volumes" Mar 18 07:10:05 crc kubenswrapper[4917]: I0318 07:10:05.863245 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" event={"ID":"50ce1300-1bea-4223-84b2-e2ca681962a6","Type":"ContainerDied","Data":"b5f28e42f903edd593469c0773997bcf992d19c863f00003dd43fbb80ba4cdc8"} Mar 18 07:10:05 crc kubenswrapper[4917]: I0318 07:10:05.863285 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5f28e42f903edd593469c0773997bcf992d19c863f00003dd43fbb80ba4cdc8" Mar 18 07:10:05 
crc kubenswrapper[4917]: I0318 07:10:05.863297 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563630-lm6sk" Mar 18 07:10:06 crc kubenswrapper[4917]: I0318 07:10:06.305269 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563624-ws22n"] Mar 18 07:10:06 crc kubenswrapper[4917]: I0318 07:10:06.324771 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563624-ws22n"] Mar 18 07:10:07 crc kubenswrapper[4917]: I0318 07:10:07.796783 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1cc3bc-d171-4a57-9eba-cfc6da82fe52" path="/var/lib/kubelet/pods/ff1cc3bc-d171-4a57-9eba-cfc6da82fe52/volumes" Mar 18 07:10:08 crc kubenswrapper[4917]: I0318 07:10:08.555184 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 07:10:08 crc kubenswrapper[4917]: I0318 07:10:08.555377 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="fd47f339-3398-4565-82f8-4b715e0b19a9" containerName="openstackclient" containerID="cri-o://c4b6c0f7460d0188f56f5fc10d1b3881ea6f8779c842ed7425cfa88b2e731c1c" gracePeriod=2 Mar 18 07:10:08 crc kubenswrapper[4917]: I0318 07:10:08.600086 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.161138 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2qt5f"] Mar 18 07:10:09 crc kubenswrapper[4917]: E0318 07:10:09.161803 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerName="registry-server" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.161817 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerName="registry-server" Mar 
18 07:10:09 crc kubenswrapper[4917]: E0318 07:10:09.161838 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd47f339-3398-4565-82f8-4b715e0b19a9" containerName="openstackclient" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.161844 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd47f339-3398-4565-82f8-4b715e0b19a9" containerName="openstackclient" Mar 18 07:10:09 crc kubenswrapper[4917]: E0318 07:10:09.161860 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerName="extract-content" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.161865 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerName="extract-content" Mar 18 07:10:09 crc kubenswrapper[4917]: E0318 07:10:09.161876 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerName="extract-utilities" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.161882 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerName="extract-utilities" Mar 18 07:10:09 crc kubenswrapper[4917]: E0318 07:10:09.161896 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50ce1300-1bea-4223-84b2-e2ca681962a6" containerName="oc" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.161909 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="50ce1300-1bea-4223-84b2-e2ca681962a6" containerName="oc" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.167752 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd47f339-3398-4565-82f8-4b715e0b19a9" containerName="openstackclient" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.167782 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="50ce1300-1bea-4223-84b2-e2ca681962a6" containerName="oc" Mar 18 07:10:09 crc kubenswrapper[4917]: 
I0318 07:10:09.167804 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52450f-029a-42eb-aa2d-33f00609d0d5" containerName="registry-server" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.168459 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.173089 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.184336 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.211985 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2qt5f"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.244019 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8033-account-create-update-6c2bk"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.247200 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.261734 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.294625 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56cwz\" (UniqueName: \"kubernetes.io/projected/7f9f629f-c02c-4416-ae21-2b49cea903b5-kube-api-access-56cwz\") pod \"root-account-create-update-2qt5f\" (UID: \"7f9f629f-c02c-4416-ae21-2b49cea903b5\") " pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.294867 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts\") pod \"root-account-create-update-2qt5f\" (UID: \"7f9f629f-c02c-4416-ae21-2b49cea903b5\") " pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.367634 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-6c2bk"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.396296 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzgmn\" (UniqueName: \"kubernetes.io/projected/3727d24d-eb93-467b-af3f-66090ad92329-kube-api-access-tzgmn\") pod \"nova-cell1-8033-account-create-update-6c2bk\" (UID: \"3727d24d-eb93-467b-af3f-66090ad92329\") " pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.396374 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts\") pod 
\"root-account-create-update-2qt5f\" (UID: \"7f9f629f-c02c-4416-ae21-2b49cea903b5\") " pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.396433 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56cwz\" (UniqueName: \"kubernetes.io/projected/7f9f629f-c02c-4416-ae21-2b49cea903b5-kube-api-access-56cwz\") pod \"root-account-create-update-2qt5f\" (UID: \"7f9f629f-c02c-4416-ae21-2b49cea903b5\") " pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.396462 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3727d24d-eb93-467b-af3f-66090ad92329-operator-scripts\") pod \"nova-cell1-8033-account-create-update-6c2bk\" (UID: \"3727d24d-eb93-467b-af3f-66090ad92329\") " pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.397185 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts\") pod \"root-account-create-update-2qt5f\" (UID: \"7f9f629f-c02c-4416-ae21-2b49cea903b5\") " pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.414638 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-dxbxz"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.440654 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56cwz\" (UniqueName: \"kubernetes.io/projected/7f9f629f-c02c-4416-ae21-2b49cea903b5-kube-api-access-56cwz\") pod \"root-account-create-update-2qt5f\" (UID: \"7f9f629f-c02c-4416-ae21-2b49cea903b5\") " pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:09 crc 
kubenswrapper[4917]: I0318 07:10:09.443615 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.444258 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerName="openstack-network-exporter" containerID="cri-o://5c74ef922cf294c1354aeab44d02d26e32c40d47c6adfa40c3f157efa6666fe6" gracePeriod=300 Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.463989 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kqhm8"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.498993 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-dxbxz"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.500026 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3727d24d-eb93-467b-af3f-66090ad92329-operator-scripts\") pod \"nova-cell1-8033-account-create-update-6c2bk\" (UID: \"3727d24d-eb93-467b-af3f-66090ad92329\") " pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.500136 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzgmn\" (UniqueName: \"kubernetes.io/projected/3727d24d-eb93-467b-af3f-66090ad92329-kube-api-access-tzgmn\") pod \"nova-cell1-8033-account-create-update-6c2bk\" (UID: \"3727d24d-eb93-467b-af3f-66090ad92329\") " pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.501016 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.501126 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3727d24d-eb93-467b-af3f-66090ad92329-operator-scripts\") pod \"nova-cell1-8033-account-create-update-6c2bk\" (UID: \"3727d24d-eb93-467b-af3f-66090ad92329\") " pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.511964 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kqhm8"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.573771 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-801b-account-create-update-kx4dz"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.584384 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzgmn\" (UniqueName: \"kubernetes.io/projected/3727d24d-eb93-467b-af3f-66090ad92329-kube-api-access-tzgmn\") pod \"nova-cell1-8033-account-create-update-6c2bk\" (UID: \"3727d24d-eb93-467b-af3f-66090ad92329\") " pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.602925 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerName="ovsdbserver-nb" containerID="cri-o://b1a2420de022669dd8a8a7601588a4eb6184dda29d746ad16041b6928f5afdfb" gracePeriod=300 Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.605199 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.611900 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-801b-account-create-update-kx4dz"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.617876 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.618557 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="openstack-network-exporter" containerID="cri-o://b15b4f0c051b063f63312e19c1b7a905ea40662728a9caa08205f2747d82f183" gracePeriod=300 Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.643935 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db3f-account-create-update-9dqtt"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.673240 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db3f-account-create-update-9dqtt"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.695036 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rw57h"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.722086 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rw57h"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.760413 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 07:10:09 crc kubenswrapper[4917]: E0318 07:10:09.815651 4917 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:09 crc kubenswrapper[4917]: E0318 07:10:09.815697 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data 
podName:4a2ed8f1-269d-45fb-a766-46c867bd0a91 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:10.315682056 +0000 UTC m=+1395.256836770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data") pod "rabbitmq-cell1-server-0" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91") : configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.878067 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="ovsdbserver-sb" containerID="cri-o://a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697" gracePeriod=300 Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.905900 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1534f9-3185-43be-afb2-e37c921ff0e1" path="/var/lib/kubelet/pods/0b1534f9-3185-43be-afb2-e37c921ff0e1/volumes" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.906499 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21785c0a-645b-4a17-bfdf-cbb1167f5361" path="/var/lib/kubelet/pods/21785c0a-645b-4a17-bfdf-cbb1167f5361/volumes" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.915992 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d4f4e3-0908-410e-a4ec-e40c5550d370" path="/var/lib/kubelet/pods/55d4f4e3-0908-410e-a4ec-e40c5550d370/volumes" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.919535 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9abcd564-199b-4481-bb54-c1904431bce2" path="/var/lib/kubelet/pods/9abcd564-199b-4481-bb54-c1904431bce2/volumes" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920310 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f48d7dad-f180-4cbd-bd77-906daa3558ed" 
path="/var/lib/kubelet/pods/f48d7dad-f180-4cbd-bd77-906daa3558ed/volumes" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920818 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-pd424"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920841 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-9ntb4"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920853 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-9ntb4"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920867 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-pd424"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920876 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-82pcg"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920893 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-82pcg"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920902 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-jfsv2"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920912 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-jfsv2"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.920921 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b7da-account-create-update-v97z6"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.958455 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b7da-account-create-update-v97z6"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.973986 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.974254 4917 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ovn-northd-0" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="ovn-northd" containerID="cri-o://ef4b723c7e5e12825354609c888ba3e920eaa69afb8a6ad4df7b13d783acb2c3" gracePeriod=30 Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.974430 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="openstack-network-exporter" containerID="cri-o://8b2c13a58a1120455d3b9c627b8cf69e354661c43f72f0bcc57f23e360c0f81f" gracePeriod=30 Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.986276 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bf779ffa-140b-4c10-b42e-9c7568cebb01/ovsdbserver-nb/0.log" Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.986321 4917 generic.go:334] "Generic (PLEG): container finished" podID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerID="5c74ef922cf294c1354aeab44d02d26e32c40d47c6adfa40c3f157efa6666fe6" exitCode=2 Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.986369 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bf779ffa-140b-4c10-b42e-9c7568cebb01","Type":"ContainerDied","Data":"5c74ef922cf294c1354aeab44d02d26e32c40d47c6adfa40c3f157efa6666fe6"} Mar 18 07:10:09 crc kubenswrapper[4917]: I0318 07:10:09.989948 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-7vl5b"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.011000 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-7vl5b"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.036537 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9gbdd"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.050123 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-sb-0_9f5fe129-b4bd-40ac-a3f2-6eec0469308b/ovsdbserver-sb/0.log" Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.050167 4917 generic.go:334] "Generic (PLEG): container finished" podID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerID="b15b4f0c051b063f63312e19c1b7a905ea40662728a9caa08205f2747d82f183" exitCode=2 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.050197 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9f5fe129-b4bd-40ac-a3f2-6eec0469308b","Type":"ContainerDied","Data":"b15b4f0c051b063f63312e19c1b7a905ea40662728a9caa08205f2747d82f183"} Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.056725 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-jwxq2"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.100041 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-rlzz7"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.100383 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-rlzz7" podUID="7a0e2915-1512-4288-8c43-7774fb90542f" containerName="openstack-network-exporter" containerID="cri-o://33af6cfc095e81af7f21997b2ef1a0bd3c1b9021392541c1e2f4d234da292135" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.139314 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-df967876c-494l9"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.139891 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-df967876c-494l9" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-api" containerID="cri-o://65ec14bc4fbd1d630fe7a906e20bb195b244f65201d429c85361678850908b23" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.140959 4917 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/neutron-df967876c-494l9" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-httpd" containerID="cri-o://73299c429264676075eece28a8b7265bc2bcf9550d93b8dd8a305ef5c9be5061" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.236633 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-xqwbz"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.273324 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-xqwbz"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.330664 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f74bf5646-8prg6"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.330927 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6f74bf5646-8prg6" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerName="placement-log" containerID="cri-o://c1a1e3e9c9d8361d0c4155702ce697636d2fe57a0d4920a2496cd0fbb0e9a366" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.331329 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6f74bf5646-8prg6" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerName="placement-api" containerID="cri-o://1592e72e88530b3687f95124325d50bfc66a649981b642fed4a4d25898d5c3d7" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.364857 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-wr662"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.375526 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-wr662"] Mar 18 07:10:10 crc kubenswrapper[4917]: E0318 07:10:10.376915 4917 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:10 crc kubenswrapper[4917]: E0318 
07:10:10.376976 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data podName:4a2ed8f1-269d-45fb-a766-46c867bd0a91 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:11.376961434 +0000 UTC m=+1396.318116148 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data") pod "rabbitmq-cell1-server-0" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91") : configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:10 crc kubenswrapper[4917]: E0318 07:10:10.380165 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697 is running failed: container process not found" containerID="a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.391417 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-hkm5m"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.391636 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" podUID="fae3a765-1a72-4207-8066-d3f8926e8641" containerName="dnsmasq-dns" containerID="cri-o://dff717b59d7480b68ce1be64dd75f3257e481d3d0ab97f70359760ba880d621c" gracePeriod=10 Mar 18 07:10:10 crc kubenswrapper[4917]: E0318 07:10:10.392876 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697 is running failed: container process not found" containerID="a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697" 
cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 07:10:10 crc kubenswrapper[4917]: E0318 07:10:10.395753 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697 is running failed: container process not found" containerID="a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 18 07:10:10 crc kubenswrapper[4917]: E0318 07:10:10.395812 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="ovsdbserver-sb" Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.401566 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.401791 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerName="cinder-scheduler" containerID="cri-o://53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.402123 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerName="probe" containerID="cri-o://3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434144 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434562 4917 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-server" containerID="cri-o://d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434687 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-updater" containerID="cri-o://12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434676 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-server" containerID="cri-o://5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434757 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-auditor" containerID="cri-o://8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434783 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-replicator" containerID="cri-o://33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434809 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-server" 
containerID="cri-o://ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434834 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-reaper" containerID="cri-o://1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434860 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-auditor" containerID="cri-o://fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434887 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-replicator" containerID="cri-o://473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.434989 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="swift-recon-cron" containerID="cri-o://44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.435016 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-updater" containerID="cri-o://8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.435055 4917 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="rsync" containerID="cri-o://480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.435082 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-expirer" containerID="cri-o://07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.435123 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-replicator" containerID="cri-o://42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.435153 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-auditor" containerID="cri-o://002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.509354 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.509683 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api-log" containerID="cri-o://33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.510067 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api" 
containerID="cri-o://dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.521992 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.618217 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.618477 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-log" containerID="cri-o://34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.618890 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-api" containerID="cri-o://528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.670023 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-0f82-account-create-update-95z7q"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.715130 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-0f82-account-create-update-95z7q"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.813478 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.814070 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-log" containerID="cri-o://0c9345749605256155e36788fa5aacbab82152b73c33088f71c542c3f012582d" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 
07:10:10.815315 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-metadata" containerID="cri-o://a8adb26cc0c60d07e23f6ba71eefaac96183dc1c6cfa0982a9403686a18c8986" gracePeriod=30 Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.856475 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-nb2kk"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.878524 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-nb2kk"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.908111 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-wj558"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.925146 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.949718 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-wj558"] Mar 18 07:10:10 crc kubenswrapper[4917]: I0318 07:10:10.987259 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2qt5f"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.058732 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.061678 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-log" containerID="cri-o://8eda75a9630a00d3bc84d4255c1197ed99b2146ee96f3b0c2b6c80a91b3dc188" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.062085 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-httpd" containerID="cri-o://ea57b6c51e2a6d56fc3ad907733130bd48fe5600650c29357de522f2e527c1cf" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.106815 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-frlcc"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.121866 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="11fb09df-78b6-44c6-a78f-2b720a98cfad" containerName="rabbitmq" containerID="cri-o://256f8ce82bcbadac34767fc05b95a7249fd4500c71a2a020dd30ba54395951e5" gracePeriod=604800 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.122009 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-frlcc"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.132676 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-jkt4f"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.153369 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-jkt4f"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.179535 4917 generic.go:334] "Generic (PLEG): container finished" podID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerID="34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904" exitCode=143 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.192639 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52bd1e19-7e65-4ea9-94bd-ad7edee4be11","Type":"ContainerDied","Data":"34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.192725 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.193004 4917 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/glance-default-internal-api-0" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-log" containerID="cri-o://a57908c5b95dcdf307be2044978ccc9678a0b22f5e60956f8fbd25abaea6d4fe" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.193527 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-httpd" containerID="cri-o://5cf727c6f8e456b706e1dae3227882a5fd58a35e0cebf945f9304e0df9d9673a" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.210859 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4ff4-account-create-update-htzjz"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.218000 4917 generic.go:334] "Generic (PLEG): container finished" podID="06578cec-8f13-4daf-966e-89f743a134fe" containerID="0c9345749605256155e36788fa5aacbab82152b73c33088f71c542c3f012582d" exitCode=143 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.218269 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06578cec-8f13-4daf-966e-89f743a134fe","Type":"ContainerDied","Data":"0c9345749605256155e36788fa5aacbab82152b73c33088f71c542c3f012582d"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.231312 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4ff4-account-create-update-htzjz"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.235868 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7f954fcdf9-tt6r4"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.236086 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerName="proxy-httpd" 
containerID="cri-o://34cb7d14ec735ff0e328ee9c042c8e5f4b080009d5c97efbb47af3b0fca71c21" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.236478 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerName="proxy-server" containerID="cri-o://b7dd7e7dcd0ead924d5151b493b692b8cda6c133769ce8b19017ed20a1493c30" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.243756 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-de6b-account-create-update-j2zpb"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.258788 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-sr7h8"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.259874 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-de6b-account-create-update-j2zpb"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.260230 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9f5fe129-b4bd-40ac-a3f2-6eec0469308b/ovsdbserver-sb/0.log" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.260351 4917 generic.go:334] "Generic (PLEG): container finished" podID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerID="a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697" exitCode=143 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.260489 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9f5fe129-b4bd-40ac-a3f2-6eec0469308b","Type":"ContainerDied","Data":"a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.262232 4917 generic.go:334] "Generic (PLEG): container finished" podID="54ac6ac1-72cb-4383-8206-92169da43249" containerID="33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744" 
exitCode=143 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.262282 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54ac6ac1-72cb-4383-8206-92169da43249","Type":"ContainerDied","Data":"33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.270738 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4hnm7"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.280917 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-6c2bk"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.281126 4917 generic.go:334] "Generic (PLEG): container finished" podID="fd47f339-3398-4565-82f8-4b715e0b19a9" containerID="c4b6c0f7460d0188f56f5fc10d1b3881ea6f8779c842ed7425cfa88b2e731c1c" exitCode=137 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.286597 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-sr7h8"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.308408 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4hnm7"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.331368 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bf779ffa-140b-4c10-b42e-9c7568cebb01/ovsdbserver-nb/0.log" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.331480 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.351993 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.352286 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="e4affeee-7968-4e70-b6dd-d8d0f17cfa92" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.381801 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-5kn5p"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.405302 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-5kn5p"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.408578 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rlzz7_7a0e2915-1512-4288-8c43-7774fb90542f/openstack-network-exporter/0.log" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.408716 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:10:11 crc kubenswrapper[4917]: E0318 07:10:11.417822 4917 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:11 crc kubenswrapper[4917]: E0318 07:10:11.417909 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data podName:4a2ed8f1-269d-45fb-a766-46c867bd0a91 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:13.417885904 +0000 UTC m=+1398.359040638 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data") pod "rabbitmq-cell1-server-0" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91") : configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.424043 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9f5fe129-b4bd-40ac-a3f2-6eec0469308b/ovsdbserver-sb/0.log" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.424130 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.429568 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e169-account-create-update-gcjpw"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437259 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437286 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437295 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437303 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437310 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437318 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437325 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437333 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437340 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437349 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437357 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437423 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7"} Mar 18 07:10:11 crc 
kubenswrapper[4917]: I0318 07:10:11.437452 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437464 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437476 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437485 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437495 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437507 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437518 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437528 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437539 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.437549 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.456006 4917 generic.go:334] "Generic (PLEG): container finished" podID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerID="73299c429264676075eece28a8b7265bc2bcf9550d93b8dd8a305ef5c9be5061" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.456085 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df967876c-494l9" event={"ID":"4be30c2f-97f1-4477-8322-8ed29dbd3c60","Type":"ContainerDied","Data":"73299c429264676075eece28a8b7265bc2bcf9550d93b8dd8a305ef5c9be5061"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.470922 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e169-account-create-update-gcjpw"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.510275 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_bf779ffa-140b-4c10-b42e-9c7568cebb01/ovsdbserver-nb/0.log" 
Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.510319 4917 generic.go:334] "Generic (PLEG): container finished" podID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerID="b1a2420de022669dd8a8a7601588a4eb6184dda29d746ad16041b6928f5afdfb" exitCode=143 Mar 18 07:10:11 crc kubenswrapper[4917]: E0318 07:10:11.510376 4917 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 18 07:10:11 crc kubenswrapper[4917]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:4caef2b55e01b9a7ee88a22bc69db1893521a91d95c7ad4c8e593f14f17a5f95,Command:[/bin/sh -c #!/bin/bash Mar 18 07:10:11 crc kubenswrapper[4917]: Mar 18 07:10:11 crc kubenswrapper[4917]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 18 07:10:11 crc kubenswrapper[4917]: Mar 18 07:10:11 crc kubenswrapper[4917]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 18 07:10:11 crc kubenswrapper[4917]: Mar 18 07:10:11 crc kubenswrapper[4917]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 18 07:10:11 crc kubenswrapper[4917]: Mar 18 07:10:11 crc kubenswrapper[4917]: if [ -n "nova_cell1" ]; then Mar 18 07:10:11 crc kubenswrapper[4917]: GRANT_DATABASE="nova_cell1" Mar 18 07:10:11 crc kubenswrapper[4917]: else Mar 18 07:10:11 crc kubenswrapper[4917]: GRANT_DATABASE="*" Mar 18 07:10:11 crc kubenswrapper[4917]: fi Mar 18 07:10:11 crc kubenswrapper[4917]: Mar 18 07:10:11 crc kubenswrapper[4917]: # going for maximum compatibility here: Mar 18 07:10:11 crc kubenswrapper[4917]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 18 07:10:11 crc kubenswrapper[4917]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 18 07:10:11 crc kubenswrapper[4917]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 18 07:10:11 crc kubenswrapper[4917]: # support updates Mar 18 07:10:11 crc kubenswrapper[4917]: Mar 18 07:10:11 crc kubenswrapper[4917]: $MYSQL_CMD < logger="UnhandledError" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.510383 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"bf779ffa-140b-4c10-b42e-9c7568cebb01","Type":"ContainerDied","Data":"b1a2420de022669dd8a8a7601588a4eb6184dda29d746ad16041b6928f5afdfb"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.510423 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.510425 4917 scope.go:117] "RemoveContainer" containerID="5c74ef922cf294c1354aeab44d02d26e32c40d47c6adfa40c3f157efa6666fe6" Mar 18 07:10:11 crc kubenswrapper[4917]: E0318 07:10:11.511732 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-8033-account-create-update-6c2bk" podUID="3727d24d-eb93-467b-af3f-66090ad92329" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.513550 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9pm58"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520340 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovn-rundir\") pod \"7a0e2915-1512-4288-8c43-7774fb90542f\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520385 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7a0e2915-1512-4288-8c43-7774fb90542f-config\") pod \"7a0e2915-1512-4288-8c43-7774fb90542f\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520427 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovs-rundir\") pod \"7a0e2915-1512-4288-8c43-7774fb90542f\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520456 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdbserver-nb-tls-certs\") pod \"bf779ffa-140b-4c10-b42e-9c7568cebb01\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520480 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-combined-ca-bundle\") pod \"7a0e2915-1512-4288-8c43-7774fb90542f\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520487 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "7a0e2915-1512-4288-8c43-7774fb90542f" (UID: "7a0e2915-1512-4288-8c43-7774fb90542f"). InnerVolumeSpecName "ovn-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520500 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-combined-ca-bundle\") pod \"bf779ffa-140b-4c10-b42e-9c7568cebb01\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520664 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-metrics-certs-tls-certs\") pod \"7a0e2915-1512-4288-8c43-7774fb90542f\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520702 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-scripts\") pod \"bf779ffa-140b-4c10-b42e-9c7568cebb01\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520725 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-config\") pod \"bf779ffa-140b-4c10-b42e-9c7568cebb01\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520749 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdb-rundir\") pod \"bf779ffa-140b-4c10-b42e-9c7568cebb01\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520777 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsqd8\" (UniqueName: 
\"kubernetes.io/projected/bf779ffa-140b-4c10-b42e-9c7568cebb01-kube-api-access-dsqd8\") pod \"bf779ffa-140b-4c10-b42e-9c7568cebb01\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520812 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-metrics-certs-tls-certs\") pod \"bf779ffa-140b-4c10-b42e-9c7568cebb01\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520843 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bf779ffa-140b-4c10-b42e-9c7568cebb01\" (UID: \"bf779ffa-140b-4c10-b42e-9c7568cebb01\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.520948 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2l9j\" (UniqueName: \"kubernetes.io/projected/7a0e2915-1512-4288-8c43-7774fb90542f-kube-api-access-h2l9j\") pod \"7a0e2915-1512-4288-8c43-7774fb90542f\" (UID: \"7a0e2915-1512-4288-8c43-7774fb90542f\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.547264 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "bf779ffa-140b-4c10-b42e-9c7568cebb01" (UID: "bf779ffa-140b-4c10-b42e-9c7568cebb01"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.567019 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "7a0e2915-1512-4288-8c43-7774fb90542f" (UID: "7a0e2915-1512-4288-8c43-7774fb90542f"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.572758 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.573381 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-config" (OuterVolumeSpecName: "config") pod "bf779ffa-140b-4c10-b42e-9c7568cebb01" (UID: "bf779ffa-140b-4c10-b42e-9c7568cebb01"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.573435 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b9bcbc9d4-vddlw"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.573743 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api-log" containerID="cri-o://e6a5a8a7029188f639c27fa6e3638c25cc766067465396ac13f1f7a3e5c9c941" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.574362 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api" containerID="cri-o://09de8cec2a2b8b86818b49eb10353534cc95b1159f2eddd3b1fc8ba166f591d6" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.597731 4917 generic.go:334] "Generic (PLEG): container finished" podID="fae3a765-1a72-4207-8066-d3f8926e8641" containerID="dff717b59d7480b68ce1be64dd75f3257e481d3d0ab97f70359760ba880d621c" exitCode=0 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.598014 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" event={"ID":"fae3a765-1a72-4207-8066-d3f8926e8641","Type":"ContainerDied","Data":"dff717b59d7480b68ce1be64dd75f3257e481d3d0ab97f70359760ba880d621c"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.598501 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0e2915-1512-4288-8c43-7774fb90542f-config" (OuterVolumeSpecName: "config") pod "7a0e2915-1512-4288-8c43-7774fb90542f" (UID: "7a0e2915-1512-4288-8c43-7774fb90542f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.599612 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-scripts" (OuterVolumeSpecName: "scripts") pod "bf779ffa-140b-4c10-b42e-9c7568cebb01" (UID: "bf779ffa-140b-4c10-b42e-9c7568cebb01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.604421 4917 generic.go:334] "Generic (PLEG): container finished" podID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerID="8b2c13a58a1120455d3b9c627b8cf69e354661c43f72f0bcc57f23e360c0f81f" exitCode=2 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.608664 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b83ec86-5c66-4dc6-9236-a437f37611a9","Type":"ContainerDied","Data":"8b2c13a58a1120455d3b9c627b8cf69e354661c43f72f0bcc57f23e360c0f81f"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.614287 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rlzz7_7a0e2915-1512-4288-8c43-7774fb90542f/openstack-network-exporter/0.log" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.614355 4917 generic.go:334] "Generic (PLEG): container finished" podID="7a0e2915-1512-4288-8c43-7774fb90542f" containerID="33af6cfc095e81af7f21997b2ef1a0bd3c1b9021392541c1e2f4d234da292135" exitCode=2 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.614465 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rlzz7" event={"ID":"7a0e2915-1512-4288-8c43-7774fb90542f","Type":"ContainerDied","Data":"33af6cfc095e81af7f21997b2ef1a0bd3c1b9021392541c1e2f4d234da292135"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.614682 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rlzz7" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.625969 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" containerName="galera" containerID="cri-o://f1e89b7d2b258ebb604f705e899b6e2668a5b5bff0002dfc62fd05250dfb7183" gracePeriod=29 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.685626 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "bf779ffa-140b-4c10-b42e-9c7568cebb01" (UID: "bf779ffa-140b-4c10-b42e-9c7568cebb01"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.690121 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0e2915-1512-4288-8c43-7774fb90542f-kube-api-access-h2l9j" (OuterVolumeSpecName: "kube-api-access-h2l9j") pod "7a0e2915-1512-4288-8c43-7774fb90542f" (UID: "7a0e2915-1512-4288-8c43-7774fb90542f"). InnerVolumeSpecName "kube-api-access-h2l9j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.690324 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd" containerID="cri-o://5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" gracePeriod=29 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.694133 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf779ffa-140b-4c10-b42e-9c7568cebb01-kube-api-access-dsqd8" (OuterVolumeSpecName: "kube-api-access-dsqd8") pod "bf779ffa-140b-4c10-b42e-9c7568cebb01" (UID: "bf779ffa-140b-4c10-b42e-9c7568cebb01"). InnerVolumeSpecName "kube-api-access-dsqd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.703461 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-config\") pod \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.703600 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdbserver-sb-tls-certs\") pod \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.703622 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-metrics-certs-tls-certs\") pod \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 
07:10:11.703702 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-combined-ca-bundle\") pod \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.703773 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-scripts\") pod \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.703879 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdb-rundir\") pod \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.703908 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.703985 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4f7r\" (UniqueName: \"kubernetes.io/projected/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-kube-api-access-j4f7r\") pod \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\" (UID: \"9f5fe129-b4bd-40ac-a3f2-6eec0469308b\") " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.707693 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-b749f8cd6-w2w6h"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.708038 4917 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerName="barbican-keystone-listener-log" containerID="cri-o://a36e72dd6a4a408d1a8611cc5d11c6ae06e79402f38ae0b454c8d8abb3d8a397" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.708712 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerName="barbican-keystone-listener" containerID="cri-o://1760f96d4b10c19a42b16a801413b85944dced5f3964dea049426aa637ed35f1" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.713302 4917 generic.go:334] "Generic (PLEG): container finished" podID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerID="c1a1e3e9c9d8361d0c4155702ce697636d2fe57a0d4920a2496cd0fbb0e9a366" exitCode=143 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.713392 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f74bf5646-8prg6" event={"ID":"2140bc3b-8c96-4226-b7c4-811b0724682d","Type":"ContainerDied","Data":"c1a1e3e9c9d8361d0c4155702ce697636d2fe57a0d4920a2496cd0fbb0e9a366"} Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.730744 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.730775 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf779ffa-140b-4c10-b42e-9c7568cebb01-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.730784 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdb-rundir\") on node \"crc\" DevicePath 
\"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.730804 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsqd8\" (UniqueName: \"kubernetes.io/projected/bf779ffa-140b-4c10-b42e-9c7568cebb01-kube-api-access-dsqd8\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.730834 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.730851 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2l9j\" (UniqueName: \"kubernetes.io/projected/7a0e2915-1512-4288-8c43-7774fb90542f-kube-api-access-h2l9j\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.730867 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0e2915-1512-4288-8c43-7774fb90542f-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.730876 4917 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a0e2915-1512-4288-8c43-7774fb90542f-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.732020 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9pm58"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.733291 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-config" (OuterVolumeSpecName: "config") pod "9f5fe129-b4bd-40ac-a3f2-6eec0469308b" (UID: "9f5fe129-b4bd-40ac-a3f2-6eec0469308b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.736403 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" containerID="cri-o://faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" gracePeriod=29 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.736689 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-scripts" (OuterVolumeSpecName: "scripts") pod "9f5fe129-b4bd-40ac-a3f2-6eec0469308b" (UID: "9f5fe129-b4bd-40ac-a3f2-6eec0469308b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.741394 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "9f5fe129-b4bd-40ac-a3f2-6eec0469308b" (UID: "9f5fe129-b4bd-40ac-a3f2-6eec0469308b"). InnerVolumeSpecName "ovsdb-rundir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.741491 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5948845567-w7h4j"] Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.754732 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5948845567-w7h4j" podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerName="barbican-worker-log" containerID="cri-o://f5c44c7ab9e02426289634fa91304fcc0c9c17e0ffb94d8253cd509cae2718d1" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.755398 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5948845567-w7h4j" podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerName="barbican-worker" containerID="cri-o://1e96cd0c2b6ad866d42096dc84192a730d0d27e13ac4658e5fa9dd813dbe217c" gracePeriod=30 Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.803635 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "9f5fe129-b4bd-40ac-a3f2-6eec0469308b" (UID: "9f5fe129-b4bd-40ac-a3f2-6eec0469308b"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.803662 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-kube-api-access-j4f7r" (OuterVolumeSpecName: "kube-api-access-j4f7r") pod "9f5fe129-b4bd-40ac-a3f2-6eec0469308b" (UID: "9f5fe129-b4bd-40ac-a3f2-6eec0469308b"). InnerVolumeSpecName "kube-api-access-j4f7r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.824120 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.835390 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00252edd-28b8-4859-b52a-eccc6226bde2" path="/var/lib/kubelet/pods/00252edd-28b8-4859-b52a-eccc6226bde2/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.839891 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18360b12-be17-4e63-ba90-8afb66e879e4" path="/var/lib/kubelet/pods/18360b12-be17-4e63-ba90-8afb66e879e4/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.840116 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.840400 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5" path="/var/lib/kubelet/pods/1dd22a11-3e2a-4a84-87c1-7eeb19a3e6c5/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.840702 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.841332 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25920c8e-4c1a-42d3-b021-de1b7f8b39d8" path="/var/lib/kubelet/pods/25920c8e-4c1a-42d3-b021-de1b7f8b39d8/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.841532 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.841546 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.841571 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.841594 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4f7r\" (UniqueName: \"kubernetes.io/projected/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-kube-api-access-j4f7r\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.842400 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3861ad88-ced8-438a-8456-9a432a8bd828" path="/var/lib/kubelet/pods/3861ad88-ced8-438a-8456-9a432a8bd828/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.843255 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38903d42-d383-4a16-99a9-a252e9238bb2" path="/var/lib/kubelet/pods/38903d42-d383-4a16-99a9-a252e9238bb2/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.843980 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52afc7b2-83cd-41dc-bdb5-b764acb0af7b" path="/var/lib/kubelet/pods/52afc7b2-83cd-41dc-bdb5-b764acb0af7b/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.844855 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6608e620-c6ed-439f-9a07-67eef576a390" path="/var/lib/kubelet/pods/6608e620-c6ed-439f-9a07-67eef576a390/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.848853 4917 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="73ea0479-f6bc-4136-af63-8e92f46e3891" path="/var/lib/kubelet/pods/73ea0479-f6bc-4136-af63-8e92f46e3891/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.849400 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80378da5-5294-4d7e-92c9-2aba37eb64a1" path="/var/lib/kubelet/pods/80378da5-5294-4d7e-92c9-2aba37eb64a1/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.849968 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c4981c-d457-45ea-8208-57085879a6f5" path="/var/lib/kubelet/pods/87c4981c-d457-45ea-8208-57085879a6f5/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.850858 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a06e756-cae4-44b6-8bed-4951e159e223" path="/var/lib/kubelet/pods/8a06e756-cae4-44b6-8bed-4951e159e223/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.852516 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="926d49c5-2eb9-4bd6-8a38-31a1d02e6b47" path="/var/lib/kubelet/pods/926d49c5-2eb9-4bd6-8a38-31a1d02e6b47/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.853011 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9caf5978-14cb-440d-bb9b-9ad1cc0590af" path="/var/lib/kubelet/pods/9caf5978-14cb-440d-bb9b-9ad1cc0590af/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.853475 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d30d15-8621-4653-b678-2693b301b35f" path="/var/lib/kubelet/pods/a8d30d15-8621-4653-b678-2693b301b35f/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.854616 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd14ce63-7434-4646-9995-5cc41d2a4c6c" path="/var/lib/kubelet/pods/cd14ce63-7434-4646-9995-5cc41d2a4c6c/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.855161 4917 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e4fc471f-181e-4ace-b02d-beb73c8ea737" path="/var/lib/kubelet/pods/e4fc471f-181e-4ace-b02d-beb73c8ea737/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.855684 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edbd50ea-b9fc-479c-8702-82df627467bf" path="/var/lib/kubelet/pods/edbd50ea-b9fc-479c-8702-82df627467bf/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.856154 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f24e450c-1619-42db-92b0-8f8aa6d9a1ab" path="/var/lib/kubelet/pods/f24e450c-1619-42db-92b0-8f8aa6d9a1ab/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.857116 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe31ee4c-2020-409c-a29c-e91f4107a3f3" path="/var/lib/kubelet/pods/fe31ee4c-2020-409c-a29c-e91f4107a3f3/volumes" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.875272 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f5fe129-b4bd-40ac-a3f2-6eec0469308b" (UID: "9f5fe129-b4bd-40ac-a3f2-6eec0469308b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.894069 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.906055 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a0e2915-1512-4288-8c43-7774fb90542f" (UID: "7a0e2915-1512-4288-8c43-7774fb90542f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.906270 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf779ffa-140b-4c10-b42e-9c7568cebb01" (UID: "bf779ffa-140b-4c10-b42e-9c7568cebb01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.917741 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "bf779ffa-140b-4c10-b42e-9c7568cebb01" (UID: "bf779ffa-140b-4c10-b42e-9c7568cebb01"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.943048 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.943076 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.943104 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.943113 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on 
node \"crc\" DevicePath \"\"" Mar 18 07:10:11 crc kubenswrapper[4917]: I0318 07:10:11.943124 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.001810 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "bf779ffa-140b-4c10-b42e-9c7568cebb01" (UID: "bf779ffa-140b-4c10-b42e-9c7568cebb01"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.022239 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "7a0e2915-1512-4288-8c43-7774fb90542f" (UID: "7a0e2915-1512-4288-8c43-7774fb90542f"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.031740 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "9f5fe129-b4bd-40ac-a3f2-6eec0469308b" (UID: "9f5fe129-b4bd-40ac-a3f2-6eec0469308b"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.034261 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "9f5fe129-b4bd-40ac-a3f2-6eec0469308b" (UID: "9f5fe129-b4bd-40ac-a3f2-6eec0469308b"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.042976 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.043010 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.043048 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-6c2bk"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.043221 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a62df143-348a-4ec7-b331-04db3857e847" containerName="nova-scheduler-scheduler" containerID="cri-o://3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567" gracePeriod=30 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.046538 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bf779ffa-140b-4c10-b42e-9c7568cebb01-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.046558 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a0e2915-1512-4288-8c43-7774fb90542f-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.046569 
4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.046591 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f5fe129-b4bd-40ac-a3f2-6eec0469308b-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.074331 4917 scope.go:117] "RemoveContainer" containerID="b1a2420de022669dd8a8a7601588a4eb6184dda29d746ad16041b6928f5afdfb" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.099764 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.102153 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" containerName="rabbitmq" containerID="cri-o://a60676e9973b2a6031a6ea6dba53de6eade7182dfd3e15a64da35230283bab60" gracePeriod=604800 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.111788 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.128122 4917 scope.go:117] "RemoveContainer" containerID="33af6cfc095e81af7f21997b2ef1a0bd3c1b9021392541c1e2f4d234da292135" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.152400 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config-secret\") pod \"fd47f339-3398-4565-82f8-4b715e0b19a9\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.152455 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q5jp\" (UniqueName: \"kubernetes.io/projected/fae3a765-1a72-4207-8066-d3f8926e8641-kube-api-access-7q5jp\") pod \"fae3a765-1a72-4207-8066-d3f8926e8641\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.152474 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-sb\") pod \"fae3a765-1a72-4207-8066-d3f8926e8641\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.152496 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-nb\") pod \"fae3a765-1a72-4207-8066-d3f8926e8641\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.152512 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config\") pod \"fd47f339-3398-4565-82f8-4b715e0b19a9\" (UID: 
\"fd47f339-3398-4565-82f8-4b715e0b19a9\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.152876 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-combined-ca-bundle\") pod \"fd47f339-3398-4565-82f8-4b715e0b19a9\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.152900 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-swift-storage-0\") pod \"fae3a765-1a72-4207-8066-d3f8926e8641\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.152989 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8p6z\" (UniqueName: \"kubernetes.io/projected/fd47f339-3398-4565-82f8-4b715e0b19a9-kube-api-access-n8p6z\") pod \"fd47f339-3398-4565-82f8-4b715e0b19a9\" (UID: \"fd47f339-3398-4565-82f8-4b715e0b19a9\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.153025 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-config\") pod \"fae3a765-1a72-4207-8066-d3f8926e8641\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.153043 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-svc\") pod \"fae3a765-1a72-4207-8066-d3f8926e8641\" (UID: \"fae3a765-1a72-4207-8066-d3f8926e8641\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.164717 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 07:10:12 crc 
kubenswrapper[4917]: I0318 07:10:12.170883 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.186742 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae3a765-1a72-4207-8066-d3f8926e8641-kube-api-access-7q5jp" (OuterVolumeSpecName: "kube-api-access-7q5jp") pod "fae3a765-1a72-4207-8066-d3f8926e8641" (UID: "fae3a765-1a72-4207-8066-d3f8926e8641"). InnerVolumeSpecName "kube-api-access-7q5jp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.189826 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd47f339-3398-4565-82f8-4b715e0b19a9-kube-api-access-n8p6z" (OuterVolumeSpecName: "kube-api-access-n8p6z") pod "fd47f339-3398-4565-82f8-4b715e0b19a9" (UID: "fd47f339-3398-4565-82f8-4b715e0b19a9"). InnerVolumeSpecName "kube-api-access-n8p6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.190328 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fd47f339-3398-4565-82f8-4b715e0b19a9" (UID: "fd47f339-3398-4565-82f8-4b715e0b19a9"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.227315 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd47f339-3398-4565-82f8-4b715e0b19a9" (UID: "fd47f339-3398-4565-82f8-4b715e0b19a9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.240397 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fae3a765-1a72-4207-8066-d3f8926e8641" (UID: "fae3a765-1a72-4207-8066-d3f8926e8641"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.254965 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.254994 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8p6z\" (UniqueName: \"kubernetes.io/projected/fd47f339-3398-4565-82f8-4b715e0b19a9-kube-api-access-n8p6z\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.255005 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q5jp\" (UniqueName: \"kubernetes.io/projected/fae3a765-1a72-4207-8066-d3f8926e8641-kube-api-access-7q5jp\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.255013 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.255021 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.263536 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-conductor-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.263817 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="dd4ec623-4dba-48e7-89f5-6cd3eadce847" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" gracePeriod=30 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.271316 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w8q9f"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.272182 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fae3a765-1a72-4207-8066-d3f8926e8641" (UID: "fae3a765-1a72-4207-8066-d3f8926e8641"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.273983 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fae3a765-1a72-4207-8066-d3f8926e8641" (UID: "fae3a765-1a72-4207-8066-d3f8926e8641"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.279627 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w8q9f"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.297675 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.297854 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="42cfbb53-8521-4c39-98ee-2d666b5682d3" containerName="nova-cell0-conductor-conductor" containerID="cri-o://ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" gracePeriod=30 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.304073 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qj2wp"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.304925 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-config" (OuterVolumeSpecName: "config") pod "fae3a765-1a72-4207-8066-d3f8926e8641" (UID: "fae3a765-1a72-4207-8066-d3f8926e8641"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.306390 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.312302 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-qj2wp"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.316560 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-rlzz7"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.319018 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fd47f339-3398-4565-82f8-4b715e0b19a9" (UID: "fd47f339-3398-4565-82f8-4b715e0b19a9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.320628 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-rlzz7"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.340971 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fae3a765-1a72-4207-8066-d3f8926e8641" (UID: "fae3a765-1a72-4207-8066-d3f8926e8641"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.358687 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.358717 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.358727 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fd47f339-3398-4565-82f8-4b715e0b19a9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.358736 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.358751 4917 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fae3a765-1a72-4207-8066-d3f8926e8641-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.459590 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-combined-ca-bundle\") pod \"ad1b30db-147f-4314-8fbe-ba8aa096be57\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.459853 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-scripts\") pod 
\"ad1b30db-147f-4314-8fbe-ba8aa096be57\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.459959 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkl4n\" (UniqueName: \"kubernetes.io/projected/ad1b30db-147f-4314-8fbe-ba8aa096be57-kube-api-access-pkl4n\") pod \"ad1b30db-147f-4314-8fbe-ba8aa096be57\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.459991 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data-custom\") pod \"ad1b30db-147f-4314-8fbe-ba8aa096be57\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.460008 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad1b30db-147f-4314-8fbe-ba8aa096be57-etc-machine-id\") pod \"ad1b30db-147f-4314-8fbe-ba8aa096be57\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.460092 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data\") pod \"ad1b30db-147f-4314-8fbe-ba8aa096be57\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.464329 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-scripts" (OuterVolumeSpecName: "scripts") pod "ad1b30db-147f-4314-8fbe-ba8aa096be57" (UID: "ad1b30db-147f-4314-8fbe-ba8aa096be57"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.465662 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad1b30db-147f-4314-8fbe-ba8aa096be57-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ad1b30db-147f-4314-8fbe-ba8aa096be57" (UID: "ad1b30db-147f-4314-8fbe-ba8aa096be57"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.472536 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ad1b30db-147f-4314-8fbe-ba8aa096be57" (UID: "ad1b30db-147f-4314-8fbe-ba8aa096be57"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.473473 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1b30db-147f-4314-8fbe-ba8aa096be57-kube-api-access-pkl4n" (OuterVolumeSpecName: "kube-api-access-pkl4n") pod "ad1b30db-147f-4314-8fbe-ba8aa096be57" (UID: "ad1b30db-147f-4314-8fbe-ba8aa096be57"). InnerVolumeSpecName "kube-api-access-pkl4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: E0318 07:10:12.530242 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:12 crc kubenswrapper[4917]: E0318 07:10:12.531156 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:12 crc kubenswrapper[4917]: E0318 07:10:12.534134 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:12 crc kubenswrapper[4917]: E0318 07:10:12.534167 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="42cfbb53-8521-4c39-98ee-2d666b5682d3" containerName="nova-cell0-conductor-conductor" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.564915 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.564940 4917 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pkl4n\" (UniqueName: \"kubernetes.io/projected/ad1b30db-147f-4314-8fbe-ba8aa096be57-kube-api-access-pkl4n\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.564950 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.564959 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ad1b30db-147f-4314-8fbe-ba8aa096be57-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.623083 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ad1b30db-147f-4314-8fbe-ba8aa096be57" (UID: "ad1b30db-147f-4314-8fbe-ba8aa096be57"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.665288 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data" (OuterVolumeSpecName: "config-data") pod "ad1b30db-147f-4314-8fbe-ba8aa096be57" (UID: "ad1b30db-147f-4314-8fbe-ba8aa096be57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.665526 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data\") pod \"ad1b30db-147f-4314-8fbe-ba8aa096be57\" (UID: \"ad1b30db-147f-4314-8fbe-ba8aa096be57\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.665863 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: W0318 07:10:12.665935 4917 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ad1b30db-147f-4314-8fbe-ba8aa096be57/volumes/kubernetes.io~secret/config-data Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.665946 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data" (OuterVolumeSpecName: "config-data") pod "ad1b30db-147f-4314-8fbe-ba8aa096be57" (UID: "ad1b30db-147f-4314-8fbe-ba8aa096be57"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.733705 4917 generic.go:334] "Generic (PLEG): container finished" podID="dbba9465-1e4b-4b42-b512-addd628093d3" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.733777 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jwxq2" event={"ID":"dbba9465-1e4b-4b42-b512-addd628093d3","Type":"ContainerDied","Data":"faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.735124 4917 generic.go:334] "Generic (PLEG): container finished" podID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerID="8eda75a9630a00d3bc84d4255c1197ed99b2146ee96f3b0c2b6c80a91b3dc188" exitCode=143 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.735178 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a55197a-92c3-451c-9d5d-d3a6426c995b","Type":"ContainerDied","Data":"8eda75a9630a00d3bc84d4255c1197ed99b2146ee96f3b0c2b6c80a91b3dc188"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.736616 4917 generic.go:334] "Generic (PLEG): container finished" podID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerID="e6a5a8a7029188f639c27fa6e3638c25cc766067465396ac13f1f7a3e5c9c941" exitCode=143 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.736679 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" event={"ID":"0f3ff50e-a301-4abe-bbaf-2b0075b80b47","Type":"ContainerDied","Data":"e6a5a8a7029188f639c27fa6e3638c25cc766067465396ac13f1f7a3e5c9c941"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.738224 4917 generic.go:334] "Generic (PLEG): container finished" podID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" 
containerID="f1e89b7d2b258ebb604f705e899b6e2668a5b5bff0002dfc62fd05250dfb7183" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.738264 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6976b4d3-75e6-4b74-99db-3fd9acb3a742","Type":"ContainerDied","Data":"f1e89b7d2b258ebb604f705e899b6e2668a5b5bff0002dfc62fd05250dfb7183"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.739462 4917 scope.go:117] "RemoveContainer" containerID="c4b6c0f7460d0188f56f5fc10d1b3881ea6f8779c842ed7425cfa88b2e731c1c" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.739555 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.747736 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8033-account-create-update-6c2bk" event={"ID":"3727d24d-eb93-467b-af3f-66090ad92329","Type":"ContainerStarted","Data":"04a0cdffd1c52fdb804d3680a26721e9ae8cff67c7e30951091b6fb3cd936a9f"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.760391 4917 generic.go:334] "Generic (PLEG): container finished" podID="7f9f629f-c02c-4416-ae21-2b49cea903b5" containerID="4d2961fedfd15a4905cf0158a4313ce7cb45dc6b8b90b25a9434e03f8a7117ad" exitCode=1 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.760469 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2qt5f" event={"ID":"7f9f629f-c02c-4416-ae21-2b49cea903b5","Type":"ContainerDied","Data":"4d2961fedfd15a4905cf0158a4313ce7cb45dc6b8b90b25a9434e03f8a7117ad"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.760494 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2qt5f" event={"ID":"7f9f629f-c02c-4416-ae21-2b49cea903b5","Type":"ContainerStarted","Data":"a810fef54ee8210e0fbe3bd0778873475b840742cb4ca777849bbc2ae07934a1"} Mar 18 07:10:12 crc 
kubenswrapper[4917]: I0318 07:10:12.762441 4917 scope.go:117] "RemoveContainer" containerID="4d2961fedfd15a4905cf0158a4313ce7cb45dc6b8b90b25a9434e03f8a7117ad" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.769516 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ad1b30db-147f-4314-8fbe-ba8aa096be57-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.784176 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.784207 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.784218 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.784300 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.784327 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.784336 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.788338 4917 generic.go:334] "Generic (PLEG): container finished" podID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerID="b7dd7e7dcd0ead924d5151b493b692b8cda6c133769ce8b19017ed20a1493c30" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.788378 4917 generic.go:334] "Generic (PLEG): container finished" podID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerID="34cb7d14ec735ff0e328ee9c042c8e5f4b080009d5c97efbb47af3b0fca71c21" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.788368 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" event={"ID":"b7d0bad6-9874-40d7-8848-e138b487c00e","Type":"ContainerDied","Data":"b7dd7e7dcd0ead924d5151b493b692b8cda6c133769ce8b19017ed20a1493c30"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.788408 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" event={"ID":"b7d0bad6-9874-40d7-8848-e138b487c00e","Type":"ContainerDied","Data":"34cb7d14ec735ff0e328ee9c042c8e5f4b080009d5c97efbb47af3b0fca71c21"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.788428 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" event={"ID":"b7d0bad6-9874-40d7-8848-e138b487c00e","Type":"ContainerDied","Data":"b9ac0a39b967b6f9fa2ed4e25c828093b3222b9edf5f2e0378ad05f5df9cb2b5"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.788439 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9ac0a39b967b6f9fa2ed4e25c828093b3222b9edf5f2e0378ad05f5df9cb2b5" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.795467 4917 generic.go:334] "Generic (PLEG): container finished" podID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" 
containerID="a36e72dd6a4a408d1a8611cc5d11c6ae06e79402f38ae0b454c8d8abb3d8a397" exitCode=143 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.795522 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" event={"ID":"156b1187-23ee-4a81-8d1d-ad91c2468b7d","Type":"ContainerDied","Data":"a36e72dd6a4a408d1a8611cc5d11c6ae06e79402f38ae0b454c8d8abb3d8a397"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.799028 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" event={"ID":"fae3a765-1a72-4207-8066-d3f8926e8641","Type":"ContainerDied","Data":"63f2ce8a5446b6ec29e34554521410cd73e22fec6b8cea1118b0712182c091c2"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.799100 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b6d8fd79c-hkm5m" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.816877 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9f5fe129-b4bd-40ac-a3f2-6eec0469308b/ovsdbserver-sb/0.log" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.816990 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.817264 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9f5fe129-b4bd-40ac-a3f2-6eec0469308b","Type":"ContainerDied","Data":"5cd4f04a6e279ab9756dac383c90c32a4bfedd644d6f51091e5311ffbbb73163"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.826731 4917 generic.go:334] "Generic (PLEG): container finished" podID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerID="3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.826760 4917 generic.go:334] "Generic (PLEG): container finished" podID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerID="53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.826856 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.826941 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad1b30db-147f-4314-8fbe-ba8aa096be57","Type":"ContainerDied","Data":"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.826984 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad1b30db-147f-4314-8fbe-ba8aa096be57","Type":"ContainerDied","Data":"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.826995 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"ad1b30db-147f-4314-8fbe-ba8aa096be57","Type":"ContainerDied","Data":"184f89cb853d19513fbf115a3dddacb62855f8fa97bc71b9e33e0ef276e3cf86"} Mar 18 07:10:12 crc kubenswrapper[4917]: 
I0318 07:10:12.835029 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.837214 4917 generic.go:334] "Generic (PLEG): container finished" podID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerID="a57908c5b95dcdf307be2044978ccc9678a0b22f5e60956f8fbd25abaea6d4fe" exitCode=143 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.837282 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71bf8cc3-5674-418a-a126-f43e3d2f092d","Type":"ContainerDied","Data":"a57908c5b95dcdf307be2044978ccc9678a0b22f5e60956f8fbd25abaea6d4fe"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.837568 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.842881 4917 generic.go:334] "Generic (PLEG): container finished" podID="e4affeee-7968-4e70-b6dd-d8d0f17cfa92" containerID="b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db" exitCode=0 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.842953 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e4affeee-7968-4e70-b6dd-d8d0f17cfa92","Type":"ContainerDied","Data":"b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.842977 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"e4affeee-7968-4e70-b6dd-d8d0f17cfa92","Type":"ContainerDied","Data":"8f801b94c21d98ddc1316b5073c91c34de6be89ce63b0c7026607977fac37c69"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.845041 4917 scope.go:117] "RemoveContainer" containerID="dff717b59d7480b68ce1be64dd75f3257e481d3d0ab97f70359760ba880d621c" Mar 18 07:10:12 crc kubenswrapper[4917]: 
I0318 07:10:12.852551 4917 generic.go:334] "Generic (PLEG): container finished" podID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerID="f5c44c7ab9e02426289634fa91304fcc0c9c17e0ffb94d8253cd509cae2718d1" exitCode=143 Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.852608 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5948845567-w7h4j" event={"ID":"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6","Type":"ContainerDied","Data":"f5c44c7ab9e02426289634fa91304fcc0c9c17e0ffb94d8253cd509cae2718d1"} Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.882408 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-etc-swift\") pod \"b7d0bad6-9874-40d7-8848-e138b487c00e\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.882740 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-log-httpd\") pod \"b7d0bad6-9874-40d7-8848-e138b487c00e\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.882799 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-combined-ca-bundle\") pod \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.882838 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-config-data\") pod \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.882860 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-internal-tls-certs\") pod \"b7d0bad6-9874-40d7-8848-e138b487c00e\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.882899 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9bsq\" (UniqueName: \"kubernetes.io/projected/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-kube-api-access-g9bsq\") pod \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.882931 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-nova-novncproxy-tls-certs\") pod \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.882953 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-public-tls-certs\") pod \"b7d0bad6-9874-40d7-8848-e138b487c00e\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.883014 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-config-data\") pod \"b7d0bad6-9874-40d7-8848-e138b487c00e\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.883049 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpppx\" (UniqueName: 
\"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-kube-api-access-kpppx\") pod \"b7d0bad6-9874-40d7-8848-e138b487c00e\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.883071 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-run-httpd\") pod \"b7d0bad6-9874-40d7-8848-e138b487c00e\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.883085 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-vencrypt-tls-certs\") pod \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\" (UID: \"e4affeee-7968-4e70-b6dd-d8d0f17cfa92\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.883104 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-combined-ca-bundle\") pod \"b7d0bad6-9874-40d7-8848-e138b487c00e\" (UID: \"b7d0bad6-9874-40d7-8848-e138b487c00e\") " Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.883335 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b7d0bad6-9874-40d7-8848-e138b487c00e" (UID: "b7d0bad6-9874-40d7-8848-e138b487c00e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.883483 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.888126 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-kube-api-access-kpppx" (OuterVolumeSpecName: "kube-api-access-kpppx") pod "b7d0bad6-9874-40d7-8848-e138b487c00e" (UID: "b7d0bad6-9874-40d7-8848-e138b487c00e"). InnerVolumeSpecName "kube-api-access-kpppx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.890940 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b7d0bad6-9874-40d7-8848-e138b487c00e" (UID: "b7d0bad6-9874-40d7-8848-e138b487c00e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.890960 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b7d0bad6-9874-40d7-8848-e138b487c00e" (UID: "b7d0bad6-9874-40d7-8848-e138b487c00e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: E0318 07:10:12.897249 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef4b723c7e5e12825354609c888ba3e920eaa69afb8a6ad4df7b13d783acb2c3" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 07:10:12 crc kubenswrapper[4917]: E0318 07:10:12.904396 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef4b723c7e5e12825354609c888ba3e920eaa69afb8a6ad4df7b13d783acb2c3" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 07:10:12 crc kubenswrapper[4917]: E0318 07:10:12.909774 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ef4b723c7e5e12825354609c888ba3e920eaa69afb8a6ad4df7b13d783acb2c3" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 18 07:10:12 crc kubenswrapper[4917]: E0318 07:10:12.909854 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="ovn-northd" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.920282 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-kube-api-access-g9bsq" (OuterVolumeSpecName: "kube-api-access-g9bsq") pod "e4affeee-7968-4e70-b6dd-d8d0f17cfa92" (UID: "e4affeee-7968-4e70-b6dd-d8d0f17cfa92"). 
InnerVolumeSpecName "kube-api-access-g9bsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.920326 4917 scope.go:117] "RemoveContainer" containerID="1c56939532f2bc0f9729b861c994e25e5f036cb964bc64d8c59110624f5b3f66" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.929645 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.941317 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.949646 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.972107 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.984826 4917 scope.go:117] "RemoveContainer" containerID="b15b4f0c051b063f63312e19c1b7a905ea40662728a9caa08205f2747d82f183" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.986774 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9bsq\" (UniqueName: \"kubernetes.io/projected/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-kube-api-access-g9bsq\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.986802 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpppx\" (UniqueName: \"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-kube-api-access-kpppx\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.986817 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b7d0bad6-9874-40d7-8848-e138b487c00e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:12 crc kubenswrapper[4917]: I0318 07:10:12.986830 4917 reconciler_common.go:293] "Volume 
detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b7d0bad6-9874-40d7-8848-e138b487c00e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.008736 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-hkm5m"] Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.017223 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b6d8fd79c-hkm5m"] Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.021062 4917 scope.go:117] "RemoveContainer" containerID="a4a0dce21ceb855c3dedf8606c314f9fc50f4fb26263f29efa1097031e597697" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.043977 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-config-data" (OuterVolumeSpecName: "config-data") pod "e4affeee-7968-4e70-b6dd-d8d0f17cfa92" (UID: "e4affeee-7968-4e70-b6dd-d8d0f17cfa92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.055108 4917 scope.go:117] "RemoveContainer" containerID="3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.066785 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.089720 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-generated\") pod \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.089899 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-default\") pod \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.089921 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcffm\" (UniqueName: \"kubernetes.io/projected/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kube-api-access-gcffm\") pod \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.089970 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-operator-scripts\") pod \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.091059 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "6976b4d3-75e6-4b74-99db-3fd9acb3a742" (UID: "6976b4d3-75e6-4b74-99db-3fd9acb3a742"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.091731 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6976b4d3-75e6-4b74-99db-3fd9acb3a742" (UID: "6976b4d3-75e6-4b74-99db-3fd9acb3a742"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.092186 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "6976b4d3-75e6-4b74-99db-3fd9acb3a742" (UID: "6976b4d3-75e6-4b74-99db-3fd9acb3a742"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.093347 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.093393 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-combined-ca-bundle\") pod \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.093411 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kolla-config\") pod \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " Mar 18 
07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.093427 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-galera-tls-certs\") pod \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\" (UID: \"6976b4d3-75e6-4b74-99db-3fd9acb3a742\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.093987 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.094005 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.094014 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.094023 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.094722 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "6976b4d3-75e6-4b74-99db-3fd9acb3a742" (UID: "6976b4d3-75e6-4b74-99db-3fd9acb3a742"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.117576 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7d0bad6-9874-40d7-8848-e138b487c00e" (UID: "b7d0bad6-9874-40d7-8848-e138b487c00e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.118487 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "6976b4d3-75e6-4b74-99db-3fd9acb3a742" (UID: "6976b4d3-75e6-4b74-99db-3fd9acb3a742"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.122439 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-config-data" (OuterVolumeSpecName: "config-data") pod "b7d0bad6-9874-40d7-8848-e138b487c00e" (UID: "b7d0bad6-9874-40d7-8848-e138b487c00e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.125758 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kube-api-access-gcffm" (OuterVolumeSpecName: "kube-api-access-gcffm") pod "6976b4d3-75e6-4b74-99db-3fd9acb3a742" (UID: "6976b4d3-75e6-4b74-99db-3fd9acb3a742"). InnerVolumeSpecName "kube-api-access-gcffm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.142754 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4affeee-7968-4e70-b6dd-d8d0f17cfa92" (UID: "e4affeee-7968-4e70-b6dd-d8d0f17cfa92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.147958 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b7d0bad6-9874-40d7-8848-e138b487c00e" (UID: "b7d0bad6-9874-40d7-8848-e138b487c00e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.149078 4917 scope.go:117] "RemoveContainer" containerID="53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.169110 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6976b4d3-75e6-4b74-99db-3fd9acb3a742" (UID: "6976b4d3-75e6-4b74-99db-3fd9acb3a742"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.175698 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.181568 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "e4affeee-7968-4e70-b6dd-d8d0f17cfa92" (UID: "e4affeee-7968-4e70-b6dd-d8d0f17cfa92"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.200756 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzgmn\" (UniqueName: \"kubernetes.io/projected/3727d24d-eb93-467b-af3f-66090ad92329-kube-api-access-tzgmn\") pod \"3727d24d-eb93-467b-af3f-66090ad92329\" (UID: \"3727d24d-eb93-467b-af3f-66090ad92329\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.200923 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3727d24d-eb93-467b-af3f-66090ad92329-operator-scripts\") pod \"3727d24d-eb93-467b-af3f-66090ad92329\" (UID: \"3727d24d-eb93-467b-af3f-66090ad92329\") " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201635 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3727d24d-eb93-467b-af3f-66090ad92329-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3727d24d-eb93-467b-af3f-66090ad92329" (UID: "3727d24d-eb93-467b-af3f-66090ad92329"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201661 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201690 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201700 4917 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201711 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201722 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201733 4917 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201744 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201755 4917 
reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.201766 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcffm\" (UniqueName: \"kubernetes.io/projected/6976b4d3-75e6-4b74-99db-3fd9acb3a742-kube-api-access-gcffm\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.203976 4917 scope.go:117] "RemoveContainer" containerID="3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b" Mar 18 07:10:13 crc kubenswrapper[4917]: E0318 07:10:13.204426 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b\": container with ID starting with 3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b not found: ID does not exist" containerID="3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.204461 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b"} err="failed to get container status \"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b\": rpc error: code = NotFound desc = could not find container \"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b\": container with ID starting with 3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b not found: ID does not exist" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.204485 4917 scope.go:117] "RemoveContainer" containerID="53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9" Mar 18 07:10:13 crc kubenswrapper[4917]: E0318 07:10:13.204746 4917 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9\": container with ID starting with 53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9 not found: ID does not exist" containerID="53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.204796 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9"} err="failed to get container status \"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9\": rpc error: code = NotFound desc = could not find container \"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9\": container with ID starting with 53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9 not found: ID does not exist" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.204815 4917 scope.go:117] "RemoveContainer" containerID="3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.205087 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b"} err="failed to get container status \"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b\": rpc error: code = NotFound desc = could not find container \"3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b\": container with ID starting with 3a18689e9d9a67e160c19e291f3939f6b61ba237db0be69002afda9197c57d8b not found: ID does not exist" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.205132 4917 scope.go:117] "RemoveContainer" containerID="53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.205402 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9"} err="failed to get container status \"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9\": rpc error: code = NotFound desc = could not find container \"53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9\": container with ID starting with 53ab47cbd80e5510fb9e65e4273e8c80aa60374739d7ba4a567e04b18b15bae9 not found: ID does not exist" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.205420 4917 scope.go:117] "RemoveContainer" containerID="b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.206549 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3727d24d-eb93-467b-af3f-66090ad92329-kube-api-access-tzgmn" (OuterVolumeSpecName: "kube-api-access-tzgmn") pod "3727d24d-eb93-467b-af3f-66090ad92329" (UID: "3727d24d-eb93-467b-af3f-66090ad92329"). InnerVolumeSpecName "kube-api-access-tzgmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.231684 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "e4affeee-7968-4e70-b6dd-d8d0f17cfa92" (UID: "e4affeee-7968-4e70-b6dd-d8d0f17cfa92"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.237732 4917 scope.go:117] "RemoveContainer" containerID="b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db" Mar 18 07:10:13 crc kubenswrapper[4917]: E0318 07:10:13.238784 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db\": container with ID starting with b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db not found: ID does not exist" containerID="b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.238851 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db"} err="failed to get container status \"b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db\": rpc error: code = NotFound desc = could not find container \"b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db\": container with ID starting with b4144a8989c452ed3f927923a69c5b478e57cebb919563faeb8bca1ca97e29db not found: ID does not exist" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.239150 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "6976b4d3-75e6-4b74-99db-3fd9acb3a742" (UID: "6976b4d3-75e6-4b74-99db-3fd9acb3a742"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.239478 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.271637 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b7d0bad6-9874-40d7-8848-e138b487c00e" (UID: "b7d0bad6-9874-40d7-8848-e138b487c00e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.304813 4917 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4affeee-7968-4e70-b6dd-d8d0f17cfa92-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.304847 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b7d0bad6-9874-40d7-8848-e138b487c00e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.304856 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzgmn\" (UniqueName: \"kubernetes.io/projected/3727d24d-eb93-467b-af3f-66090ad92329-kube-api-access-tzgmn\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.304867 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.304876 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3727d24d-eb93-467b-af3f-66090ad92329-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.304885 4917 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6976b4d3-75e6-4b74-99db-3fd9acb3a742-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:13 crc kubenswrapper[4917]: E0318 07:10:13.509548 4917 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:13 crc kubenswrapper[4917]: E0318 07:10:13.509975 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data podName:4a2ed8f1-269d-45fb-a766-46c867bd0a91 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:17.509956455 +0000 UTC m=+1402.451111169 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data") pod "rabbitmq-cell1-server-0" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91") : configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:13 crc kubenswrapper[4917]: E0318 07:10:13.605430 4917 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 18 07:10:13 crc kubenswrapper[4917]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-18T07:10:11Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 18 07:10:13 crc kubenswrapper[4917]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Mar 18 07:10:13 crc kubenswrapper[4917]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-9gbdd" message=< Mar 18 07:10:13 crc kubenswrapper[4917]: Exiting ovn-controller (1) [FAILED] Mar 18 07:10:13 crc kubenswrapper[4917]: Killing ovn-controller (1) [ 
OK ] Mar 18 07:10:13 crc kubenswrapper[4917]: 2026-03-18T07:10:11Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 18 07:10:13 crc kubenswrapper[4917]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Mar 18 07:10:13 crc kubenswrapper[4917]: > Mar 18 07:10:13 crc kubenswrapper[4917]: E0318 07:10:13.605468 4917 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 18 07:10:13 crc kubenswrapper[4917]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2026-03-18T07:10:11Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Mar 18 07:10:13 crc kubenswrapper[4917]: /etc/init.d/functions: line 589: 379 Alarm clock "$@" Mar 18 07:10:13 crc kubenswrapper[4917]: > pod="openstack/ovn-controller-9gbdd" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerName="ovn-controller" containerID="cri-o://c67ceb625cdd5baa04a13e13da848059276036abb1326784ef355b6956226e2f" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.605504 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-9gbdd" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerName="ovn-controller" containerID="cri-o://c67ceb625cdd5baa04a13e13da848059276036abb1326784ef355b6956226e2f" gracePeriod=27 Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.683327 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.683606 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="ceilometer-central-agent" containerID="cri-o://4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c" gracePeriod=30 Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.684203 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="proxy-httpd" containerID="cri-o://4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10" gracePeriod=30 Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.684248 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="sg-core" containerID="cri-o://7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc" gracePeriod=30 Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.684280 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="ceilometer-notification-agent" containerID="cri-o://eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45" gracePeriod=30 Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.722457 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.722685 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9f997531-b81c-41be-96aa-5f20fe185369" containerName="kube-state-metrics" containerID="cri-o://2ae6bc03fa06e459c8435a83079a258845af95b3fb669f441d6f1382b2e5ce57" gracePeriod=30 Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.790884 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.213:3000/\": read tcp 10.217.0.2:43330->10.217.0.213:3000: read: connection reset by peer" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.845207 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7011ea08-beba-47f5-95e3-bb13aead931a" 
path="/var/lib/kubelet/pods/7011ea08-beba-47f5-95e3-bb13aead931a/volumes" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.847484 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0e2915-1512-4288-8c43-7774fb90542f" path="/var/lib/kubelet/pods/7a0e2915-1512-4288-8c43-7774fb90542f/volumes" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.848746 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" path="/var/lib/kubelet/pods/9f5fe129-b4bd-40ac-a3f2-6eec0469308b/volumes" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.850656 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8af5f47-3cc1-4d7f-a99f-0eae80c27416" path="/var/lib/kubelet/pods/a8af5f47-3cc1-4d7f-a99f-0eae80c27416/volumes" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.851424 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" path="/var/lib/kubelet/pods/ad1b30db-147f-4314-8fbe-ba8aa096be57/volumes" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.853137 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" path="/var/lib/kubelet/pods/bf779ffa-140b-4c10-b42e-9c7568cebb01/volumes" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.853826 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae3a765-1a72-4207-8066-d3f8926e8641" path="/var/lib/kubelet/pods/fae3a765-1a72-4207-8066-d3f8926e8641/volumes" Mar 18 07:10:13 crc kubenswrapper[4917]: I0318 07:10:13.855065 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd47f339-3398-4565-82f8-4b715e0b19a9" path="/var/lib/kubelet/pods/fd47f339-3398-4565-82f8-4b715e0b19a9/volumes" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.040000 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 18 07:10:14 crc 
kubenswrapper[4917]: I0318 07:10:14.040244 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="033e4d58-03e5-49fa-ad5b-169464bf7ba9" containerName="memcached" containerID="cri-o://bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789" gracePeriod=30 Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.048793 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.169:8776/healthcheck\": read tcp 10.217.0.2:41314->10.217.0.169:8776: read: connection reset by peer" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.052069 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.052103 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6976b4d3-75e6-4b74-99db-3fd9acb3a742","Type":"ContainerDied","Data":"12851433605b3bb8094e3ad0300c14dc29a5305e806eb3886ac2bf25df3c97e0"} Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.052163 4917 scope.go:117] "RemoveContainer" containerID="f1e89b7d2b258ebb604f705e899b6e2668a5b5bff0002dfc62fd05250dfb7183" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.079913 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-168c-account-create-update-jq6gc"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.086228 4917 scope.go:117] "RemoveContainer" containerID="15c8e43db4edb4b54e790e0d0ebd8efd89d196a97667fc082ccce43971fca170" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.094295 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-168c-account-create-update-jq6gc"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.103699 4917 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/keystone-168c-account-create-update-kppnw"] Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104199 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0e2915-1512-4288-8c43-7774fb90542f" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104219 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0e2915-1512-4288-8c43-7774fb90542f" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104231 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerName="proxy-httpd" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104238 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerName="proxy-httpd" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104258 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104266 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104274 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerName="proxy-server" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104281 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerName="proxy-server" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104293 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" containerName="mysql-bootstrap" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104300 4917 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" containerName="mysql-bootstrap" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104311 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104317 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104341 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerName="probe" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104348 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerName="probe" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104358 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4affeee-7968-4e70-b6dd-d8d0f17cfa92" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104365 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4affeee-7968-4e70-b6dd-d8d0f17cfa92" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104377 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae3a765-1a72-4207-8066-d3f8926e8641" containerName="init" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104384 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae3a765-1a72-4207-8066-d3f8926e8641" containerName="init" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104398 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="ovsdbserver-sb" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104405 4917 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="ovsdbserver-sb" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104421 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" containerName="galera" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104428 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" containerName="galera" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104444 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerName="cinder-scheduler" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104451 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerName="cinder-scheduler" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104462 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerName="ovsdbserver-nb" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104468 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerName="ovsdbserver-nb" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.104477 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fae3a765-1a72-4207-8066-d3f8926e8641" containerName="dnsmasq-dns" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104483 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae3a765-1a72-4207-8066-d3f8926e8641" containerName="dnsmasq-dns" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104707 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerName="proxy-server" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104726 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerName="cinder-scheduler" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104735 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerName="ovsdbserver-nb" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104746 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="ovsdbserver-sb" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104754 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf779ffa-140b-4c10-b42e-9c7568cebb01" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104767 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" containerName="proxy-httpd" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104781 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1b30db-147f-4314-8fbe-ba8aa096be57" containerName="probe" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104799 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0e2915-1512-4288-8c43-7774fb90542f" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104811 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" containerName="galera" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104822 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="fae3a765-1a72-4207-8066-d3f8926e8641" containerName="dnsmasq-dns" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104834 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f5fe129-b4bd-40ac-a3f2-6eec0469308b" containerName="openstack-network-exporter" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.104841 4917 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e4affeee-7968-4e70-b6dd-d8d0f17cfa92" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.105505 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.107777 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.114897 4917 generic.go:334] "Generic (PLEG): container finished" podID="9f997531-b81c-41be-96aa-5f20fe185369" containerID="2ae6bc03fa06e459c8435a83079a258845af95b3fb669f441d6f1382b2e5ce57" exitCode=2 Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.114975 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f997531-b81c-41be-96aa-5f20fe185369","Type":"ContainerDied","Data":"2ae6bc03fa06e459c8435a83079a258845af95b3fb669f441d6f1382b2e5ce57"} Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.115117 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-qnf9f"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.120419 4917 generic.go:334] "Generic (PLEG): container finished" podID="7f9f629f-c02c-4416-ae21-2b49cea903b5" containerID="e8755e7fa4ce78be1417bae895c80614c14702cc700f0225c2621fb63c8edb09" exitCode=1 Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.120473 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2qt5f" event={"ID":"7f9f629f-c02c-4416-ae21-2b49cea903b5","Type":"ContainerDied","Data":"e8755e7fa4ce78be1417bae895c80614c14702cc700f0225c2621fb63c8edb09"} Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.121267 4917 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-2qt5f" secret="" err="secret \"galera-openstack-dockercfg-zhq7w\" not found" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.121303 4917 scope.go:117] "RemoveContainer" containerID="e8755e7fa4ce78be1417bae895c80614c14702cc700f0225c2621fb63c8edb09" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.121609 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-2qt5f_openstack(7f9f629f-c02c-4416-ae21-2b49cea903b5)\"" pod="openstack/root-account-create-update-2qt5f" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.130287 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-168c-account-create-update-kppnw"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.131099 4917 generic.go:334] "Generic (PLEG): container finished" podID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerID="7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc" exitCode=2 Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.131143 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerDied","Data":"7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc"} Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.136096 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wpl6r"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.141126 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9gbdd_33ddaa4d-48c1-4c81-b0a3-4225b6382496/ovn-controller/0.log" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.141174 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerID="c67ceb625cdd5baa04a13e13da848059276036abb1326784ef355b6956226e2f" exitCode=143 Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.141229 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9gbdd" event={"ID":"33ddaa4d-48c1-4c81-b0a3-4225b6382496","Type":"ContainerDied","Data":"c67ceb625cdd5baa04a13e13da848059276036abb1326784ef355b6956226e2f"} Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.150783 4917 generic.go:334] "Generic (PLEG): container finished" podID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerID="1592e72e88530b3687f95124325d50bfc66a649981b642fed4a4d25898d5c3d7" exitCode=0 Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.150854 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f74bf5646-8prg6" event={"ID":"2140bc3b-8c96-4226-b7c4-811b0724682d","Type":"ContainerDied","Data":"1592e72e88530b3687f95124325d50bfc66a649981b642fed4a4d25898d5c3d7"} Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.155731 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-qnf9f"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.157811 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8033-account-create-update-6c2bk" event={"ID":"3727d24d-eb93-467b-af3f-66090ad92329","Type":"ContainerDied","Data":"04a0cdffd1c52fdb804d3680a26721e9ae8cff67c7e30951091b6fb3cd936a9f"} Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.157928 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8033-account-create-update-6c2bk" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.166312 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-7f954fcdf9-tt6r4" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.166345 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.176054 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wpl6r"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.198729 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f48d59955-cprlv"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.198942 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-f48d59955-cprlv" podUID="3fe020eb-0bd4-4efa-9711-3f07ce31907c" containerName="keystone-api" containerID="cri-o://e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b" gracePeriod=30 Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.213880 4917 scope.go:117] "RemoveContainer" containerID="4d2961fedfd15a4905cf0158a4313ce7cb45dc6b8b90b25a9434e03f8a7117ad" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.217723 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.232325 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-168c-account-create-update-kppnw"] Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.232949 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-rzngt operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-168c-account-create-update-kppnw" podUID="55bbcadf-3a8c-4ed7-a650-073e43362ac7" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.237841 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-8xnsx"] Mar 18 07:10:14 crc 
kubenswrapper[4917]: I0318 07:10:14.245017 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-8xnsx"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.245294 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts\") pod \"keystone-168c-account-create-update-kppnw\" (UID: \"55bbcadf-3a8c-4ed7-a650-073e43362ac7\") " pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.245408 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzngt\" (UniqueName: \"kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt\") pod \"keystone-168c-account-create-update-kppnw\" (UID: \"55bbcadf-3a8c-4ed7-a650-073e43362ac7\") " pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.245613 4917 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.245651 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts podName:7f9f629f-c02c-4416-ae21-2b49cea903b5 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:14.745637726 +0000 UTC m=+1399.686792440 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts") pod "root-account-create-update-2qt5f" (UID: "7f9f629f-c02c-4416-ae21-2b49cea903b5") : configmap "openstack-scripts" not found Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.248524 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2qt5f"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.248862 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": read tcp 10.217.0.2:56856->10.217.0.208:8774: read: connection reset by peer" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.248952 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": read tcp 10.217.0.2:56872->10.217.0.208:8774: read: connection reset by peer" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.260289 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9gbdd_33ddaa4d-48c1-4c81-b0a3-4225b6382496/ovn-controller/0.log" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.260368 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9gbdd" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.281639 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.297655 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.352110 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts\") pod \"keystone-168c-account-create-update-kppnw\" (UID: \"55bbcadf-3a8c-4ed7-a650-073e43362ac7\") " pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.352229 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzngt\" (UniqueName: \"kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt\") pod \"keystone-168c-account-create-update-kppnw\" (UID: \"55bbcadf-3a8c-4ed7-a650-073e43362ac7\") " pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.363156 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-6c2bk"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.381531 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8033-account-create-update-6c2bk"] Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.381678 4917 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.381919 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts 
podName:55bbcadf-3a8c-4ed7-a650-073e43362ac7 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:14.881899466 +0000 UTC m=+1399.823054180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts") pod "keystone-168c-account-create-update-kppnw" (UID: "55bbcadf-3a8c-4ed7-a650-073e43362ac7") : configmap "openstack-scripts" not found Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.390336 4917 projected.go:194] Error preparing data for projected volume kube-api-access-rzngt for pod openstack/keystone-168c-account-create-update-kppnw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.390400 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt podName:55bbcadf-3a8c-4ed7-a650-073e43362ac7 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:14.890385303 +0000 UTC m=+1399.831540017 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rzngt" (UniqueName: "kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt") pod "keystone-168c-account-create-update-kppnw" (UID: "55bbcadf-3a8c-4ed7-a650-073e43362ac7") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.402472 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-7f954fcdf9-tt6r4"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.418039 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-7f954fcdf9-tt6r4"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.422050 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.431739 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.455852 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run\") pod \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.455897 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-combined-ca-bundle\") pod \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.455960 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-log-ovn\") pod \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\" (UID: 
\"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.456024 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnw6x\" (UniqueName: \"kubernetes.io/projected/33ddaa4d-48c1-4c81-b0a3-4225b6382496-kube-api-access-wnw6x\") pod \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.456069 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-ovn-controller-tls-certs\") pod \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.456145 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33ddaa4d-48c1-4c81-b0a3-4225b6382496-scripts\") pod \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.456172 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run-ovn\") pod \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\" (UID: \"33ddaa4d-48c1-4c81-b0a3-4225b6382496\") " Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.457091 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "33ddaa4d-48c1-4c81-b0a3-4225b6382496" (UID: "33ddaa4d-48c1-4c81-b0a3-4225b6382496"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.457126 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run" (OuterVolumeSpecName: "var-run") pod "33ddaa4d-48c1-4c81-b0a3-4225b6382496" (UID: "33ddaa4d-48c1-4c81-b0a3-4225b6382496"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.463104 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "33ddaa4d-48c1-4c81-b0a3-4225b6382496" (UID: "33ddaa4d-48c1-4c81-b0a3-4225b6382496"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.464020 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33ddaa4d-48c1-4c81-b0a3-4225b6382496-scripts" (OuterVolumeSpecName: "scripts") pod "33ddaa4d-48c1-4c81-b0a3-4225b6382496" (UID: "33ddaa4d-48c1-4c81-b0a3-4225b6382496"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.494880 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33ddaa4d-48c1-4c81-b0a3-4225b6382496-kube-api-access-wnw6x" (OuterVolumeSpecName: "kube-api-access-wnw6x") pod "33ddaa4d-48c1-4c81-b0a3-4225b6382496" (UID: "33ddaa4d-48c1-4c81-b0a3-4225b6382496"). InnerVolumeSpecName "kube-api-access-wnw6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.512887 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.512984 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.516785 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.516884 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="84500248-15f0-4049-8423-43502d6587cf" containerName="galera" containerID="cri-o://2e38a061b090ac1a57bf820e3f8a49ce670631af9f0ea08d5ab50da29f72d9c6" gracePeriod=30 Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.516974 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container 
process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.527176 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.527244 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.527477 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.527503 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.555392 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33ddaa4d-48c1-4c81-b0a3-4225b6382496" (UID: "33ddaa4d-48c1-4c81-b0a3-4225b6382496"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.558549 4917 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.558576 4917 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-run\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.558599 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.558610 4917 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/33ddaa4d-48c1-4c81-b0a3-4225b6382496-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.558618 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnw6x\" (UniqueName: \"kubernetes.io/projected/33ddaa4d-48c1-4c81-b0a3-4225b6382496-kube-api-access-wnw6x\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.558626 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/33ddaa4d-48c1-4c81-b0a3-4225b6382496-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.566217 4917 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.182:9292/healthcheck\": read tcp 10.217.0.2:47268->10.217.0.182:9292: read: connection reset by peer" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.566426 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.182:9292/healthcheck\": read tcp 10.217.0.2:47278->10.217.0.182:9292: read: connection reset by peer" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.670754 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "33ddaa4d-48c1-4c81-b0a3-4225b6382496" (UID: "33ddaa4d-48c1-4c81-b0a3-4225b6382496"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.671797 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/33ddaa4d-48c1-4c81-b0a3-4225b6382496-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.773141 4917 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.773404 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts podName:7f9f629f-c02c-4416-ae21-2b49cea903b5 nodeName:}" failed. 
No retries permitted until 2026-03-18 07:10:15.773390304 +0000 UTC m=+1400.714545018 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts") pod "root-account-create-update-2qt5f" (UID: "7f9f629f-c02c-4416-ae21-2b49cea903b5") : configmap "openstack-scripts" not found Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.981790 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts\") pod \"keystone-168c-account-create-update-kppnw\" (UID: \"55bbcadf-3a8c-4ed7-a650-073e43362ac7\") " pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.981882 4917 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.981923 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts podName:55bbcadf-3a8c-4ed7-a650-073e43362ac7 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:15.981910421 +0000 UTC m=+1400.923065135 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts") pod "keystone-168c-account-create-update-kppnw" (UID: "55bbcadf-3a8c-4ed7-a650-073e43362ac7") : configmap "openstack-scripts" not found Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.982206 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzngt\" (UniqueName: \"kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt\") pod \"keystone-168c-account-create-update-kppnw\" (UID: \"55bbcadf-3a8c-4ed7-a650-073e43362ac7\") " pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.988119 4917 projected.go:194] Error preparing data for projected volume kube-api-access-rzngt for pod openstack/keystone-168c-account-create-update-kppnw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.988234 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt podName:55bbcadf-3a8c-4ed7-a650-073e43362ac7 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:15.988187964 +0000 UTC m=+1400.929342678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rzngt" (UniqueName: "kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt") pod "keystone-168c-account-create-update-kppnw" (UID: "55bbcadf-3a8c-4ed7-a650-073e43362ac7") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 07:10:14 crc kubenswrapper[4917]: E0318 07:10:14.989163 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.989207 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.993881 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.187:9292/healthcheck\": read tcp 10.217.0.2:39800->10.217.0.187:9292: read: connection reset by peer" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.993987 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.187:9292/healthcheck\": read tcp 10.217.0.2:39788->10.217.0.187:9292: read: connection reset by peer" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.995320 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 07:10:14 crc kubenswrapper[4917]: I0318 07:10:14.995414 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6f74bf5646-8prg6" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.000024 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.012449 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.012552 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="dd4ec623-4dba-48e7-89f5-6cd3eadce847" containerName="nova-cell1-conductor-conductor" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.064144 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.100664 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.170:9311/healthcheck\": read tcp 10.217.0.2:43354->10.217.0.170:9311: read: connection reset by peer" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.100714 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.170:9311/healthcheck\": read tcp 10.217.0.2:43348->10.217.0.170:9311: read: connection reset by peer" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202314 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-combined-ca-bundle\") pod \"2140bc3b-8c96-4226-b7c4-811b0724682d\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202383 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-scripts\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202410 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-842zg\" (UniqueName: \"kubernetes.io/projected/2140bc3b-8c96-4226-b7c4-811b0724682d-kube-api-access-842zg\") pod \"2140bc3b-8c96-4226-b7c4-811b0724682d\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202428 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v4rg\" (UniqueName: \"kubernetes.io/projected/9f997531-b81c-41be-96aa-5f20fe185369-kube-api-access-8v4rg\") pod \"9f997531-b81c-41be-96aa-5f20fe185369\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202472 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data-custom\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202488 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2140bc3b-8c96-4226-b7c4-811b0724682d-logs\") pod \"2140bc3b-8c96-4226-b7c4-811b0724682d\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202505 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75qmk\" (UniqueName: \"kubernetes.io/projected/54ac6ac1-72cb-4383-8206-92169da43249-kube-api-access-75qmk\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202538 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-combined-ca-bundle\") pod \"9f997531-b81c-41be-96aa-5f20fe185369\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202557 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-public-tls-certs\") pod 
\"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202574 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54ac6ac1-72cb-4383-8206-92169da43249-etc-machine-id\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202611 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-config-data\") pod \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202628 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-combined-ca-bundle\") pod \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202658 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2wt6\" (UniqueName: \"kubernetes.io/projected/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-kube-api-access-c2wt6\") pod \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202704 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-scripts\") pod \"2140bc3b-8c96-4226-b7c4-811b0724682d\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202721 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-certs\") pod \"9f997531-b81c-41be-96aa-5f20fe185369\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202759 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-public-tls-certs\") pod \"2140bc3b-8c96-4226-b7c4-811b0724682d\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202781 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202837 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-config\") pod \"9f997531-b81c-41be-96aa-5f20fe185369\" (UID: \"9f997531-b81c-41be-96aa-5f20fe185369\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202858 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-internal-tls-certs\") pod \"2140bc3b-8c96-4226-b7c4-811b0724682d\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202897 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-logs\") pod \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\" (UID: 
\"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202939 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-public-tls-certs\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202957 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-config-data\") pod \"2140bc3b-8c96-4226-b7c4-811b0724682d\" (UID: \"2140bc3b-8c96-4226-b7c4-811b0724682d\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.202975 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-internal-tls-certs\") pod \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\" (UID: \"52bd1e19-7e65-4ea9-94bd-ad7edee4be11\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.203014 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-combined-ca-bundle\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.203039 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ac6ac1-72cb-4383-8206-92169da43249-logs\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.203077 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-internal-tls-certs\") pod \"54ac6ac1-72cb-4383-8206-92169da43249\" (UID: \"54ac6ac1-72cb-4383-8206-92169da43249\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.207353 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2140bc3b-8c96-4226-b7c4-811b0724682d-logs" (OuterVolumeSpecName: "logs") pod "2140bc3b-8c96-4226-b7c4-811b0724682d" (UID: "2140bc3b-8c96-4226-b7c4-811b0724682d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.208549 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-kube-api-access-c2wt6" (OuterVolumeSpecName: "kube-api-access-c2wt6") pod "52bd1e19-7e65-4ea9-94bd-ad7edee4be11" (UID: "52bd1e19-7e65-4ea9-94bd-ad7edee4be11"). InnerVolumeSpecName "kube-api-access-c2wt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.208632 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54ac6ac1-72cb-4383-8206-92169da43249-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.209054 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9vf8d"] Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.210056 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerName="ovn-controller" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.210218 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerName="ovn-controller" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.210305 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api-log" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.210460 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api-log" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.210545 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f997531-b81c-41be-96aa-5f20fe185369" containerName="kube-state-metrics" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.210625 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f997531-b81c-41be-96aa-5f20fe185369" containerName="kube-state-metrics" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.210773 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerName="placement-log" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.210849 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerName="placement-log" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.211012 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" 
containerName="nova-api-api" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.211086 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-api" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.211478 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.211549 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.211621 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-log" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.211671 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-log" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.211825 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerName="placement-api" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.211877 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerName="placement-api" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.212092 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-log" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.212159 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerName="placement-api" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.212225 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" containerName="placement-log" Mar 18 
07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.212274 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerName="nova-api-api" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.212329 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" containerName="ovn-controller" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.212379 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api-log" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.212430 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f997531-b81c-41be-96aa-5f20fe185369" containerName="kube-state-metrics" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.212486 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="54ac6ac1-72cb-4383-8206-92169da43249" containerName="cinder-api" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.213917 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.217572 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vf8d"] Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.221500 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-scripts" (OuterVolumeSpecName: "scripts") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.222941 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2140bc3b-8c96-4226-b7c4-811b0724682d-kube-api-access-842zg" (OuterVolumeSpecName: "kube-api-access-842zg") pod "2140bc3b-8c96-4226-b7c4-811b0724682d" (UID: "2140bc3b-8c96-4226-b7c4-811b0724682d"). InnerVolumeSpecName "kube-api-access-842zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.223393 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f997531-b81c-41be-96aa-5f20fe185369-kube-api-access-8v4rg" (OuterVolumeSpecName: "kube-api-access-8v4rg") pod "9f997531-b81c-41be-96aa-5f20fe185369" (UID: "9f997531-b81c-41be-96aa-5f20fe185369"). InnerVolumeSpecName "kube-api-access-8v4rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.224155 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-logs" (OuterVolumeSpecName: "logs") pod "52bd1e19-7e65-4ea9-94bd-ad7edee4be11" (UID: "52bd1e19-7e65-4ea9-94bd-ad7edee4be11"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.224675 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54ac6ac1-72cb-4383-8206-92169da43249-logs" (OuterVolumeSpecName: "logs") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.240416 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-scripts" (OuterVolumeSpecName: "scripts") pod "2140bc3b-8c96-4226-b7c4-811b0724682d" (UID: "2140bc3b-8c96-4226-b7c4-811b0724682d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.241517 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.242003 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.242401 4917 generic.go:334] "Generic (PLEG): container finished" podID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerID="5cf727c6f8e456b706e1dae3227882a5fd58a35e0cebf945f9304e0df9d9673a" exitCode=0 Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.242475 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71bf8cc3-5674-418a-a126-f43e3d2f092d","Type":"ContainerDied","Data":"5cf727c6f8e456b706e1dae3227882a5fd58a35e0cebf945f9304e0df9d9673a"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.248612 4917 generic.go:334] "Generic (PLEG): container finished" podID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerID="4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10" exitCode=0 Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.248635 4917 generic.go:334] "Generic (PLEG): container finished" podID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerID="4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c" exitCode=0 Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.248642 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerDied","Data":"4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.248681 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerDied","Data":"4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.250665 4917 generic.go:334] "Generic (PLEG): container finished" podID="a62df143-348a-4ec7-b331-04db3857e847" containerID="3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567" exitCode=0 Mar 18 07:10:15 
crc kubenswrapper[4917]: I0318 07:10:15.250727 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a62df143-348a-4ec7-b331-04db3857e847","Type":"ContainerDied","Data":"3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.253541 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54ac6ac1-72cb-4383-8206-92169da43249-kube-api-access-75qmk" (OuterVolumeSpecName: "kube-api-access-75qmk") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "kube-api-access-75qmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.299766 4917 generic.go:334] "Generic (PLEG): container finished" podID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerID="09de8cec2a2b8b86818b49eb10353534cc95b1159f2eddd3b1fc8ba166f591d6" exitCode=0 Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.299863 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" event={"ID":"0f3ff50e-a301-4abe-bbaf-2b0075b80b47","Type":"ContainerDied","Data":"09de8cec2a2b8b86818b49eb10353534cc95b1159f2eddd3b1fc8ba166f591d6"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311846 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311873 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54ac6ac1-72cb-4383-8206-92169da43249-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311883 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311891 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-842zg\" (UniqueName: \"kubernetes.io/projected/2140bc3b-8c96-4226-b7c4-811b0724682d-kube-api-access-842zg\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311899 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v4rg\" (UniqueName: \"kubernetes.io/projected/9f997531-b81c-41be-96aa-5f20fe185369-kube-api-access-8v4rg\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311908 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311915 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2140bc3b-8c96-4226-b7c4-811b0724682d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311924 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75qmk\" (UniqueName: \"kubernetes.io/projected/54ac6ac1-72cb-4383-8206-92169da43249-kube-api-access-75qmk\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311932 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/54ac6ac1-72cb-4383-8206-92169da43249-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311941 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2wt6\" (UniqueName: \"kubernetes.io/projected/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-kube-api-access-c2wt6\") on node 
\"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.311949 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.322659 4917 generic.go:334] "Generic (PLEG): container finished" podID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerID="1760f96d4b10c19a42b16a801413b85944dced5f3964dea049426aa637ed35f1" exitCode=0 Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.322715 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" event={"ID":"156b1187-23ee-4a81-8d1d-ad91c2468b7d","Type":"ContainerDied","Data":"1760f96d4b10c19a42b16a801413b85944dced5f3964dea049426aa637ed35f1"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.332903 4917 generic.go:334] "Generic (PLEG): container finished" podID="54ac6ac1-72cb-4383-8206-92169da43249" containerID="dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f" exitCode=0 Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.332965 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54ac6ac1-72cb-4383-8206-92169da43249","Type":"ContainerDied","Data":"dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.332990 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"54ac6ac1-72cb-4383-8206-92169da43249","Type":"ContainerDied","Data":"1b44ce85d63978bf96bfe8a750d393bbadcce33ce70e04f1c7203d1352090821"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.333005 4917 scope.go:117] "RemoveContainer" containerID="dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.333095 4917 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.348037 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9f997531-b81c-41be-96aa-5f20fe185369","Type":"ContainerDied","Data":"17e9d91fb1d676e3afead599c257a6b07d3b1d575de0085c26816acf488d8a55"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.348098 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.366780 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a55197a-92c3-451c-9d5d-d3a6426c995b","Type":"ContainerDied","Data":"ea57b6c51e2a6d56fc3ad907733130bd48fe5600650c29357de522f2e527c1cf"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.366386 4917 generic.go:334] "Generic (PLEG): container finished" podID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerID="ea57b6c51e2a6d56fc3ad907733130bd48fe5600650c29357de522f2e527c1cf" exitCode=0 Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.376563 4917 scope.go:117] "RemoveContainer" containerID="33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.377136 4917 generic.go:334] "Generic (PLEG): container finished" podID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerID="1e96cd0c2b6ad866d42096dc84192a730d0d27e13ac4658e5fa9dd813dbe217c" exitCode=0 Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.377228 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5948845567-w7h4j" event={"ID":"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6","Type":"ContainerDied","Data":"1e96cd0c2b6ad866d42096dc84192a730d0d27e13ac4658e5fa9dd813dbe217c"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.380702 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "9f997531-b81c-41be-96aa-5f20fe185369" (UID: "9f997531-b81c-41be-96aa-5f20fe185369"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.390979 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f997531-b81c-41be-96aa-5f20fe185369" (UID: "9f997531-b81c-41be-96aa-5f20fe185369"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.398625 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9gbdd_33ddaa4d-48c1-4c81-b0a3-4225b6382496/ovn-controller/0.log" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.398742 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9gbdd" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.399795 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9gbdd" event={"ID":"33ddaa4d-48c1-4c81-b0a3-4225b6382496","Type":"ContainerDied","Data":"f6993223374c6901f85cb9d0491bb05d4720fb6709864af33102cac30541e4cc"} Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.405306 4917 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-2qt5f" secret="" err="secret \"galera-openstack-dockercfg-zhq7w\" not found" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.405442 4917 scope.go:117] "RemoveContainer" containerID="e8755e7fa4ce78be1417bae895c80614c14702cc700f0225c2621fb63c8edb09" Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.405805 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-2qt5f_openstack(7f9f629f-c02c-4416-ae21-2b49cea903b5)\"" pod="openstack/root-account-create-update-2qt5f" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.412410 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kolla-config\") pod \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.413509 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g44kl\" (UniqueName: \"kubernetes.io/projected/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kube-api-access-g44kl\") pod \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.413632 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-combined-ca-bundle\") pod \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.413748 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-config-data\") pod \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.413879 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-memcached-tls-certs\") pod \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\" (UID: \"033e4d58-03e5-49fa-ad5b-169464bf7ba9\") " Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.413990 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "033e4d58-03e5-49fa-ad5b-169464bf7ba9" (UID: "033e4d58-03e5-49fa-ad5b-169464bf7ba9"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.414616 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-utilities\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.414634 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-config-data" (OuterVolumeSpecName: "config-data") pod "033e4d58-03e5-49fa-ad5b-169464bf7ba9" (UID: "033e4d58-03e5-49fa-ad5b-169464bf7ba9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.414878 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-catalog-content\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.414916 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g8sk\" (UniqueName: \"kubernetes.io/projected/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-kube-api-access-4g8sk\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.415123 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.415138 4917 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.415147 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/033e4d58-03e5-49fa-ad5b-169464bf7ba9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.415158 4917 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:15 
crc kubenswrapper[4917]: I0318 07:10:15.418503 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2140bc3b-8c96-4226-b7c4-811b0724682d" (UID: "2140bc3b-8c96-4226-b7c4-811b0724682d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.418935 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6f74bf5646-8prg6" event={"ID":"2140bc3b-8c96-4226-b7c4-811b0724682d","Type":"ContainerDied","Data":"d41cab000d3a29afef33509ed9b90327b6aa87400a3addf1dcc69f927f8d56ae"}
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.419281 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6f74bf5646-8prg6"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.427462 4917 generic.go:334] "Generic (PLEG): container finished" podID="033e4d58-03e5-49fa-ad5b-169464bf7ba9" containerID="bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789" exitCode=0
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.427552 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"033e4d58-03e5-49fa-ad5b-169464bf7ba9","Type":"ContainerDied","Data":"bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789"}
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.427578 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"033e4d58-03e5-49fa-ad5b-169464bf7ba9","Type":"ContainerDied","Data":"971f2be52f502a5e65d5e4a46858d48495433a69bd5670e36c859cff5013ff1c"}
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.427656 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.443709 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kube-api-access-g44kl" (OuterVolumeSpecName: "kube-api-access-g44kl") pod "033e4d58-03e5-49fa-ad5b-169464bf7ba9" (UID: "033e4d58-03e5-49fa-ad5b-169464bf7ba9"). InnerVolumeSpecName "kube-api-access-g44kl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.445254 4917 generic.go:334] "Generic (PLEG): container finished" podID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" containerID="528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2" exitCode=0
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.445368 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.445510 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52bd1e19-7e65-4ea9-94bd-ad7edee4be11","Type":"ContainerDied","Data":"528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2"}
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.445568 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52bd1e19-7e65-4ea9-94bd-ad7edee4be11","Type":"ContainerDied","Data":"7d1947684feeb1d3d961873e89111d53eee000dbdde84ca653c38b0793e88787"}
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.447375 4917 scope.go:117] "RemoveContainer" containerID="dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f"
Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.458339 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f\": container with ID starting with dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f not found: ID does not exist" containerID="dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.458392 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f"} err="failed to get container status \"dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f\": rpc error: code = NotFound desc = could not find container \"dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f\": container with ID starting with dd0344d6af0bee9f7b75f93384834e2c2dd52082268da1f1c53dd213e61ddf2f not found: ID does not exist"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.458420 4917 scope.go:117] "RemoveContainer" containerID="33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.461354 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9gbdd"]
Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.461902 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744\": container with ID starting with 33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744 not found: ID does not exist" containerID="33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.461988 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744"} err="failed to get container status \"33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744\": rpc error: code = NotFound desc = could not find container \"33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744\": container with ID starting with 33e6daa3cf81908ec93547b215acf8b7a685cdef2730d1b507937fbd8def7744 not found: ID does not exist"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.462013 4917 scope.go:117] "RemoveContainer" containerID="2ae6bc03fa06e459c8435a83079a258845af95b3fb669f441d6f1382b2e5ce57"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.465924 4917 generic.go:334] "Generic (PLEG): container finished" podID="06578cec-8f13-4daf-966e-89f743a134fe" containerID="a8adb26cc0c60d07e23f6ba71eefaac96183dc1c6cfa0982a9403686a18c8986" exitCode=0
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.466003 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-168c-account-create-update-kppnw"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.466010 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06578cec-8f13-4daf-966e-89f743a134fe","Type":"ContainerDied","Data":"a8adb26cc0c60d07e23f6ba71eefaac96183dc1c6cfa0982a9403686a18c8986"}
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.469874 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9gbdd"]
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.506923 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-config-data" (OuterVolumeSpecName: "config-data") pod "52bd1e19-7e65-4ea9-94bd-ad7edee4be11" (UID: "52bd1e19-7e65-4ea9-94bd-ad7edee4be11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.511746 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.516301 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-utilities\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.516437 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-catalog-content\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.516470 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g8sk\" (UniqueName: \"kubernetes.io/projected/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-kube-api-access-4g8sk\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.516576 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.516612 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.516624 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.516637 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g44kl\" (UniqueName: \"kubernetes.io/projected/033e4d58-03e5-49fa-ad5b-169464bf7ba9-kube-api-access-g44kl\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.516741 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52bd1e19-7e65-4ea9-94bd-ad7edee4be11" (UID: "52bd1e19-7e65-4ea9-94bd-ad7edee4be11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.517272 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-utilities\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.517347 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-catalog-content\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.539571 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "033e4d58-03e5-49fa-ad5b-169464bf7ba9" (UID: "033e4d58-03e5-49fa-ad5b-169464bf7ba9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.551932 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g8sk\" (UniqueName: \"kubernetes.io/projected/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-kube-api-access-4g8sk\") pod \"redhat-operators-9vf8d\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " pod="openshift-marketplace/redhat-operators-9vf8d"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.561827 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52bd1e19-7e65-4ea9-94bd-ad7edee4be11" (UID: "52bd1e19-7e65-4ea9-94bd-ad7edee4be11"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.584250 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.590156 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-config-data" (OuterVolumeSpecName: "config-data") pod "2140bc3b-8c96-4226-b7c4-811b0724682d" (UID: "2140bc3b-8c96-4226-b7c4-811b0724682d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.596151 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data" (OuterVolumeSpecName: "config-data") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.597428 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "54ac6ac1-72cb-4383-8206-92169da43249" (UID: "54ac6ac1-72cb-4383-8206-92169da43249"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.601750 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "52bd1e19-7e65-4ea9-94bd-ad7edee4be11" (UID: "52bd1e19-7e65-4ea9-94bd-ad7edee4be11"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.619877 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.620132 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.620202 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.620259 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.620324 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.620384 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52bd1e19-7e65-4ea9-94bd-ad7edee4be11-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.620450 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.620517 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54ac6ac1-72cb-4383-8206-92169da43249-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.652570 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "9f997531-b81c-41be-96aa-5f20fe185369" (UID: "9f997531-b81c-41be-96aa-5f20fe185369"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.688324 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2140bc3b-8c96-4226-b7c4-811b0724682d" (UID: "2140bc3b-8c96-4226-b7c4-811b0724682d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.691667 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "033e4d58-03e5-49fa-ad5b-169464bf7ba9" (UID: "033e4d58-03e5-49fa-ad5b-169464bf7ba9"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.728805 4917 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/9f997531-b81c-41be-96aa-5f20fe185369-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.728833 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.728844 4917 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/033e4d58-03e5-49fa-ad5b-169464bf7ba9-memcached-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.739555 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2140bc3b-8c96-4226-b7c4-811b0724682d" (UID: "2140bc3b-8c96-4226-b7c4-811b0724682d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.786533 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16611612-0229-4cd5-9877-ccccc8bf60de" path="/var/lib/kubelet/pods/16611612-0229-4cd5-9877-ccccc8bf60de/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.787244 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c53afd0-a49e-4af6-a08d-181a6227a31e" path="/var/lib/kubelet/pods/1c53afd0-a49e-4af6-a08d-181a6227a31e/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.788398 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33ddaa4d-48c1-4c81-b0a3-4225b6382496" path="/var/lib/kubelet/pods/33ddaa4d-48c1-4c81-b0a3-4225b6382496/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.794395 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3727d24d-eb93-467b-af3f-66090ad92329" path="/var/lib/kubelet/pods/3727d24d-eb93-467b-af3f-66090ad92329/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.794873 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3" path="/var/lib/kubelet/pods/5b8d6634-9b5f-4dd2-8b28-fa72c44c15f3/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.795780 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6976b4d3-75e6-4b74-99db-3fd9acb3a742" path="/var/lib/kubelet/pods/6976b4d3-75e6-4b74-99db-3fd9acb3a742/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.796494 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cbf9993-0d83-499a-8cc5-11662e0641e1" path="/var/lib/kubelet/pods/9cbf9993-0d83-499a-8cc5-11662e0641e1/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.797083 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d0bad6-9874-40d7-8848-e138b487c00e" path="/var/lib/kubelet/pods/b7d0bad6-9874-40d7-8848-e138b487c00e/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.798042 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4affeee-7968-4e70-b6dd-d8d0f17cfa92" path="/var/lib/kubelet/pods/e4affeee-7968-4e70-b6dd-d8d0f17cfa92/volumes"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.820287 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-168c-account-create-update-kppnw"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.830665 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2140bc3b-8c96-4226-b7c4-811b0724682d-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.831758 4917 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.831804 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts podName:7f9f629f-c02c-4416-ae21-2b49cea903b5 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:17.831789602 +0000 UTC m=+1402.772944316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts") pod "root-account-create-update-2qt5f" (UID: "7f9f629f-c02c-4416-ae21-2b49cea903b5") : configmap "openstack-scripts" not found
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.832116 4917 scope.go:117] "RemoveContainer" containerID="c67ceb625cdd5baa04a13e13da848059276036abb1326784ef355b6956226e2f"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.852787 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vf8d"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.868675 4917 scope.go:117] "RemoveContainer" containerID="1592e72e88530b3687f95124325d50bfc66a649981b642fed4a4d25898d5c3d7"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.870612 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.877894 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"]
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.879638 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h"
Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.907466 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567 is running failed: container process not found" containerID="3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.907851 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567 is running failed: container process not found" containerID="3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.908145 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567 is running failed: container process not found" containerID="3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 07:10:15 crc kubenswrapper[4917]: E0318 07:10:15.908170 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a62df143-348a-4ec7-b331-04db3857e847" containerName="nova-scheduler-scheduler"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.915628 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.931427 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dj29\" (UniqueName: \"kubernetes.io/projected/156b1187-23ee-4a81-8d1d-ad91c2468b7d-kube-api-access-8dj29\") pod \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") "
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.931487 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data\") pod \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") "
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.931512 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data-custom\") pod \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") "
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.931536 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-combined-ca-bundle\") pod \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") "
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.931576 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156b1187-23ee-4a81-8d1d-ad91c2468b7d-logs\") pod \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\" (UID: \"156b1187-23ee-4a81-8d1d-ad91c2468b7d\") "
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.932727 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/156b1187-23ee-4a81-8d1d-ad91c2468b7d-logs" (OuterVolumeSpecName: "logs") pod "156b1187-23ee-4a81-8d1d-ad91c2468b7d" (UID: "156b1187-23ee-4a81-8d1d-ad91c2468b7d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.935490 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.941197 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156b1187-23ee-4a81-8d1d-ad91c2468b7d-kube-api-access-8dj29" (OuterVolumeSpecName: "kube-api-access-8dj29") pod "156b1187-23ee-4a81-8d1d-ad91c2468b7d" (UID: "156b1187-23ee-4a81-8d1d-ad91c2468b7d"). InnerVolumeSpecName "kube-api-access-8dj29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.944168 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "156b1187-23ee-4a81-8d1d-ad91c2468b7d" (UID: "156b1187-23ee-4a81-8d1d-ad91c2468b7d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.944295 4917 scope.go:117] "RemoveContainer" containerID="c1a1e3e9c9d8361d0c4155702ce697636d2fe57a0d4920a2496cd0fbb0e9a366"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.947065 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"]
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.956479 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5948845567-w7h4j"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.956617 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.960837 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.965198 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.965354 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.985438 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data" (OuterVolumeSpecName: "config-data") pod "156b1187-23ee-4a81-8d1d-ad91c2468b7d" (UID: "156b1187-23ee-4a81-8d1d-ad91c2468b7d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:15 crc kubenswrapper[4917]: I0318 07:10:15.985715 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:15.995888 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "156b1187-23ee-4a81-8d1d-ad91c2468b7d" (UID: "156b1187-23ee-4a81-8d1d-ad91c2468b7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.032809 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-config-data\") pod \"71bf8cc3-5674-418a-a126-f43e3d2f092d\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.032855 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"6a55197a-92c3-451c-9d5d-d3a6426c995b\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.032881 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-logs\") pod \"71bf8cc3-5674-418a-a126-f43e3d2f092d\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.032899 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-logs\") pod \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.032921 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-config-data\") pod \"6a55197a-92c3-451c-9d5d-d3a6426c995b\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.032944 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-config-data\") pod \"06578cec-8f13-4daf-966e-89f743a134fe\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.032962 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-httpd-run\") pod \"71bf8cc3-5674-418a-a126-f43e3d2f092d\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.032993 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6hwl\" (UniqueName: \"kubernetes.io/projected/6a55197a-92c3-451c-9d5d-d3a6426c995b-kube-api-access-s6hwl\") pod \"6a55197a-92c3-451c-9d5d-d3a6426c995b\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033018 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06578cec-8f13-4daf-966e-89f743a134fe-logs\") pod \"06578cec-8f13-4daf-966e-89f743a134fe\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033035 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr5mq\" (UniqueName: \"kubernetes.io/projected/a62df143-348a-4ec7-b331-04db3857e847-kube-api-access-xr5mq\") pod \"a62df143-348a-4ec7-b331-04db3857e847\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033053 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-scripts\") pod \"71bf8cc3-5674-418a-a126-f43e3d2f092d\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033071 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data\") pod \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033091 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-httpd-run\") pod \"6a55197a-92c3-451c-9d5d-d3a6426c995b\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033165 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29ln5\" (UniqueName: \"kubernetes.io/projected/71bf8cc3-5674-418a-a126-f43e3d2f092d-kube-api-access-29ln5\") pod \"71bf8cc3-5674-418a-a126-f43e3d2f092d\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033209 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2hfq\" (UniqueName: \"kubernetes.io/projected/06578cec-8f13-4daf-966e-89f743a134fe-kube-api-access-r2hfq\") pod \"06578cec-8f13-4daf-966e-89f743a134fe\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033254 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-scripts\") pod \"6a55197a-92c3-451c-9d5d-d3a6426c995b\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033273 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-combined-ca-bundle\") pod \"a62df143-348a-4ec7-b331-04db3857e847\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033295 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-nova-metadata-tls-certs\") pod \"06578cec-8f13-4daf-966e-89f743a134fe\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033318 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-logs\") pod \"6a55197a-92c3-451c-9d5d-d3a6426c995b\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033334 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-internal-tls-certs\") pod \"71bf8cc3-5674-418a-a126-f43e3d2f092d\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033365 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-combined-ca-bundle\") pod \"71bf8cc3-5674-418a-a126-f43e3d2f092d\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033390 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-config-data\") pod \"a62df143-348a-4ec7-b331-04db3857e847\" (UID: \"a62df143-348a-4ec7-b331-04db3857e847\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033411 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2zbs\" (UniqueName: \"kubernetes.io/projected/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-kube-api-access-z2zbs\") pod \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033428 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-public-tls-certs\") pod \"6a55197a-92c3-451c-9d5d-d3a6426c995b\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033448 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-combined-ca-bundle\") pod \"6a55197a-92c3-451c-9d5d-d3a6426c995b\" (UID: \"6a55197a-92c3-451c-9d5d-d3a6426c995b\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033472 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-combined-ca-bundle\") pod \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") "
Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033489 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName:
\"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data-custom\") pod \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\" (UID: \"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033509 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-combined-ca-bundle\") pod \"06578cec-8f13-4daf-966e-89f743a134fe\" (UID: \"06578cec-8f13-4daf-966e-89f743a134fe\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033529 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"71bf8cc3-5674-418a-a126-f43e3d2f092d\" (UID: \"71bf8cc3-5674-418a-a126-f43e3d2f092d\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033803 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzngt\" (UniqueName: \"kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt\") pod \"keystone-168c-account-create-update-kppnw\" (UID: \"55bbcadf-3a8c-4ed7-a650-073e43362ac7\") " pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.033902 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts\") pod \"keystone-168c-account-create-update-kppnw\" (UID: \"55bbcadf-3a8c-4ed7-a650-073e43362ac7\") " pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.034028 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc 
kubenswrapper[4917]: I0318 07:10:16.034043 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/156b1187-23ee-4a81-8d1d-ad91c2468b7d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.034056 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dj29\" (UniqueName: \"kubernetes.io/projected/156b1187-23ee-4a81-8d1d-ad91c2468b7d-kube-api-access-8dj29\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.034066 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.034076 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156b1187-23ee-4a81-8d1d-ad91c2468b7d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: E0318 07:10:16.034131 4917 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 18 07:10:16 crc kubenswrapper[4917]: E0318 07:10:16.034181 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts podName:55bbcadf-3a8c-4ed7-a650-073e43362ac7 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:18.034166468 +0000 UTC m=+1402.975321182 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts") pod "keystone-168c-account-create-update-kppnw" (UID: "55bbcadf-3a8c-4ed7-a650-073e43362ac7") : configmap "openstack-scripts" not found Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.041150 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-logs" (OuterVolumeSpecName: "logs") pod "71bf8cc3-5674-418a-a126-f43e3d2f092d" (UID: "71bf8cc3-5674-418a-a126-f43e3d2f092d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.041519 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-logs" (OuterVolumeSpecName: "logs") pod "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" (UID: "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.041536 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06578cec-8f13-4daf-966e-89f743a134fe-logs" (OuterVolumeSpecName: "logs") pod "06578cec-8f13-4daf-966e-89f743a134fe" (UID: "06578cec-8f13-4daf-966e-89f743a134fe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.042084 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a55197a-92c3-451c-9d5d-d3a6426c995b" (UID: "6a55197a-92c3-451c-9d5d-d3a6426c995b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.042295 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.045309 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "71bf8cc3-5674-418a-a126-f43e3d2f092d" (UID: "71bf8cc3-5674-418a-a126-f43e3d2f092d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.045676 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-scripts" (OuterVolumeSpecName: "scripts") pod "6a55197a-92c3-451c-9d5d-d3a6426c995b" (UID: "6a55197a-92c3-451c-9d5d-d3a6426c995b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.051816 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bf8cc3-5674-418a-a126-f43e3d2f092d-kube-api-access-29ln5" (OuterVolumeSpecName: "kube-api-access-29ln5") pod "71bf8cc3-5674-418a-a126-f43e3d2f092d" (UID: "71bf8cc3-5674-418a-a126-f43e3d2f092d"). InnerVolumeSpecName "kube-api-access-29ln5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.051934 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06578cec-8f13-4daf-966e-89f743a134fe-kube-api-access-r2hfq" (OuterVolumeSpecName: "kube-api-access-r2hfq") pod "06578cec-8f13-4daf-966e-89f743a134fe" (UID: "06578cec-8f13-4daf-966e-89f743a134fe"). InnerVolumeSpecName "kube-api-access-r2hfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.052253 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a55197a-92c3-451c-9d5d-d3a6426c995b-kube-api-access-s6hwl" (OuterVolumeSpecName: "kube-api-access-s6hwl") pod "6a55197a-92c3-451c-9d5d-d3a6426c995b" (UID: "6a55197a-92c3-451c-9d5d-d3a6426c995b"). InnerVolumeSpecName "kube-api-access-s6hwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: E0318 07:10:16.052393 4917 projected.go:194] Error preparing data for projected volume kube-api-access-rzngt for pod openstack/keystone-168c-account-create-update-kppnw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 07:10:16 crc kubenswrapper[4917]: E0318 07:10:16.052444 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt podName:55bbcadf-3a8c-4ed7-a650-073e43362ac7 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:18.052420274 +0000 UTC m=+1402.993574988 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzngt" (UniqueName: "kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt") pod "keystone-168c-account-create-update-kppnw" (UID: "55bbcadf-3a8c-4ed7-a650-073e43362ac7") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.053145 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-logs" (OuterVolumeSpecName: "logs") pod "6a55197a-92c3-451c-9d5d-d3a6426c995b" (UID: "6a55197a-92c3-451c-9d5d-d3a6426c995b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.055715 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-scripts" (OuterVolumeSpecName: "scripts") pod "71bf8cc3-5674-418a-a126-f43e3d2f092d" (UID: "71bf8cc3-5674-418a-a126-f43e3d2f092d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.056271 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "71bf8cc3-5674-418a-a126-f43e3d2f092d" (UID: "71bf8cc3-5674-418a-a126-f43e3d2f092d"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.072873 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "6a55197a-92c3-451c-9d5d-d3a6426c995b" (UID: "6a55197a-92c3-451c-9d5d-d3a6426c995b"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.084356 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-kube-api-access-z2zbs" (OuterVolumeSpecName: "kube-api-access-z2zbs") pod "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" (UID: "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6"). InnerVolumeSpecName "kube-api-access-z2zbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.084485 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" (UID: "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.085308 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.090372 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62df143-348a-4ec7-b331-04db3857e847-kube-api-access-xr5mq" (OuterVolumeSpecName: "kube-api-access-xr5mq") pod "a62df143-348a-4ec7-b331-04db3857e847" (UID: "a62df143-348a-4ec7-b331-04db3857e847"). InnerVolumeSpecName "kube-api-access-xr5mq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.094240 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-config-data" (OuterVolumeSpecName: "config-data") pod "06578cec-8f13-4daf-966e-89f743a134fe" (UID: "06578cec-8f13-4daf-966e-89f743a134fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.094250 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.101828 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" (UID: "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.114554 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-config-data" (OuterVolumeSpecName: "config-data") pod "a62df143-348a-4ec7-b331-04db3857e847" (UID: "a62df143-348a-4ec7-b331-04db3857e847"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.124157 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.130671 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06578cec-8f13-4daf-966e-89f743a134fe" (UID: "06578cec-8f13-4daf-966e-89f743a134fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135480 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47f75\" (UniqueName: \"kubernetes.io/projected/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-kube-api-access-47f75\") pod \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135515 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-logs\") pod \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135565 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-ceilometer-tls-certs\") pod \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135596 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-public-tls-certs\") pod \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135614 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-run-httpd\") pod \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135632 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-sg-core-conf-yaml\") pod \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135646 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-log-httpd\") pod \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135679 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-combined-ca-bundle\") pod \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135711 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-scripts\") pod \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135736 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcf5m\" (UniqueName: \"kubernetes.io/projected/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-kube-api-access-rcf5m\") pod \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135762 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-config-data\") pod \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135788 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-internal-tls-certs\") pod \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135849 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-combined-ca-bundle\") pod \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\" (UID: \"6a4114f4-5182-4b59-b9be-72a6f4ed11fb\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135902 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data\") pod \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.135932 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data-custom\") pod \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\" (UID: \"0f3ff50e-a301-4abe-bbaf-2b0075b80b47\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136301 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136315 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136323 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136332 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136340 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/71bf8cc3-5674-418a-a126-f43e3d2f092d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136350 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6hwl\" (UniqueName: \"kubernetes.io/projected/6a55197a-92c3-451c-9d5d-d3a6426c995b-kube-api-access-s6hwl\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136359 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06578cec-8f13-4daf-966e-89f743a134fe-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136368 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr5mq\" (UniqueName: \"kubernetes.io/projected/a62df143-348a-4ec7-b331-04db3857e847-kube-api-access-xr5mq\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136375 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136383 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 
07:10:16.136390 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29ln5\" (UniqueName: \"kubernetes.io/projected/71bf8cc3-5674-418a-a126-f43e3d2f092d-kube-api-access-29ln5\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136398 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2hfq\" (UniqueName: \"kubernetes.io/projected/06578cec-8f13-4daf-966e-89f743a134fe-kube-api-access-r2hfq\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136406 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136413 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a55197a-92c3-451c-9d5d-d3a6426c995b-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136421 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136429 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2zbs\" (UniqueName: \"kubernetes.io/projected/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-kube-api-access-z2zbs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136437 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136447 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136456 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.136469 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.143497 4917 scope.go:117] "RemoveContainer" containerID="bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.144245 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a4114f4-5182-4b59-b9be-72a6f4ed11fb" (UID: "6a4114f4-5182-4b59-b9be-72a6f4ed11fb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.147831 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-logs" (OuterVolumeSpecName: "logs") pod "0f3ff50e-a301-4abe-bbaf-2b0075b80b47" (UID: "0f3ff50e-a301-4abe-bbaf-2b0075b80b47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.147989 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a4114f4-5182-4b59-b9be-72a6f4ed11fb" (UID: "6a4114f4-5182-4b59-b9be-72a6f4ed11fb"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.162148 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6f74bf5646-8prg6"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.174726 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-kube-api-access-rcf5m" (OuterVolumeSpecName: "kube-api-access-rcf5m") pod "6a4114f4-5182-4b59-b9be-72a6f4ed11fb" (UID: "6a4114f4-5182-4b59-b9be-72a6f4ed11fb"). InnerVolumeSpecName "kube-api-access-rcf5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.176769 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-scripts" (OuterVolumeSpecName: "scripts") pod "6a4114f4-5182-4b59-b9be-72a6f4ed11fb" (UID: "6a4114f4-5182-4b59-b9be-72a6f4ed11fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.177136 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-kube-api-access-47f75" (OuterVolumeSpecName: "kube-api-access-47f75") pod "0f3ff50e-a301-4abe-bbaf-2b0075b80b47" (UID: "0f3ff50e-a301-4abe-bbaf-2b0075b80b47"). InnerVolumeSpecName "kube-api-access-47f75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.180538 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0f3ff50e-a301-4abe-bbaf-2b0075b80b47" (UID: "0f3ff50e-a301-4abe-bbaf-2b0075b80b47"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.194975 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6f74bf5646-8prg6"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.198379 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "06578cec-8f13-4daf-966e-89f743a134fe" (UID: "06578cec-8f13-4daf-966e-89f743a134fe"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.205662 4917 scope.go:117] "RemoveContainer" containerID="bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789" Mar 18 07:10:16 crc kubenswrapper[4917]: E0318 07:10:16.209717 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789\": container with ID starting with bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789 not found: ID does not exist" containerID="bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.209762 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789"} err="failed to get container status \"bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789\": rpc error: code = NotFound desc = could not find container \"bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789\": container with ID starting with bef23bde6ae6f7e2e9363213a31d51633bd0eb3c6a30546be8f7bd05a8be4789 not found: ID does not exist" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.209793 4917 scope.go:117] 
"RemoveContainer" containerID="528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.220773 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.221118 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a62df143-348a-4ec7-b331-04db3857e847" (UID: "a62df143-348a-4ec7-b331-04db3857e847"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.228667 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-config-data" (OuterVolumeSpecName: "config-data") pod "6a55197a-92c3-451c-9d5d-d3a6426c995b" (UID: "6a55197a-92c3-451c-9d5d-d3a6426c995b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.236558 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.237190 4917 scope.go:117] "RemoveContainer" containerID="34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.238927 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "71bf8cc3-5674-418a-a126-f43e3d2f092d" (UID: "71bf8cc3-5674-418a-a126-f43e3d2f092d"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239003 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47f75\" (UniqueName: \"kubernetes.io/projected/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-kube-api-access-47f75\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239035 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-logs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239045 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239055 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239064 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239073 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239083 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239091 4917 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-rcf5m\" (UniqueName: \"kubernetes.io/projected/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-kube-api-access-rcf5m\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239100 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239108 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239117 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62df143-348a-4ec7-b331-04db3857e847-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.239125 4917 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/06578cec-8f13-4daf-966e-89f743a134fe-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.269240 4917 scope.go:117] "RemoveContainer" containerID="528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2" Mar 18 07:10:16 crc kubenswrapper[4917]: E0318 07:10:16.269822 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2\": container with ID starting with 528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2 not found: ID does not exist" containerID="528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.269887 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2"} err="failed to get container status \"528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2\": rpc error: code = NotFound desc = could not find container \"528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2\": container with ID starting with 528310fefcfce2371557f269f4ffd4257d03c7395a163581ca8cc7f5ecce20c2 not found: ID does not exist" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.269915 4917 scope.go:117] "RemoveContainer" containerID="34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.270458 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6a55197a-92c3-451c-9d5d-d3a6426c995b" (UID: "6a55197a-92c3-451c-9d5d-d3a6426c995b"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: E0318 07:10:16.270657 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904\": container with ID starting with 34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904 not found: ID does not exist" containerID="34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.270687 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904"} err="failed to get container status \"34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904\": rpc error: code = NotFound desc = could not find container \"34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904\": container with ID starting with 34887ba40d3b666b21e8780ee6c5e6360240223cff27c82eac53662b9fae0904 not found: ID does not exist" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.275520 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71bf8cc3-5674-418a-a126-f43e3d2f092d" (UID: "71bf8cc3-5674-418a-a126-f43e3d2f092d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.275950 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a55197a-92c3-451c-9d5d-d3a6426c995b" (UID: "6a55197a-92c3-451c-9d5d-d3a6426c995b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.299764 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f3ff50e-a301-4abe-bbaf-2b0075b80b47" (UID: "0f3ff50e-a301-4abe-bbaf-2b0075b80b47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.299767 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a4114f4-5182-4b59-b9be-72a6f4ed11fb" (UID: "6a4114f4-5182-4b59-b9be-72a6f4ed11fb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.308092 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0f3ff50e-a301-4abe-bbaf-2b0075b80b47" (UID: "0f3ff50e-a301-4abe-bbaf-2b0075b80b47"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.312095 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6a4114f4-5182-4b59-b9be-72a6f4ed11fb" (UID: "6a4114f4-5182-4b59-b9be-72a6f4ed11fb"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.314135 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data" (OuterVolumeSpecName: "config-data") pod "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" (UID: "3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.316717 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-config-data" (OuterVolumeSpecName: "config-data") pod "71bf8cc3-5674-418a-a126-f43e3d2f092d" (UID: "71bf8cc3-5674-418a-a126-f43e3d2f092d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.320748 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0f3ff50e-a301-4abe-bbaf-2b0075b80b47" (UID: "0f3ff50e-a301-4abe-bbaf-2b0075b80b47"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.330428 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data" (OuterVolumeSpecName: "config-data") pod "0f3ff50e-a301-4abe-bbaf-2b0075b80b47" (UID: "0f3ff50e-a301-4abe-bbaf-2b0075b80b47"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341720 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341742 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341753 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341762 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341781 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a55197a-92c3-451c-9d5d-d3a6426c995b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341789 4917 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341796 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 
07:10:16.341804 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341812 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71bf8cc3-5674-418a-a126-f43e3d2f092d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341821 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341831 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f3ff50e-a301-4abe-bbaf-2b0075b80b47-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.341838 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.368371 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a4114f4-5182-4b59-b9be-72a6f4ed11fb" (UID: "6a4114f4-5182-4b59-b9be-72a6f4ed11fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.375814 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-config-data" (OuterVolumeSpecName: "config-data") pod "6a4114f4-5182-4b59-b9be-72a6f4ed11fb" (UID: "6a4114f4-5182-4b59-b9be-72a6f4ed11fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.443995 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.444030 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a4114f4-5182-4b59-b9be-72a6f4ed11fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.480267 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.480302 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a62df143-348a-4ec7-b331-04db3857e847","Type":"ContainerDied","Data":"7371c9479f2ed60de873f794bb37ce7e955d10f29c11a42f847ae5ea1c5ea9c5"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.480357 4917 scope.go:117] "RemoveContainer" containerID="3bcde3f4ebf6decebce213cafb0bf69b82771bf8d9a7a1d3eb510d3edf121567" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.492357 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" event={"ID":"156b1187-23ee-4a81-8d1d-ad91c2468b7d","Type":"ContainerDied","Data":"9ec7b84ab64b0d0c583b8018f43d660e75ae61c45f617fb8f7929be5ae5f08c7"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.492376 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-b749f8cd6-w2w6h" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.500395 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06578cec-8f13-4daf-966e-89f743a134fe","Type":"ContainerDied","Data":"f1772958fffce7a652deb17c726739295a79c9470b96d3caddcdfc923ec8b81e"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.500487 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.515030 4917 generic.go:334] "Generic (PLEG): container finished" podID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerID="eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45" exitCode=0 Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.515147 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerDied","Data":"eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.515183 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a4114f4-5182-4b59-b9be-72a6f4ed11fb","Type":"ContainerDied","Data":"8a099c58c4530a1fbe2c5919bfbe64dfd649755c41781983253ff0ccfab89991"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.515325 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.519859 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"71bf8cc3-5674-418a-a126-f43e3d2f092d","Type":"ContainerDied","Data":"991692db54759527ea3b256f066d63b3660277c203f3b5066206bdec96242ce9"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.519977 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.530650 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6a55197a-92c3-451c-9d5d-d3a6426c995b","Type":"ContainerDied","Data":"87d09ffc94d20217be7a87bd5a256bac55d1b3e9873f7cfcbe7bfad674cfe67f"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.530760 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.536075 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5948845567-w7h4j" event={"ID":"3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6","Type":"ContainerDied","Data":"1a93e29be7f00a03a0efedd2eb748f7b33a31515679236629a47bee3e7af1df1"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.540242 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5948845567-w7h4j" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.550033 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b83ec86-5c66-4dc6-9236-a437f37611a9/ovn-northd/0.log" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.550065 4917 generic.go:334] "Generic (PLEG): container finished" podID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerID="ef4b723c7e5e12825354609c888ba3e920eaa69afb8a6ad4df7b13d783acb2c3" exitCode=139 Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.550109 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b83ec86-5c66-4dc6-9236-a437f37611a9","Type":"ContainerDied","Data":"ef4b723c7e5e12825354609c888ba3e920eaa69afb8a6ad4df7b13d783acb2c3"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.551899 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-168c-account-create-update-kppnw" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.552066 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.554188 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6b9bcbc9d4-vddlw" event={"ID":"0f3ff50e-a301-4abe-bbaf-2b0075b80b47","Type":"ContainerDied","Data":"ccf1f02a88ccfdc525acd4b3b33ed6f7ffb6a36da39a8304b607e281ce86c407"} Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.651038 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.656905 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.671808 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.682129 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.687071 4917 scope.go:117] "RemoveContainer" containerID="1760f96d4b10c19a42b16a801413b85944dced5f3964dea049426aa637ed35f1" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.701082 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.726832 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.731931 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6b9bcbc9d4-vddlw"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.747723 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-api-6b9bcbc9d4-vddlw"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.752653 4917 scope.go:117] "RemoveContainer" containerID="a36e72dd6a4a408d1a8611cc5d11c6ae06e79402f38ae0b454c8d8abb3d8a397" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.756790 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5948845567-w7h4j"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.767160 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5948845567-w7h4j"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.805817 4917 scope.go:117] "RemoveContainer" containerID="a8adb26cc0c60d07e23f6ba71eefaac96183dc1c6cfa0982a9403686a18c8986" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.848010 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-168c-account-create-update-kppnw"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.864241 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-168c-account-create-update-kppnw"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.872335 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.877683 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.897115 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.920347 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.922850 4917 scope.go:117] "RemoveContainer" containerID="0c9345749605256155e36788fa5aacbab82152b73c33088f71c542c3f012582d" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.942646 4917 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovn-northd-0_2b83ec86-5c66-4dc6-9236-a437f37611a9/ovn-northd/0.log" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.942843 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.951475 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-b749f8cd6-w2w6h"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.965306 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-scripts\") pod \"2b83ec86-5c66-4dc6-9236-a437f37611a9\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.965380 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-rundir\") pod \"2b83ec86-5c66-4dc6-9236-a437f37611a9\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.965440 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-combined-ca-bundle\") pod \"2b83ec86-5c66-4dc6-9236-a437f37611a9\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.965536 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nztcl\" (UniqueName: \"kubernetes.io/projected/2b83ec86-5c66-4dc6-9236-a437f37611a9-kube-api-access-nztcl\") pod \"2b83ec86-5c66-4dc6-9236-a437f37611a9\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.965567 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-northd-tls-certs\") pod \"2b83ec86-5c66-4dc6-9236-a437f37611a9\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.965615 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-metrics-certs-tls-certs\") pod \"2b83ec86-5c66-4dc6-9236-a437f37611a9\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.965675 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-config\") pod \"2b83ec86-5c66-4dc6-9236-a437f37611a9\" (UID: \"2b83ec86-5c66-4dc6-9236-a437f37611a9\") " Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.973876 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-config" (OuterVolumeSpecName: "config") pod "2b83ec86-5c66-4dc6-9236-a437f37611a9" (UID: "2b83ec86-5c66-4dc6-9236-a437f37611a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.979635 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-scripts" (OuterVolumeSpecName: "scripts") pod "2b83ec86-5c66-4dc6-9236-a437f37611a9" (UID: "2b83ec86-5c66-4dc6-9236-a437f37611a9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.980061 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "2b83ec86-5c66-4dc6-9236-a437f37611a9" (UID: "2b83ec86-5c66-4dc6-9236-a437f37611a9"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.980803 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.980827 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b83ec86-5c66-4dc6-9236-a437f37611a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.980840 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55bbcadf-3a8c-4ed7-a650-073e43362ac7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.980853 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.980865 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzngt\" (UniqueName: \"kubernetes.io/projected/55bbcadf-3a8c-4ed7-a650-073e43362ac7-kube-api-access-rzngt\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.982364 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2b83ec86-5c66-4dc6-9236-a437f37611a9-kube-api-access-nztcl" (OuterVolumeSpecName: "kube-api-access-nztcl") pod "2b83ec86-5c66-4dc6-9236-a437f37611a9" (UID: "2b83ec86-5c66-4dc6-9236-a437f37611a9"). InnerVolumeSpecName "kube-api-access-nztcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.991080 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-b749f8cd6-w2w6h"] Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.995647 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b83ec86-5c66-4dc6-9236-a437f37611a9" (UID: "2b83ec86-5c66-4dc6-9236-a437f37611a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:16 crc kubenswrapper[4917]: I0318 07:10:16.998906 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9vf8d"] Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.010916 4917 scope.go:117] "RemoveContainer" containerID="4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.045313 4917 scope.go:117] "RemoveContainer" containerID="7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.082804 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.082829 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nztcl\" (UniqueName: \"kubernetes.io/projected/2b83ec86-5c66-4dc6-9236-a437f37611a9-kube-api-access-nztcl\") 
on node \"crc\" DevicePath \"\"" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.083396 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.087878 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "2b83ec86-5c66-4dc6-9236-a437f37611a9" (UID: "2b83ec86-5c66-4dc6-9236-a437f37611a9"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.105858 4917 scope.go:117] "RemoveContainer" containerID="eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.111149 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "2b83ec86-5c66-4dc6-9236-a437f37611a9" (UID: "2b83ec86-5c66-4dc6-9236-a437f37611a9"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.140103 4917 scope.go:117] "RemoveContainer" containerID="4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.185108 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts\") pod \"7f9f629f-c02c-4416-ae21-2b49cea903b5\" (UID: \"7f9f629f-c02c-4416-ae21-2b49cea903b5\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.185177 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56cwz\" (UniqueName: \"kubernetes.io/projected/7f9f629f-c02c-4416-ae21-2b49cea903b5-kube-api-access-56cwz\") pod \"7f9f629f-c02c-4416-ae21-2b49cea903b5\" (UID: \"7f9f629f-c02c-4416-ae21-2b49cea903b5\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.185496 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.185512 4917 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b83ec86-5c66-4dc6-9236-a437f37611a9-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.187011 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f9f629f-c02c-4416-ae21-2b49cea903b5" (UID: "7f9f629f-c02c-4416-ae21-2b49cea903b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.191198 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f9f629f-c02c-4416-ae21-2b49cea903b5-kube-api-access-56cwz" (OuterVolumeSpecName: "kube-api-access-56cwz") pod "7f9f629f-c02c-4416-ae21-2b49cea903b5" (UID: "7f9f629f-c02c-4416-ae21-2b49cea903b5"). InnerVolumeSpecName "kube-api-access-56cwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.208966 4917 scope.go:117] "RemoveContainer" containerID="4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10" Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.213392 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10\": container with ID starting with 4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10 not found: ID does not exist" containerID="4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.213445 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10"} err="failed to get container status \"4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10\": rpc error: code = NotFound desc = could not find container \"4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10\": container with ID starting with 4dc2e9ddee0c23b5a9076e542d906e8b879b9866903238583a9ae1d20105ba10 not found: ID does not exist" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.213482 4917 scope.go:117] "RemoveContainer" containerID="7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc" Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.218768 
4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc\": container with ID starting with 7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc not found: ID does not exist" containerID="7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.218829 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc"} err="failed to get container status \"7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc\": rpc error: code = NotFound desc = could not find container \"7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc\": container with ID starting with 7f72ec23fa1ea46d6379f12f0650e87798b2d5569f19632165957948501e83fc not found: ID does not exist" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.218864 4917 scope.go:117] "RemoveContainer" containerID="eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45" Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.221516 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45\": container with ID starting with eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45 not found: ID does not exist" containerID="eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.221550 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45"} err="failed to get container status \"eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45\": rpc error: code = 
NotFound desc = could not find container \"eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45\": container with ID starting with eaaed70e268860375b72f2bb3a81e6acd69891080e10bf0015704ea67a30fc45 not found: ID does not exist" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.221563 4917 scope.go:117] "RemoveContainer" containerID="4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c" Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.225661 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c\": container with ID starting with 4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c not found: ID does not exist" containerID="4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.225689 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c"} err="failed to get container status \"4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c\": rpc error: code = NotFound desc = could not find container \"4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c\": container with ID starting with 4210305ae9a4f71191ab4f9d789b1a5f1eba3fa546ede9d88b5f2c70bbe10c8c not found: ID does not exist" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.225704 4917 scope.go:117] "RemoveContainer" containerID="5cf727c6f8e456b706e1dae3227882a5fd58a35e0cebf945f9304e0df9d9673a" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.253727 4917 scope.go:117] "RemoveContainer" containerID="a57908c5b95dcdf307be2044978ccc9678a0b22f5e60956f8fbd25abaea6d4fe" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.270678 4917 scope.go:117] "RemoveContainer" 
containerID="ea57b6c51e2a6d56fc3ad907733130bd48fe5600650c29357de522f2e527c1cf" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.287498 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f9f629f-c02c-4416-ae21-2b49cea903b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.287526 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56cwz\" (UniqueName: \"kubernetes.io/projected/7f9f629f-c02c-4416-ae21-2b49cea903b5-kube-api-access-56cwz\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.298038 4917 scope.go:117] "RemoveContainer" containerID="8eda75a9630a00d3bc84d4255c1197ed99b2146ee96f3b0c2b6c80a91b3dc188" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.322523 4917 scope.go:117] "RemoveContainer" containerID="1e96cd0c2b6ad866d42096dc84192a730d0d27e13ac4658e5fa9dd813dbe217c" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.347816 4917 scope.go:117] "RemoveContainer" containerID="f5c44c7ab9e02426289634fa91304fcc0c9c17e0ffb94d8253cd509cae2718d1" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.420161 4917 scope.go:117] "RemoveContainer" containerID="09de8cec2a2b8b86818b49eb10353534cc95b1159f2eddd3b1fc8ba166f591d6" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.455266 4917 scope.go:117] "RemoveContainer" containerID="e6a5a8a7029188f639c27fa6e3638c25cc766067465396ac13f1f7a3e5c9c941" Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.530808 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.548541 4917 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.550010 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.550073 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="42cfbb53-8521-4c39-98ee-2d666b5682d3" containerName="nova-cell0-conductor-conductor" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.587546 4917 generic.go:334] "Generic (PLEG): container finished" podID="84500248-15f0-4049-8423-43502d6587cf" containerID="2e38a061b090ac1a57bf820e3f8a49ce670631af9f0ea08d5ab50da29f72d9c6" exitCode=0 Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.587720 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84500248-15f0-4049-8423-43502d6587cf","Type":"ContainerDied","Data":"2e38a061b090ac1a57bf820e3f8a49ce670631af9f0ea08d5ab50da29f72d9c6"} Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.594421 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2qt5f" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.594494 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2qt5f" event={"ID":"7f9f629f-c02c-4416-ae21-2b49cea903b5","Type":"ContainerDied","Data":"a810fef54ee8210e0fbe3bd0778873475b840742cb4ca777849bbc2ae07934a1"} Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.594958 4917 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:17 crc kubenswrapper[4917]: E0318 07:10:17.595027 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data podName:4a2ed8f1-269d-45fb-a766-46c867bd0a91 nodeName:}" failed. No retries permitted until 2026-03-18 07:10:25.595008435 +0000 UTC m=+1410.536163149 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data") pod "rabbitmq-cell1-server-0" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91") : configmap "rabbitmq-cell1-config-data" not found Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.600087 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2b83ec86-5c66-4dc6-9236-a437f37611a9/ovn-northd/0.log" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.600148 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2b83ec86-5c66-4dc6-9236-a437f37611a9","Type":"ContainerDied","Data":"8c7c705eacdffc731a020742f6f8804ced9faa1236fe70754fd0f7c9b8c9d714"} Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.600227 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.613753 4917 generic.go:334] "Generic (PLEG): container finished" podID="11fb09df-78b6-44c6-a78f-2b720a98cfad" containerID="256f8ce82bcbadac34767fc05b95a7249fd4500c71a2a020dd30ba54395951e5" exitCode=0 Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.613811 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11fb09df-78b6-44c6-a78f-2b720a98cfad","Type":"ContainerDied","Data":"256f8ce82bcbadac34767fc05b95a7249fd4500c71a2a020dd30ba54395951e5"} Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.622769 4917 generic.go:334] "Generic (PLEG): container finished" podID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerID="1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da" exitCode=0 Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.622843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vf8d" event={"ID":"f8ecba0e-a87f-4b57-9f3d-ac772febac6c","Type":"ContainerDied","Data":"1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da"} Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.622869 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vf8d" event={"ID":"f8ecba0e-a87f-4b57-9f3d-ac772febac6c","Type":"ContainerStarted","Data":"c0474d9ccdb51a9fead20a6271c62220afdb96a369b53d576c17cf1111c10417"} Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.710051 4917 scope.go:117] "RemoveContainer" containerID="e8755e7fa4ce78be1417bae895c80614c14702cc700f0225c2621fb63c8edb09" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.740743 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2qt5f"] Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.749701 4917 scope.go:117] "RemoveContainer" 
containerID="8b2c13a58a1120455d3b9c627b8cf69e354661c43f72f0bcc57f23e360c0f81f" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.750037 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2qt5f"] Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.756796 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.761003 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.781238 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="033e4d58-03e5-49fa-ad5b-169464bf7ba9" path="/var/lib/kubelet/pods/033e4d58-03e5-49fa-ad5b-169464bf7ba9/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.781776 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06578cec-8f13-4daf-966e-89f743a134fe" path="/var/lib/kubelet/pods/06578cec-8f13-4daf-966e-89f743a134fe/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.782317 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" path="/var/lib/kubelet/pods/0f3ff50e-a301-4abe-bbaf-2b0075b80b47/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.783418 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" path="/var/lib/kubelet/pods/156b1187-23ee-4a81-8d1d-ad91c2468b7d/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.784027 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2140bc3b-8c96-4226-b7c4-811b0724682d" path="/var/lib/kubelet/pods/2140bc3b-8c96-4226-b7c4-811b0724682d/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.785275 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" 
path="/var/lib/kubelet/pods/2b83ec86-5c66-4dc6-9236-a437f37611a9/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.785969 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" path="/var/lib/kubelet/pods/3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.786042 4917 scope.go:117] "RemoveContainer" containerID="ef4b723c7e5e12825354609c888ba3e920eaa69afb8a6ad4df7b13d783acb2c3" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.786513 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52bd1e19-7e65-4ea9-94bd-ad7edee4be11" path="/var/lib/kubelet/pods/52bd1e19-7e65-4ea9-94bd-ad7edee4be11/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.787548 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54ac6ac1-72cb-4383-8206-92169da43249" path="/var/lib/kubelet/pods/54ac6ac1-72cb-4383-8206-92169da43249/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.788111 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55bbcadf-3a8c-4ed7-a650-073e43362ac7" path="/var/lib/kubelet/pods/55bbcadf-3a8c-4ed7-a650-073e43362ac7/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.788866 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" path="/var/lib/kubelet/pods/6a4114f4-5182-4b59-b9be-72a6f4ed11fb/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.789943 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" path="/var/lib/kubelet/pods/6a55197a-92c3-451c-9d5d-d3a6426c995b/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.790715 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" 
path="/var/lib/kubelet/pods/71bf8cc3-5674-418a-a126-f43e3d2f092d/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.791625 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" path="/var/lib/kubelet/pods/7f9f629f-c02c-4416-ae21-2b49cea903b5/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.792098 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f997531-b81c-41be-96aa-5f20fe185369" path="/var/lib/kubelet/pods/9f997531-b81c-41be-96aa-5f20fe185369/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.792867 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62df143-348a-4ec7-b331-04db3857e847" path="/var/lib/kubelet/pods/a62df143-348a-4ec7-b331-04db3857e847/volumes" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.867880 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.899558 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"84500248-15f0-4049-8423-43502d6587cf\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.899646 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-galera-tls-certs\") pod \"84500248-15f0-4049-8423-43502d6587cf\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.899674 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-combined-ca-bundle\") pod \"84500248-15f0-4049-8423-43502d6587cf\" (UID: 
\"84500248-15f0-4049-8423-43502d6587cf\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.899710 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-config-data-default\") pod \"84500248-15f0-4049-8423-43502d6587cf\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.899773 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-kolla-config\") pod \"84500248-15f0-4049-8423-43502d6587cf\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.899808 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/84500248-15f0-4049-8423-43502d6587cf-config-data-generated\") pod \"84500248-15f0-4049-8423-43502d6587cf\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.899898 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92fvn\" (UniqueName: \"kubernetes.io/projected/84500248-15f0-4049-8423-43502d6587cf-kube-api-access-92fvn\") pod \"84500248-15f0-4049-8423-43502d6587cf\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.899914 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-operator-scripts\") pod \"84500248-15f0-4049-8423-43502d6587cf\" (UID: \"84500248-15f0-4049-8423-43502d6587cf\") " Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.901388 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84500248-15f0-4049-8423-43502d6587cf" (UID: "84500248-15f0-4049-8423-43502d6587cf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.902401 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "84500248-15f0-4049-8423-43502d6587cf" (UID: "84500248-15f0-4049-8423-43502d6587cf"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.913411 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "84500248-15f0-4049-8423-43502d6587cf" (UID: "84500248-15f0-4049-8423-43502d6587cf"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.913423 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84500248-15f0-4049-8423-43502d6587cf-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "84500248-15f0-4049-8423-43502d6587cf" (UID: "84500248-15f0-4049-8423-43502d6587cf"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.916928 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84500248-15f0-4049-8423-43502d6587cf-kube-api-access-92fvn" (OuterVolumeSpecName: "kube-api-access-92fvn") pod "84500248-15f0-4049-8423-43502d6587cf" (UID: "84500248-15f0-4049-8423-43502d6587cf"). InnerVolumeSpecName "kube-api-access-92fvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.928309 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "84500248-15f0-4049-8423-43502d6587cf" (UID: "84500248-15f0-4049-8423-43502d6587cf"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.930056 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84500248-15f0-4049-8423-43502d6587cf" (UID: "84500248-15f0-4049-8423-43502d6587cf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.947229 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "84500248-15f0-4049-8423-43502d6587cf" (UID: "84500248-15f0-4049-8423-43502d6587cf"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:17 crc kubenswrapper[4917]: I0318 07:10:17.947755 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.003968 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-plugins-conf\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004044 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-tls\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004096 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-plugins\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004122 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004178 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25j97\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-kube-api-access-25j97\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004198 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-server-conf\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004231 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11fb09df-78b6-44c6-a78f-2b720a98cfad-pod-info\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004283 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-erlang-cookie\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004346 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-confd\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004419 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-config-data\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004414 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004449 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11fb09df-78b6-44c6-a78f-2b720a98cfad-erlang-cookie-secret\") pod \"11fb09df-78b6-44c6-a78f-2b720a98cfad\" (UID: \"11fb09df-78b6-44c6-a78f-2b720a98cfad\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004857 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004874 4917 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004884 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84500248-15f0-4049-8423-43502d6587cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004894 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004922 4917 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004931 4917 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-plugins-conf\") on node 
\"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004940 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/84500248-15f0-4049-8423-43502d6587cf-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004948 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92fvn\" (UniqueName: \"kubernetes.io/projected/84500248-15f0-4049-8423-43502d6587cf-kube-api-access-92fvn\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.004958 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84500248-15f0-4049-8423-43502d6587cf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.007051 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.007777 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.010834 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/11fb09df-78b6-44c6-a78f-2b720a98cfad-pod-info" (OuterVolumeSpecName: "pod-info") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.010839 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.011271 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-kube-api-access-25j97" (OuterVolumeSpecName: "kube-api-access-25j97") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "kube-api-access-25j97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.019543 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fb09df-78b6-44c6-a78f-2b720a98cfad-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.019923 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.022164 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.044340 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.044602 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-config-data" (OuterVolumeSpecName: "config-data") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.060133 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-server-conf" (OuterVolumeSpecName: "server-conf") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.101882 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "11fb09df-78b6-44c6-a78f-2b720a98cfad" (UID: "11fb09df-78b6-44c6-a78f-2b720a98cfad"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107221 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47ckz\" (UniqueName: \"kubernetes.io/projected/3fe020eb-0bd4-4efa-9711-3f07ce31907c-kube-api-access-47ckz\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107278 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-fernet-keys\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107336 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-combined-ca-bundle\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107367 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107391 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-internal-tls-certs\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107453 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-credential-keys\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107496 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-config-data\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107512 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-scripts\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107832 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107850 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107871 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107880 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25j97\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-kube-api-access-25j97\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107889 4917 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107897 4917 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/11fb09df-78b6-44c6-a78f-2b720a98cfad-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107905 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107913 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/11fb09df-78b6-44c6-a78f-2b720a98cfad-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107921 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.107930 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/11fb09df-78b6-44c6-a78f-2b720a98cfad-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc 
kubenswrapper[4917]: I0318 07:10:18.107941 4917 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/11fb09df-78b6-44c6-a78f-2b720a98cfad-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.113393 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fe020eb-0bd4-4efa-9711-3f07ce31907c-kube-api-access-47ckz" (OuterVolumeSpecName: "kube-api-access-47ckz") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c"). InnerVolumeSpecName "kube-api-access-47ckz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.115846 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.118787 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.120853 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-scripts" (OuterVolumeSpecName: "scripts") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.127271 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.142716 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.152786 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-config-data" (OuterVolumeSpecName: "config-data") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: E0318 07:10:18.153003 4917 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs podName:3fe020eb-0bd4-4efa-9711-3f07ce31907c nodeName:}" failed. No retries permitted until 2026-03-18 07:10:18.652981392 +0000 UTC m=+1403.594136106 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "public-tls-certs" (UniqueName: "kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c") : error deleting /var/lib/kubelet/pods/3fe020eb-0bd4-4efa-9711-3f07ce31907c/volume-subpaths: remove /var/lib/kubelet/pods/3fe020eb-0bd4-4efa-9711-3f07ce31907c/volume-subpaths: no such file or directory Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.156704 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.215788 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.215817 4917 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.215829 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.215840 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.215851 4917 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-47ckz\" (UniqueName: \"kubernetes.io/projected/3fe020eb-0bd4-4efa-9711-3f07ce31907c-kube-api-access-47ckz\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.215861 4917 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.215871 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.215880 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.639393 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.639401 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"84500248-15f0-4049-8423-43502d6587cf","Type":"ContainerDied","Data":"e7ce4bd2ee2154314474fb5fa29a9d9c5cade19767d99235c00e067dfab7cf6c"} Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.639920 4917 scope.go:117] "RemoveContainer" containerID="2e38a061b090ac1a57bf820e3f8a49ce670631af9f0ea08d5ab50da29f72d9c6" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.650399 4917 generic.go:334] "Generic (PLEG): container finished" podID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" containerID="a60676e9973b2a6031a6ea6dba53de6eade7182dfd3e15a64da35230283bab60" exitCode=0 Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.650477 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a2ed8f1-269d-45fb-a766-46c867bd0a91","Type":"ContainerDied","Data":"a60676e9973b2a6031a6ea6dba53de6eade7182dfd3e15a64da35230283bab60"} Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.653018 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.653010 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"11fb09df-78b6-44c6-a78f-2b720a98cfad","Type":"ContainerDied","Data":"2ff3c9e276be85f73c62af181c5793a51b056b4ea2cb32371faa29a948b35ef0"} Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.659870 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vf8d" event={"ID":"f8ecba0e-a87f-4b57-9f3d-ac772febac6c","Type":"ContainerStarted","Data":"5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a"} Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.670146 4917 generic.go:334] "Generic (PLEG): container finished" podID="3fe020eb-0bd4-4efa-9711-3f07ce31907c" containerID="e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b" exitCode=0 Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.670203 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f48d59955-cprlv" event={"ID":"3fe020eb-0bd4-4efa-9711-3f07ce31907c","Type":"ContainerDied","Data":"e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b"} Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.670224 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-f48d59955-cprlv" event={"ID":"3fe020eb-0bd4-4efa-9711-3f07ce31907c","Type":"ContainerDied","Data":"21b378879cfae082d0af9a227d89c23bed61173340d4db654b8bbf653c807925"} Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.670277 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-f48d59955-cprlv" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.720497 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.722164 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs\") pod \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\" (UID: \"3fe020eb-0bd4-4efa-9711-3f07ce31907c\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.744761 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.751072 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3fe020eb-0bd4-4efa-9711-3f07ce31907c" (UID: "3fe020eb-0bd4-4efa-9711-3f07ce31907c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.757207 4917 scope.go:117] "RemoveContainer" containerID="b2b06add313005d06645803b1e8e5e8da09cb4fad220682987d99604261e8632" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.760224 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.766755 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.772173 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.797421 4917 scope.go:117] "RemoveContainer" containerID="256f8ce82bcbadac34767fc05b95a7249fd4500c71a2a020dd30ba54395951e5" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.816555 4917 scope.go:117] "RemoveContainer" 
containerID="38f6a17087633473084cee1b528df5d10be62a5fb94b0daa33f536007711ed0e" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.823893 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mnbs\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-kube-api-access-8mnbs\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.823944 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-confd\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.823979 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-tls\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.824043 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-plugins-conf\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.824069 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a2ed8f1-269d-45fb-a766-46c867bd0a91-erlang-cookie-secret\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.824091 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-server-conf\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.824108 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.824148 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.824172 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a2ed8f1-269d-45fb-a766-46c867bd0a91-pod-info\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.824189 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-erlang-cookie\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.824239 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-plugins\") pod \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\" (UID: \"4a2ed8f1-269d-45fb-a766-46c867bd0a91\") " Mar 18 07:10:18 crc kubenswrapper[4917]: 
I0318 07:10:18.824497 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fe020eb-0bd4-4efa-9711-3f07ce31907c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.826226 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.826221 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.826533 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.830571 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-kube-api-access-8mnbs" (OuterVolumeSpecName: "kube-api-access-8mnbs") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "kube-api-access-8mnbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.831898 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a2ed8f1-269d-45fb-a766-46c867bd0a91-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.831936 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.832738 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.837103 4917 scope.go:117] "RemoveContainer" containerID="e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.844095 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data" (OuterVolumeSpecName: "config-data") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.844249 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4a2ed8f1-269d-45fb-a766-46c867bd0a91-pod-info" (OuterVolumeSpecName: "pod-info") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.861563 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-server-conf" (OuterVolumeSpecName: "server-conf") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.864275 4917 scope.go:117] "RemoveContainer" containerID="e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b" Mar 18 07:10:18 crc kubenswrapper[4917]: E0318 07:10:18.864751 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b\": container with ID starting with e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b not found: ID does not exist" containerID="e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.864792 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b"} err="failed to get container status \"e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b\": rpc error: code = NotFound desc = could not find container 
\"e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b\": container with ID starting with e8a8d294f1207f162b41355fe2791414a5cc1950dcf91f1321d4f2f941ade02b not found: ID does not exist" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.904826 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4a2ed8f1-269d-45fb-a766-46c867bd0a91" (UID: "4a2ed8f1-269d-45fb-a766-46c867bd0a91"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925723 4917 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925745 4917 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4a2ed8f1-269d-45fb-a766-46c867bd0a91-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925755 4917 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925778 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925787 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4a2ed8f1-269d-45fb-a766-46c867bd0a91-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: 
I0318 07:10:18.925795 4917 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4a2ed8f1-269d-45fb-a766-46c867bd0a91-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925804 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925814 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925823 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mnbs\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-kube-api-access-8mnbs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925830 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.925837 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4a2ed8f1-269d-45fb-a766-46c867bd0a91-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:18 crc kubenswrapper[4917]: I0318 07:10:18.969885 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.034203 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.069375 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-f48d59955-cprlv"] Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.086332 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-f48d59955-cprlv"] Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.294709 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.342093 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlb5g\" (UniqueName: \"kubernetes.io/projected/42cfbb53-8521-4c39-98ee-2d666b5682d3-kube-api-access-wlb5g\") pod \"42cfbb53-8521-4c39-98ee-2d666b5682d3\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.342531 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-config-data\") pod \"42cfbb53-8521-4c39-98ee-2d666b5682d3\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.342567 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-combined-ca-bundle\") pod \"42cfbb53-8521-4c39-98ee-2d666b5682d3\" (UID: \"42cfbb53-8521-4c39-98ee-2d666b5682d3\") " Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.347275 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cfbb53-8521-4c39-98ee-2d666b5682d3-kube-api-access-wlb5g" (OuterVolumeSpecName: "kube-api-access-wlb5g") pod "42cfbb53-8521-4c39-98ee-2d666b5682d3" (UID: 
"42cfbb53-8521-4c39-98ee-2d666b5682d3"). InnerVolumeSpecName "kube-api-access-wlb5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.362119 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-config-data" (OuterVolumeSpecName: "config-data") pod "42cfbb53-8521-4c39-98ee-2d666b5682d3" (UID: "42cfbb53-8521-4c39-98ee-2d666b5682d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.375540 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "42cfbb53-8521-4c39-98ee-2d666b5682d3" (UID: "42cfbb53-8521-4c39-98ee-2d666b5682d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.443627 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.443654 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/42cfbb53-8521-4c39-98ee-2d666b5682d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.443665 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlb5g\" (UniqueName: \"kubernetes.io/projected/42cfbb53-8521-4c39-98ee-2d666b5682d3-kube-api-access-wlb5g\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.515276 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = 
container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.515826 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.516182 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.516210 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.516269 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.518442 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.519738 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.519772 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.699930 4917 generic.go:334] "Generic (PLEG): container finished" podID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerID="5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a" exitCode=0 Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.700001 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vf8d" event={"ID":"f8ecba0e-a87f-4b57-9f3d-ac772febac6c","Type":"ContainerDied","Data":"5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a"} Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.702369 4917 generic.go:334] "Generic (PLEG): 
container finished" podID="42cfbb53-8521-4c39-98ee-2d666b5682d3" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" exitCode=0 Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.702413 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"42cfbb53-8521-4c39-98ee-2d666b5682d3","Type":"ContainerDied","Data":"ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8"} Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.702448 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"42cfbb53-8521-4c39-98ee-2d666b5682d3","Type":"ContainerDied","Data":"a2d4cc981f3c655f89ab55a6561adaa9b77a2bed3671e51864801f4be995d8d3"} Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.702465 4917 scope.go:117] "RemoveContainer" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.702458 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.709676 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4a2ed8f1-269d-45fb-a766-46c867bd0a91","Type":"ContainerDied","Data":"ea7460b86f89a147f264b57eef417205a660e20414287c6dc10f2350ca339469"} Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.709802 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.735810 4917 scope.go:117] "RemoveContainer" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.737380 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8\": container with ID starting with ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8 not found: ID does not exist" containerID="ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.737414 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8"} err="failed to get container status \"ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8\": rpc error: code = NotFound desc = could not find container \"ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8\": container with ID starting with ae87104b3ebd402ee3588be022da3695cb3e37e8a3fd82220c7f732f21c724e8 not found: ID does not exist" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.737438 4917 scope.go:117] "RemoveContainer" containerID="a60676e9973b2a6031a6ea6dba53de6eade7182dfd3e15a64da35230283bab60" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.780910 4917 scope.go:117] "RemoveContainer" containerID="89d1b01a410a58d6efadc5b7f3e235ac35c4482c50d32403ecd6e850c30e9231" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.788096 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11fb09df-78b6-44c6-a78f-2b720a98cfad" path="/var/lib/kubelet/pods/11fb09df-78b6-44c6-a78f-2b720a98cfad/volumes" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.789523 4917 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fe020eb-0bd4-4efa-9711-3f07ce31907c" path="/var/lib/kubelet/pods/3fe020eb-0bd4-4efa-9711-3f07ce31907c/volumes" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.791731 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84500248-15f0-4049-8423-43502d6587cf" path="/var/lib/kubelet/pods/84500248-15f0-4049-8423-43502d6587cf/volumes" Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.793923 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.794002 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.794032 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.794559 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 07:10:19 crc kubenswrapper[4917]: I0318 07:10:19.953396 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-df967876c-494l9" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.171:9696/\": dial tcp 10.217.0.171:9696: connect: connection refused" Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.960332 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89 is running failed: container process not found" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.960992 4917 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89 is running failed: container process not found" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.961305 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89 is running failed: container process not found" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 18 07:10:19 crc kubenswrapper[4917]: E0318 07:10:19.961335 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="dd4ec623-4dba-48e7-89f5-6cd3eadce847" containerName="nova-cell1-conductor-conductor" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.139485 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.260862 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-config-data\") pod \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.260967 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-combined-ca-bundle\") pod \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.261069 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w64nt\" (UniqueName: \"kubernetes.io/projected/dd4ec623-4dba-48e7-89f5-6cd3eadce847-kube-api-access-w64nt\") pod \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\" (UID: \"dd4ec623-4dba-48e7-89f5-6cd3eadce847\") " Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.264557 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4ec623-4dba-48e7-89f5-6cd3eadce847-kube-api-access-w64nt" (OuterVolumeSpecName: "kube-api-access-w64nt") pod "dd4ec623-4dba-48e7-89f5-6cd3eadce847" (UID: "dd4ec623-4dba-48e7-89f5-6cd3eadce847"). InnerVolumeSpecName "kube-api-access-w64nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.282722 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd4ec623-4dba-48e7-89f5-6cd3eadce847" (UID: "dd4ec623-4dba-48e7-89f5-6cd3eadce847"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.293100 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-config-data" (OuterVolumeSpecName: "config-data") pod "dd4ec623-4dba-48e7-89f5-6cd3eadce847" (UID: "dd4ec623-4dba-48e7-89f5-6cd3eadce847"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.362901 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w64nt\" (UniqueName: \"kubernetes.io/projected/dd4ec623-4dba-48e7-89f5-6cd3eadce847-kube-api-access-w64nt\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.362928 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.362938 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd4ec623-4dba-48e7-89f5-6cd3eadce847-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.745115 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vf8d" event={"ID":"f8ecba0e-a87f-4b57-9f3d-ac772febac6c","Type":"ContainerStarted","Data":"92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3"} Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.754794 4917 generic.go:334] "Generic (PLEG): container finished" podID="dd4ec623-4dba-48e7-89f5-6cd3eadce847" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" exitCode=0 Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.754862 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.754943 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd4ec623-4dba-48e7-89f5-6cd3eadce847","Type":"ContainerDied","Data":"d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89"} Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.755014 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"dd4ec623-4dba-48e7-89f5-6cd3eadce847","Type":"ContainerDied","Data":"747956bbe9b21ee96566757b3eb20cac4ee3bc212347d4a35bed675a9a4e4d1f"} Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.755045 4917 scope.go:117] "RemoveContainer" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.786383 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9vf8d" podStartSLOduration=3.281909444 podStartE2EDuration="5.786364633s" podCreationTimestamp="2026-03-18 07:10:15 +0000 UTC" firstStartedPulling="2026-03-18 07:10:17.639530704 +0000 UTC m=+1402.580685418" lastFinishedPulling="2026-03-18 07:10:20.143985893 +0000 UTC m=+1405.085140607" observedRunningTime="2026-03-18 07:10:20.781977906 +0000 UTC m=+1405.723132630" watchObservedRunningTime="2026-03-18 07:10:20.786364633 +0000 UTC m=+1405.727519347" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.803638 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.813615 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.815951 4917 scope.go:117] "RemoveContainer" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" Mar 18 07:10:20 
crc kubenswrapper[4917]: E0318 07:10:20.816291 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89\": container with ID starting with d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89 not found: ID does not exist" containerID="d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89" Mar 18 07:10:20 crc kubenswrapper[4917]: I0318 07:10:20.816324 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89"} err="failed to get container status \"d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89\": rpc error: code = NotFound desc = could not find container \"d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89\": container with ID starting with d5e4c2ce54a363193492618bc8d364fd4a956142baef3c3beb9e2a25e56e6a89 not found: ID does not exist" Mar 18 07:10:21 crc kubenswrapper[4917]: I0318 07:10:21.786036 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cfbb53-8521-4c39-98ee-2d666b5682d3" path="/var/lib/kubelet/pods/42cfbb53-8521-4c39-98ee-2d666b5682d3/volumes" Mar 18 07:10:21 crc kubenswrapper[4917]: I0318 07:10:21.787200 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" path="/var/lib/kubelet/pods/4a2ed8f1-269d-45fb-a766-46c867bd0a91/volumes" Mar 18 07:10:21 crc kubenswrapper[4917]: I0318 07:10:21.788229 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4ec623-4dba-48e7-89f5-6cd3eadce847" path="/var/lib/kubelet/pods/dd4ec623-4dba-48e7-89f5-6cd3eadce847/volumes" Mar 18 07:10:24 crc kubenswrapper[4917]: E0318 07:10:24.511798 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking 
if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:24 crc kubenswrapper[4917]: E0318 07:10:24.514060 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:24 crc kubenswrapper[4917]: E0318 07:10:24.514971 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:24 crc kubenswrapper[4917]: E0318 07:10:24.515661 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:24 crc kubenswrapper[4917]: E0318 07:10:24.515732 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" 
podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" Mar 18 07:10:24 crc kubenswrapper[4917]: E0318 07:10:24.516273 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:24 crc kubenswrapper[4917]: E0318 07:10:24.522884 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:24 crc kubenswrapper[4917]: E0318 07:10:24.522952 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd" Mar 18 07:10:25 crc kubenswrapper[4917]: I0318 07:10:25.854652 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:25 crc kubenswrapper[4917]: I0318 07:10:25.854720 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:26 crc kubenswrapper[4917]: I0318 07:10:26.930165 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9vf8d" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="registry-server" probeResult="failure" output=< Mar 18 07:10:26 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 
1s Mar 18 07:10:26 crc kubenswrapper[4917]: > Mar 18 07:10:29 crc kubenswrapper[4917]: E0318 07:10:29.511839 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:29 crc kubenswrapper[4917]: E0318 07:10:29.512486 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:29 crc kubenswrapper[4917]: E0318 07:10:29.512809 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:29 crc kubenswrapper[4917]: E0318 07:10:29.512841 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" Mar 18 07:10:29 crc kubenswrapper[4917]: E0318 07:10:29.513679 4917 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:29 crc kubenswrapper[4917]: E0318 07:10:29.515558 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:29 crc kubenswrapper[4917]: E0318 07:10:29.517276 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:29 crc kubenswrapper[4917]: E0318 07:10:29.517328 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd" Mar 18 07:10:32 crc kubenswrapper[4917]: I0318 07:10:32.892552 4917 generic.go:334] "Generic (PLEG): container finished" podID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerID="65ec14bc4fbd1d630fe7a906e20bb195b244f65201d429c85361678850908b23" exitCode=0 Mar 18 07:10:32 crc kubenswrapper[4917]: I0318 07:10:32.892633 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df967876c-494l9" 
event={"ID":"4be30c2f-97f1-4477-8322-8ed29dbd3c60","Type":"ContainerDied","Data":"65ec14bc4fbd1d630fe7a906e20bb195b244f65201d429c85361678850908b23"} Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.184632 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df967876c-494l9" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.271448 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-internal-tls-certs\") pod \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.272212 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-config\") pod \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.272664 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-ovndb-tls-certs\") pod \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.272931 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-combined-ca-bundle\") pod \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.273961 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-public-tls-certs\") 
pod \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.275680 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt6v6\" (UniqueName: \"kubernetes.io/projected/4be30c2f-97f1-4477-8322-8ed29dbd3c60-kube-api-access-mt6v6\") pod \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.276025 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-httpd-config\") pod \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\" (UID: \"4be30c2f-97f1-4477-8322-8ed29dbd3c60\") " Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.281453 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be30c2f-97f1-4477-8322-8ed29dbd3c60-kube-api-access-mt6v6" (OuterVolumeSpecName: "kube-api-access-mt6v6") pod "4be30c2f-97f1-4477-8322-8ed29dbd3c60" (UID: "4be30c2f-97f1-4477-8322-8ed29dbd3c60"). InnerVolumeSpecName "kube-api-access-mt6v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.282540 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4be30c2f-97f1-4477-8322-8ed29dbd3c60" (UID: "4be30c2f-97f1-4477-8322-8ed29dbd3c60"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.334838 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-config" (OuterVolumeSpecName: "config") pod "4be30c2f-97f1-4477-8322-8ed29dbd3c60" (UID: "4be30c2f-97f1-4477-8322-8ed29dbd3c60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.341857 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4be30c2f-97f1-4477-8322-8ed29dbd3c60" (UID: "4be30c2f-97f1-4477-8322-8ed29dbd3c60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.345409 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4be30c2f-97f1-4477-8322-8ed29dbd3c60" (UID: "4be30c2f-97f1-4477-8322-8ed29dbd3c60"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.351198 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4be30c2f-97f1-4477-8322-8ed29dbd3c60" (UID: "4be30c2f-97f1-4477-8322-8ed29dbd3c60"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.370162 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4be30c2f-97f1-4477-8322-8ed29dbd3c60" (UID: "4be30c2f-97f1-4477-8322-8ed29dbd3c60"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.378155 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.378188 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt6v6\" (UniqueName: \"kubernetes.io/projected/4be30c2f-97f1-4477-8322-8ed29dbd3c60-kube-api-access-mt6v6\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.378203 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.378215 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.378226 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-config\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.378237 4917 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.378250 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4be30c2f-97f1-4477-8322-8ed29dbd3c60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.911893 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-df967876c-494l9" event={"ID":"4be30c2f-97f1-4477-8322-8ed29dbd3c60","Type":"ContainerDied","Data":"ddd8f5ab47d54228e6650aa6597aac7478e65c7c29f6f7b7b6697aa9ff43a1b4"} Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.911978 4917 scope.go:117] "RemoveContainer" containerID="73299c429264676075eece28a8b7265bc2bcf9550d93b8dd8a305ef5c9be5061" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.911987 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-df967876c-494l9" Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.953180 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-df967876c-494l9"] Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.964120 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-df967876c-494l9"] Mar 18 07:10:33 crc kubenswrapper[4917]: I0318 07:10:33.967615 4917 scope.go:117] "RemoveContainer" containerID="65ec14bc4fbd1d630fe7a906e20bb195b244f65201d429c85361678850908b23" Mar 18 07:10:34 crc kubenswrapper[4917]: E0318 07:10:34.512112 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:34 crc kubenswrapper[4917]: E0318 07:10:34.512726 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:34 crc kubenswrapper[4917]: E0318 07:10:34.513167 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 18 07:10:34 crc kubenswrapper[4917]: E0318 07:10:34.513246 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" Mar 18 07:10:34 crc kubenswrapper[4917]: E0318 07:10:34.515052 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:34 crc kubenswrapper[4917]: E0318 07:10:34.517402 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:34 crc kubenswrapper[4917]: E0318 07:10:34.521070 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 18 07:10:34 crc kubenswrapper[4917]: E0318 07:10:34.521172 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd" Mar 18 07:10:35 crc kubenswrapper[4917]: I0318 07:10:35.785706 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" path="/var/lib/kubelet/pods/4be30c2f-97f1-4477-8322-8ed29dbd3c60/volumes" Mar 18 07:10:35 crc kubenswrapper[4917]: I0318 07:10:35.939043 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:36 crc kubenswrapper[4917]: I0318 07:10:36.026821 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:36 crc kubenswrapper[4917]: I0318 07:10:36.190104 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vf8d"] Mar 18 07:10:37 crc kubenswrapper[4917]: I0318 07:10:37.960456 4917 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-9vf8d" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="registry-server" containerID="cri-o://92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3" gracePeriod=2 Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.486461 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.567315 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-catalog-content\") pod \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.567376 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g8sk\" (UniqueName: \"kubernetes.io/projected/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-kube-api-access-4g8sk\") pod \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.567437 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-utilities\") pod \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\" (UID: \"f8ecba0e-a87f-4b57-9f3d-ac772febac6c\") " Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.568552 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-utilities" (OuterVolumeSpecName: "utilities") pod "f8ecba0e-a87f-4b57-9f3d-ac772febac6c" (UID: "f8ecba0e-a87f-4b57-9f3d-ac772febac6c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.575198 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-kube-api-access-4g8sk" (OuterVolumeSpecName: "kube-api-access-4g8sk") pod "f8ecba0e-a87f-4b57-9f3d-ac772febac6c" (UID: "f8ecba0e-a87f-4b57-9f3d-ac772febac6c"). InnerVolumeSpecName "kube-api-access-4g8sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.669173 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g8sk\" (UniqueName: \"kubernetes.io/projected/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-kube-api-access-4g8sk\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.669213 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.733774 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f8ecba0e-a87f-4b57-9f3d-ac772febac6c" (UID: "f8ecba0e-a87f-4b57-9f3d-ac772febac6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.770707 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f8ecba0e-a87f-4b57-9f3d-ac772febac6c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.975067 4917 generic.go:334] "Generic (PLEG): container finished" podID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerID="92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3" exitCode=0 Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.975281 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vf8d" event={"ID":"f8ecba0e-a87f-4b57-9f3d-ac772febac6c","Type":"ContainerDied","Data":"92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3"} Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.975820 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9vf8d" event={"ID":"f8ecba0e-a87f-4b57-9f3d-ac772febac6c","Type":"ContainerDied","Data":"c0474d9ccdb51a9fead20a6271c62220afdb96a369b53d576c17cf1111c10417"} Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.975849 4917 scope.go:117] "RemoveContainer" containerID="92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.975301 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9vf8d" Mar 18 07:10:38 crc kubenswrapper[4917]: I0318 07:10:38.995066 4917 scope.go:117] "RemoveContainer" containerID="5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a" Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.012146 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9vf8d"] Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.018776 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9vf8d"] Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.034748 4917 scope.go:117] "RemoveContainer" containerID="1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da" Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.060158 4917 scope.go:117] "RemoveContainer" containerID="92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3" Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.060717 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3\": container with ID starting with 92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3 not found: ID does not exist" containerID="92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3" Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.060751 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3"} err="failed to get container status \"92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3\": rpc error: code = NotFound desc = could not find container \"92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3\": container with ID starting with 92e335a5d6a55a7c95639fcf8e3168e28b158643115d9ac7913351c25d2e53a3 not found: ID does 
not exist" Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.060773 4917 scope.go:117] "RemoveContainer" containerID="5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a" Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.061182 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a\": container with ID starting with 5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a not found: ID does not exist" containerID="5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a" Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.061208 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a"} err="failed to get container status \"5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a\": rpc error: code = NotFound desc = could not find container \"5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a\": container with ID starting with 5649799fdb464a5f2ce0a66cedaea8d47d3c32a366d00bbeb2aeb4865f83dc8a not found: ID does not exist" Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.061222 4917 scope.go:117] "RemoveContainer" containerID="1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da" Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.061730 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da\": container with ID starting with 1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da not found: ID does not exist" containerID="1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da" Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.061778 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da"} err="failed to get container status \"1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da\": rpc error: code = NotFound desc = could not find container \"1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da\": container with ID starting with 1f5fae4066baaa539fd35e55cd2372251d9c0f98af3719796f6ab236264330da not found: ID does not exist"
Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.512073 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.513175 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.514088 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.514194 4917 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server"
Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.514113 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.517000 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.518863 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Mar 18 07:10:39 crc kubenswrapper[4917]: E0318 07:10:39.518929 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-jwxq2" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd"
Mar 18 07:10:39 crc kubenswrapper[4917]: I0318 07:10:39.787063 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" path="/var/lib/kubelet/pods/f8ecba0e-a87f-4b57-9f3d-ac772febac6c/volumes"
Mar 18 07:10:40 crc kubenswrapper[4917]: I0318 07:10:40.963468 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.006563 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jwxq2_dbba9465-1e4b-4b42-b512-addd628093d3/ovs-vswitchd/0.log"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.007932 4917 generic.go:334] "Generic (PLEG): container finished" podID="dbba9465-1e4b-4b42-b512-addd628093d3" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" exitCode=137
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.008010 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jwxq2" event={"ID":"dbba9465-1e4b-4b42-b512-addd628093d3","Type":"ContainerDied","Data":"5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8"}
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.025935 4917 generic.go:334] "Generic (PLEG): container finished" podID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerID="44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00" exitCode=137
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.025974 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00"}
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.026001 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7385bce6-e9e7-4f8b-84be-9afb342f7134","Type":"ContainerDied","Data":"04516f85c5dee38cd77003da96b1ab2262b612ddd5b119b0d301a50d9f551d59"}
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.026017 4917 scope.go:117] "RemoveContainer" containerID="44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.026186 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.055854 4917 scope.go:117] "RemoveContainer" containerID="480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.086385 4917 scope.go:117] "RemoveContainer" containerID="07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.110229 4917 scope.go:117] "RemoveContainer" containerID="8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.124030 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385bce6-e9e7-4f8b-84be-9afb342f7134-combined-ca-bundle\") pod \"7385bce6-e9e7-4f8b-84be-9afb342f7134\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.124085 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"7385bce6-e9e7-4f8b-84be-9afb342f7134\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.124126 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-cache\") pod \"7385bce6-e9e7-4f8b-84be-9afb342f7134\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.124146 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-lock\") pod \"7385bce6-e9e7-4f8b-84be-9afb342f7134\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.124178 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg8sq\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-kube-api-access-lg8sq\") pod \"7385bce6-e9e7-4f8b-84be-9afb342f7134\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.124315 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") pod \"7385bce6-e9e7-4f8b-84be-9afb342f7134\" (UID: \"7385bce6-e9e7-4f8b-84be-9afb342f7134\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.125324 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-cache" (OuterVolumeSpecName: "cache") pod "7385bce6-e9e7-4f8b-84be-9afb342f7134" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.126117 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-lock" (OuterVolumeSpecName: "lock") pod "7385bce6-e9e7-4f8b-84be-9afb342f7134" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.134832 4917 scope.go:117] "RemoveContainer" containerID="002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.134925 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "swift") pod "7385bce6-e9e7-4f8b-84be-9afb342f7134" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.134934 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7385bce6-e9e7-4f8b-84be-9afb342f7134" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.143337 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-kube-api-access-lg8sq" (OuterVolumeSpecName: "kube-api-access-lg8sq") pod "7385bce6-e9e7-4f8b-84be-9afb342f7134" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134"). InnerVolumeSpecName "kube-api-access-lg8sq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.172638 4917 scope.go:117] "RemoveContainer" containerID="42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.189125 4917 scope.go:117] "RemoveContainer" containerID="5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.206048 4917 scope.go:117] "RemoveContainer" containerID="12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.216224 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jwxq2_dbba9465-1e4b-4b42-b512-addd628093d3/ovs-vswitchd/0.log"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.217888 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jwxq2"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.223443 4917 scope.go:117] "RemoveContainer" containerID="8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.225402 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.225430 4917 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-cache\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.225442 4917 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7385bce6-e9e7-4f8b-84be-9afb342f7134-lock\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.225453 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg8sq\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-kube-api-access-lg8sq\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.225465 4917 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7385bce6-e9e7-4f8b-84be-9afb342f7134-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.247568 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.253372 4917 scope.go:117] "RemoveContainer" containerID="33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.293892 4917 scope.go:117] "RemoveContainer" containerID="ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.320473 4917 scope.go:117] "RemoveContainer" containerID="1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326205 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-log\") pod \"dbba9465-1e4b-4b42-b512-addd628093d3\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326261 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-log" (OuterVolumeSpecName: "var-log") pod "dbba9465-1e4b-4b42-b512-addd628093d3" (UID: "dbba9465-1e4b-4b42-b512-addd628093d3"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326311 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-run\") pod \"dbba9465-1e4b-4b42-b512-addd628093d3\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326395 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-run" (OuterVolumeSpecName: "var-run") pod "dbba9465-1e4b-4b42-b512-addd628093d3" (UID: "dbba9465-1e4b-4b42-b512-addd628093d3"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326400 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24kf9\" (UniqueName: \"kubernetes.io/projected/dbba9465-1e4b-4b42-b512-addd628093d3-kube-api-access-24kf9\") pod \"dbba9465-1e4b-4b42-b512-addd628093d3\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326476 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-etc-ovs\") pod \"dbba9465-1e4b-4b42-b512-addd628093d3\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326506 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-lib\") pod \"dbba9465-1e4b-4b42-b512-addd628093d3\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326543 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbba9465-1e4b-4b42-b512-addd628093d3-scripts\") pod \"dbba9465-1e4b-4b42-b512-addd628093d3\" (UID: \"dbba9465-1e4b-4b42-b512-addd628093d3\") "
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326707 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "dbba9465-1e4b-4b42-b512-addd628093d3" (UID: "dbba9465-1e4b-4b42-b512-addd628093d3"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.326782 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-lib" (OuterVolumeSpecName: "var-lib") pod "dbba9465-1e4b-4b42-b512-addd628093d3" (UID: "dbba9465-1e4b-4b42-b512-addd628093d3"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.327864 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbba9465-1e4b-4b42-b512-addd628093d3-scripts" (OuterVolumeSpecName: "scripts") pod "dbba9465-1e4b-4b42-b512-addd628093d3" (UID: "dbba9465-1e4b-4b42-b512-addd628093d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.329769 4917 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-run\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.329792 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.329801 4917 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-etc-ovs\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.329811 4917 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-lib\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.329820 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbba9465-1e4b-4b42-b512-addd628093d3-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.329831 4917 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dbba9465-1e4b-4b42-b512-addd628093d3-var-log\") on node \"crc\" DevicePath \"\""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.330873 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbba9465-1e4b-4b42-b512-addd628093d3-kube-api-access-24kf9" (OuterVolumeSpecName: "kube-api-access-24kf9") pod "dbba9465-1e4b-4b42-b512-addd628093d3" (UID: "dbba9465-1e4b-4b42-b512-addd628093d3"). InnerVolumeSpecName "kube-api-access-24kf9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.338127 4917 scope.go:117] "RemoveContainer" containerID="fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.352789 4917 scope.go:117] "RemoveContainer" containerID="473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.372483 4917 scope.go:117] "RemoveContainer" containerID="d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.389426 4917 scope.go:117] "RemoveContainer" containerID="44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.390036 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00\": container with ID starting with 44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00 not found: ID does not exist" containerID="44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.390078 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00"} err="failed to get container status \"44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00\": rpc error: code = NotFound desc = could not find container \"44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00\": container with ID starting with 44582f81ca2505fa44d4d2931657fd2e1f65fc210c7a41e84e9d2a2fc5c42b00 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.390100 4917 scope.go:117] "RemoveContainer" containerID="480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.390370 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7\": container with ID starting with 480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7 not found: ID does not exist" containerID="480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.390400 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7"} err="failed to get container status \"480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7\": rpc error: code = NotFound desc = could not find container \"480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7\": container with ID starting with 480f82b360083ec61ccc5aea78ac5ac98c3b7b59cd983b0a6f611ea643fe81c7 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.390414 4917 scope.go:117] "RemoveContainer" containerID="07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.390887 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0\": container with ID starting with 07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0 not found: ID does not exist" containerID="07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.390922 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0"} err="failed to get container status \"07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0\": rpc error: code = NotFound desc = could not find container \"07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0\": container with ID starting with 07bca469e350565aae1cd2eeaf9d578b8614d0f500b6eecc0cd3fc15c63d21a0 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.390949 4917 scope.go:117] "RemoveContainer" containerID="8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.391377 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a\": container with ID starting with 8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a not found: ID does not exist" containerID="8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.391399 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a"} err="failed to get container status \"8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a\": rpc error: code = NotFound desc = could not find container \"8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a\": container with ID starting with 8a4aa80d38200abfca8552caeec199a32abc00d907fb8d05e08a1ed24dc9de2a not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.391411 4917 scope.go:117] "RemoveContainer" containerID="002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.391649 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182\": container with ID starting with 002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182 not found: ID does not exist" containerID="002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.391669 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182"} err="failed to get container status \"002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182\": rpc error: code = NotFound desc = could not find container \"002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182\": container with ID starting with 002276fe0ca9a71c057df6b48fcd1aedc5f0114a76e0d171f614c1b1c5c32182 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.391680 4917 scope.go:117] "RemoveContainer" containerID="42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.391989 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0\": container with ID starting with 42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0 not found: ID does not exist" containerID="42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.392007 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0"} err="failed to get container status \"42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0\": rpc error: code = NotFound desc = could not find container \"42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0\": container with ID starting with 42ba7ee8a0f0b918dd698008dd108bc316b2c51039a2adc5497607791bef2da0 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.392019 4917 scope.go:117] "RemoveContainer" containerID="5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.392328 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056\": container with ID starting with 5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056 not found: ID does not exist" containerID="5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.392356 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056"} err="failed to get container status \"5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056\": rpc error: code = NotFound desc = could not find container \"5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056\": container with ID starting with 5e953936b0248965db03575e799a1f438882efdf1a462ceda5d609b87a8e7056 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.392378 4917 scope.go:117] "RemoveContainer" containerID="12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.392645 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84\": container with ID starting with 12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84 not found: ID does not exist" containerID="12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.392665 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84"} err="failed to get container status \"12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84\": rpc error: code = NotFound desc = could not find container \"12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84\": container with ID starting with 12b4da77c15b0e3abf9589e469af08a2dc756735349b26715735315b3d83cc84 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.392682 4917 scope.go:117] "RemoveContainer" containerID="8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.392996 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a\": container with ID starting with 8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a not found: ID does not exist" containerID="8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.393016 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a"} err="failed to get container status \"8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a\": rpc error: code = NotFound desc = could not find container \"8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a\": container with ID starting with 8d35e63e7548680948b500c246c9265b7615bf2471e1abd03a661f7227ec082a not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.393032 4917 scope.go:117] "RemoveContainer" containerID="33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.393341 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5\": container with ID starting with 33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5 not found: ID does not exist" containerID="33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.393373 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5"} err="failed to get container status \"33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5\": rpc error: code = NotFound desc = could not find container \"33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5\": container with ID starting with 33b84f0fcac2919129cc88cdf57987804d8e14fb025393e30cb67cbb264992c5 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.393388 4917 scope.go:117] "RemoveContainer" containerID="ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.393675 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e\": container with ID starting with ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e not found: ID does not exist" containerID="ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.393695 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e"} err="failed to get container status \"ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e\": rpc error: code = NotFound desc = could not find container \"ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e\": container with ID starting with ec7ef2fb6e05daee3ee7de33cfedff9b962db9dfcc0268346375d7359bdc7d4e not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.393708 4917 scope.go:117] "RemoveContainer" containerID="1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.393960 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850\": container with ID starting with 1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850 not found: ID does not exist" containerID="1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.393978 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850"} err="failed to get container status \"1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850\": rpc error: code = NotFound desc = could not find container \"1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850\": container with ID starting with 1b40bdd623b932aef3582f254f0bf1bdd37257dd5b1f58019eb856b2ac66f850 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.393989 4917 scope.go:117] "RemoveContainer" containerID="fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.394251 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62\": container with ID starting with fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62 not found: ID does not exist" containerID="fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.394276 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62"} err="failed to get container status \"fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62\": rpc error: code = NotFound desc = could not find container \"fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62\": container with ID starting with fe91bd43988c8705835fbc8012c5ad2d259bffffe7e96905fcbc2f3dfa701a62 not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.394290 4917 scope.go:117] "RemoveContainer" containerID="473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.394561 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a\": container with ID starting with 473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a not found: ID does not exist" containerID="473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.394608 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a"} err="failed to get container status \"473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a\": rpc error: code = NotFound desc = could not find container \"473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a\": container with ID starting with 473288e3d9cb91258a424a2fbdc3f9cd2f7670ebdd7fe46358e5db9a1a4b316a not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.394627 4917 scope.go:117] "RemoveContainer" containerID="d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b"
Mar 18 07:10:41 crc kubenswrapper[4917]: E0318 07:10:41.394894 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b\": container with ID starting with d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b not found: ID does not exist" containerID="d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.394927 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b"} err="failed to get container status \"d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b\": rpc error: code = NotFound desc = could not find container \"d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b\": container with ID starting with d8566b0dccbbc41337078498a5435684bd4bc3b3375f9eb7d82cf24c5fe00c8b not found: ID does not exist"
Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.423911 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7385bce6-e9e7-4f8b-84be-9afb342f7134-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7385bce6-e9e7-4f8b-84be-9afb342f7134" (UID: "7385bce6-e9e7-4f8b-84be-9afb342f7134"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.430613 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7385bce6-e9e7-4f8b-84be-9afb342f7134-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.430642 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24kf9\" (UniqueName: \"kubernetes.io/projected/dbba9465-1e4b-4b42-b512-addd628093d3-kube-api-access-24kf9\") on node \"crc\" DevicePath \"\"" Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.689306 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.694352 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 18 07:10:41 crc kubenswrapper[4917]: I0318 07:10:41.785915 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" path="/var/lib/kubelet/pods/7385bce6-e9e7-4f8b-84be-9afb342f7134/volumes" Mar 18 07:10:42 crc kubenswrapper[4917]: I0318 07:10:42.039778 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jwxq2_dbba9465-1e4b-4b42-b512-addd628093d3/ovs-vswitchd/0.log" Mar 18 07:10:42 crc kubenswrapper[4917]: I0318 07:10:42.041240 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jwxq2" event={"ID":"dbba9465-1e4b-4b42-b512-addd628093d3","Type":"ContainerDied","Data":"f6faece2d16c067365c6f3099d85ada197216bdcf072c28f76abfc592fa49775"} Mar 18 07:10:42 crc kubenswrapper[4917]: I0318 07:10:42.041296 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jwxq2" Mar 18 07:10:42 crc kubenswrapper[4917]: I0318 07:10:42.041312 4917 scope.go:117] "RemoveContainer" containerID="5af4bc8f062e2d5f9687d2e8ca7c163c53bd594c9626130542fd836bf6ba65e8" Mar 18 07:10:42 crc kubenswrapper[4917]: I0318 07:10:42.072225 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-jwxq2"] Mar 18 07:10:42 crc kubenswrapper[4917]: I0318 07:10:42.078454 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-jwxq2"] Mar 18 07:10:42 crc kubenswrapper[4917]: I0318 07:10:42.081059 4917 scope.go:117] "RemoveContainer" containerID="faa48031f43916214825e57a52dd5b5a45c429d4a830dbd5ad8924f9122c9304" Mar 18 07:10:42 crc kubenswrapper[4917]: I0318 07:10:42.111810 4917 scope.go:117] "RemoveContainer" containerID="f1fb7e855791c24d686bd1d7eb986bfa5f8a90f822e71b3a82a95f7eccb12cb5" Mar 18 07:10:43 crc kubenswrapper[4917]: I0318 07:10:43.784359 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" path="/var/lib/kubelet/pods/dbba9465-1e4b-4b42-b512-addd628093d3/volumes" Mar 18 07:11:00 crc kubenswrapper[4917]: I0318 07:11:00.239770 4917 scope.go:117] "RemoveContainer" containerID="519a8823772650121fd4e0d821d4540e3c6794d038c69e58bbdf20d9315eb955" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.892579 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h6qtc"] Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893419 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84500248-15f0-4049-8423-43502d6587cf" containerName="galera" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893436 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="84500248-15f0-4049-8423-43502d6587cf" containerName="galera" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893449 4917 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-updater" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893457 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-updater" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893475 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="rsync" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893483 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="rsync" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893500 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server-init" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893509 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server-init" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893522 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="registry-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893529 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="registry-server" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893544 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="proxy-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893552 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="proxy-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893562 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-replicator" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893569 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-replicator" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893577 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893604 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api-log" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893618 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fb09df-78b6-44c6-a78f-2b720a98cfad" containerName="setup-container" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893626 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fb09df-78b6-44c6-a78f-2b720a98cfad" containerName="setup-container" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893636 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84500248-15f0-4049-8423-43502d6587cf" containerName="mysql-bootstrap" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893644 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="84500248-15f0-4049-8423-43502d6587cf" containerName="mysql-bootstrap" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893659 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893666 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-log" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893677 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerName="barbican-worker-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893686 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerName="barbican-worker-log" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893699 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerName="barbican-keystone-listener-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893707 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerName="barbican-keystone-listener-log" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893718 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-expirer" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893726 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-expirer" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893734 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893742 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893755 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893807 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-server" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893829 4917 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="42cfbb53-8521-4c39-98ee-2d666b5682d3" containerName="nova-cell0-conductor-conductor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893837 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cfbb53-8521-4c39-98ee-2d666b5682d3" containerName="nova-cell0-conductor-conductor" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893846 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" containerName="setup-container" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893854 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" containerName="setup-container" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893892 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="extract-utilities" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893901 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="extract-utilities" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893915 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893923 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893941 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="sg-core" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893948 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="sg-core" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893982 4917 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="ceilometer-central-agent" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.893991 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="ceilometer-central-agent" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.893999 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerName="barbican-worker" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894007 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerName="barbican-worker" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894017 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="openstack-network-exporter" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894025 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="openstack-network-exporter" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894057 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894067 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894077 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-auditor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894157 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-auditor" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894173 4917 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="ceilometer-notification-agent" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894181 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="ceilometer-notification-agent" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894197 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" containerName="mariadb-account-create-update" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894204 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" containerName="mariadb-account-create-update" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894216 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-auditor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894247 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-auditor" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894263 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-replicator" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894271 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-replicator" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894283 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894290 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-httpd" Mar 18 07:11:51 crc 
kubenswrapper[4917]: E0318 07:11:51.894299 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="033e4d58-03e5-49fa-ad5b-169464bf7ba9" containerName="memcached" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894331 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="033e4d58-03e5-49fa-ad5b-169464bf7ba9" containerName="memcached" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894343 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fe020eb-0bd4-4efa-9711-3f07ce31907c" containerName="keystone-api" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894351 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fe020eb-0bd4-4efa-9711-3f07ce31907c" containerName="keystone-api" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894361 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-auditor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894369 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-auditor" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894378 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62df143-348a-4ec7-b331-04db3857e847" containerName="nova-scheduler-scheduler" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894389 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62df143-348a-4ec7-b331-04db3857e847" containerName="nova-scheduler-scheduler" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894426 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="extract-content" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894434 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="extract-content" Mar 18 07:11:51 crc 
kubenswrapper[4917]: E0318 07:11:51.894447 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="swift-recon-cron" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894455 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="swift-recon-cron" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894464 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894471 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-server" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894502 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-reaper" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894510 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-reaper" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894518 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-updater" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894527 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-updater" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894541 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894549 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-server" Mar 18 07:11:51 crc 
kubenswrapper[4917]: E0318 07:11:51.894559 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-replicator" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894566 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-replicator" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894576 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="ovn-northd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894597 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="ovn-northd" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894609 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894616 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-log" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894626 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11fb09df-78b6-44c6-a78f-2b720a98cfad" containerName="rabbitmq" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894635 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fb09df-78b6-44c6-a78f-2b720a98cfad" containerName="rabbitmq" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894645 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerName="barbican-keystone-listener" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894653 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerName="barbican-keystone-listener" Mar 18 07:11:51 crc kubenswrapper[4917]: 
E0318 07:11:51.894665 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-api" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894672 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-api" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894683 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894691 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894702 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" containerName="mariadb-account-create-update" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894710 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" containerName="mariadb-account-create-update" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894721 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" containerName="rabbitmq" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894729 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" containerName="rabbitmq" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894739 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894747 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovs-vswitchd" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 
07:11:51.894756 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4ec623-4dba-48e7-89f5-6cd3eadce847" containerName="nova-cell1-conductor-conductor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894764 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4ec623-4dba-48e7-89f5-6cd3eadce847" containerName="nova-cell1-conductor-conductor" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894772 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-metadata" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894779 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-metadata" Mar 18 07:11:51 crc kubenswrapper[4917]: E0318 07:11:51.894788 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894796 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894958 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-expirer" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894973 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerName="barbican-worker" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894983 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.894992 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-replicator" Mar 18 
07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895003 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895013 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4ec623-4dba-48e7-89f5-6cd3eadce847" containerName="nova-cell1-conductor-conductor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895023 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="proxy-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895033 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="rsync" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895045 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerName="barbican-keystone-listener" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895055 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895065 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895077 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895088 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="ceilometer-central-agent" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895097 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="ceilometer-notification-agent" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895128 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895138 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3ff50e-a301-4abe-bbaf-2b0075b80b47" containerName="barbican-api-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895149 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2ed8f1-269d-45fb-a766-46c867bd0a91" containerName="rabbitmq" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895158 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-replicator" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895172 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a4114f4-5182-4b59-b9be-72a6f4ed11fb" containerName="sg-core" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895186 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62df143-348a-4ec7-b331-04db3857e847" containerName="nova-scheduler-scheduler" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895194 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="openstack-network-exporter" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895206 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="156b1187-23ee-4a81-8d1d-ad91c2468b7d" containerName="barbican-keystone-listener-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895214 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-reaper" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 
07:11:51.895222 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cfbb53-8521-4c39-98ee-2d666b5682d3" containerName="nova-cell0-conductor-conductor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895233 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bf8cc3-5674-418a-a126-f43e3d2f092d" containerName="glance-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895242 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be30c2f-97f1-4477-8322-8ed29dbd3c60" containerName="neutron-api" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895253 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-auditor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895264 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="84500248-15f0-4049-8423-43502d6587cf" containerName="galera" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895271 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fe020eb-0bd4-4efa-9711-3f07ce31907c" containerName="keystone-api" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895281 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="swift-recon-cron" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895290 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3eb3c95e-6b59-4ca7-aef2-8bae3eb9dca6" containerName="barbican-worker-log" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895299 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b83ec86-5c66-4dc6-9236-a437f37611a9" containerName="ovn-northd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895309 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-auditor" Mar 18 07:11:51 crc 
kubenswrapper[4917]: I0318 07:11:51.895318 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" containerName="mariadb-account-create-update" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895330 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" containerName="ovsdb-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895341 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-updater" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895352 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-auditor" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895364 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="object-updater" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895373 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895383 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="033e4d58-03e5-49fa-ad5b-169464bf7ba9" containerName="memcached" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895395 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a55197a-92c3-451c-9d5d-d3a6426c995b" containerName="glance-httpd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895405 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="container-replicator" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895416 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbba9465-1e4b-4b42-b512-addd628093d3" 
containerName="ovs-vswitchd" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895424 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f9f629f-c02c-4416-ae21-2b49cea903b5" containerName="mariadb-account-create-update" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895436 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8ecba0e-a87f-4b57-9f3d-ac772febac6c" containerName="registry-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895445 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="06578cec-8f13-4daf-966e-89f743a134fe" containerName="nova-metadata-metadata" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895458 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7385bce6-e9e7-4f8b-84be-9afb342f7134" containerName="account-server" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.895474 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="11fb09df-78b6-44c6-a78f-2b720a98cfad" containerName="rabbitmq" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.896878 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.919550 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6qtc"] Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.990569 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-catalog-content\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.990686 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqqnh\" (UniqueName: \"kubernetes.io/projected/30105150-84f4-44e0-a05d-34e7eb5fab44-kube-api-access-sqqnh\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:51 crc kubenswrapper[4917]: I0318 07:11:51.990758 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-utilities\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.091755 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqqnh\" (UniqueName: \"kubernetes.io/projected/30105150-84f4-44e0-a05d-34e7eb5fab44-kube-api-access-sqqnh\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.091834 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-utilities\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.091897 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-catalog-content\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.092392 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-catalog-content\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.092428 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-utilities\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.114950 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqqnh\" (UniqueName: \"kubernetes.io/projected/30105150-84f4-44e0-a05d-34e7eb5fab44-kube-api-access-sqqnh\") pod \"community-operators-h6qtc\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.217217 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.696740 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6qtc"] Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.911003 4917 generic.go:334] "Generic (PLEG): container finished" podID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerID="0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3" exitCode=0 Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.911072 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6qtc" event={"ID":"30105150-84f4-44e0-a05d-34e7eb5fab44","Type":"ContainerDied","Data":"0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3"} Mar 18 07:11:52 crc kubenswrapper[4917]: I0318 07:11:52.911142 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6qtc" event={"ID":"30105150-84f4-44e0-a05d-34e7eb5fab44","Type":"ContainerStarted","Data":"3b752e10d2b8ec24b4ee1d9eb4c2bc4ab3b87ea2e97972e61c7ee07c732399aa"} Mar 18 07:11:54 crc kubenswrapper[4917]: I0318 07:11:54.936256 4917 generic.go:334] "Generic (PLEG): container finished" podID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerID="778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e" exitCode=0 Mar 18 07:11:54 crc kubenswrapper[4917]: I0318 07:11:54.936486 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6qtc" event={"ID":"30105150-84f4-44e0-a05d-34e7eb5fab44","Type":"ContainerDied","Data":"778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e"} Mar 18 07:11:55 crc kubenswrapper[4917]: I0318 07:11:55.950734 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6qtc" 
event={"ID":"30105150-84f4-44e0-a05d-34e7eb5fab44","Type":"ContainerStarted","Data":"bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe"} Mar 18 07:11:55 crc kubenswrapper[4917]: I0318 07:11:55.980247 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h6qtc" podStartSLOduration=2.579192394 podStartE2EDuration="4.980216336s" podCreationTimestamp="2026-03-18 07:11:51 +0000 UTC" firstStartedPulling="2026-03-18 07:11:52.913768007 +0000 UTC m=+1497.854922721" lastFinishedPulling="2026-03-18 07:11:55.314791939 +0000 UTC m=+1500.255946663" observedRunningTime="2026-03-18 07:11:55.970437925 +0000 UTC m=+1500.911592669" watchObservedRunningTime="2026-03-18 07:11:55.980216336 +0000 UTC m=+1500.921371080" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.167873 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563632-cpv84"] Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.169161 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563632-cpv84" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.174641 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.174839 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.176769 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.190712 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563632-cpv84"] Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.221028 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjfd\" (UniqueName: \"kubernetes.io/projected/5dfd320a-b048-45a8-951e-0268f8931561-kube-api-access-ppjfd\") pod \"auto-csr-approver-29563632-cpv84\" (UID: \"5dfd320a-b048-45a8-951e-0268f8931561\") " pod="openshift-infra/auto-csr-approver-29563632-cpv84" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.322253 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjfd\" (UniqueName: \"kubernetes.io/projected/5dfd320a-b048-45a8-951e-0268f8931561-kube-api-access-ppjfd\") pod \"auto-csr-approver-29563632-cpv84\" (UID: \"5dfd320a-b048-45a8-951e-0268f8931561\") " pod="openshift-infra/auto-csr-approver-29563632-cpv84" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.347064 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjfd\" (UniqueName: \"kubernetes.io/projected/5dfd320a-b048-45a8-951e-0268f8931561-kube-api-access-ppjfd\") pod \"auto-csr-approver-29563632-cpv84\" (UID: \"5dfd320a-b048-45a8-951e-0268f8931561\") " 
pod="openshift-infra/auto-csr-approver-29563632-cpv84" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.491947 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563632-cpv84" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.935668 4917 scope.go:117] "RemoveContainer" containerID="e8321e394102520dcb88380357df3d2f6b7fb4eebed49b73947cce669ba85bdb" Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.938051 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563632-cpv84"] Mar 18 07:12:00 crc kubenswrapper[4917]: I0318 07:12:00.991392 4917 scope.go:117] "RemoveContainer" containerID="0affa86d9ad40088e59a3f25bf18df604583664b66c3e9037725dd7d8fb8047d" Mar 18 07:12:01 crc kubenswrapper[4917]: I0318 07:12:01.002868 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563632-cpv84" event={"ID":"5dfd320a-b048-45a8-951e-0268f8931561","Type":"ContainerStarted","Data":"ec9f0d4d54807078d2560507d23fff3076d91d1cd327f7cff46d2c56c2bc0c2d"} Mar 18 07:12:01 crc kubenswrapper[4917]: I0318 07:12:01.010609 4917 scope.go:117] "RemoveContainer" containerID="a5c8dcc8b00fb61a998e2de57eda4651c17c337a9d22a5003f5ddff672299ec8" Mar 18 07:12:01 crc kubenswrapper[4917]: I0318 07:12:01.030744 4917 scope.go:117] "RemoveContainer" containerID="804617347f439740e66792317b763d4d4061eab3918a6e260b4ba919d324b619" Mar 18 07:12:01 crc kubenswrapper[4917]: I0318 07:12:01.054981 4917 scope.go:117] "RemoveContainer" containerID="7087316d4bc5f762711dcb6e2fd729e279467784e67e4f93f5ce5dbba379f5dc" Mar 18 07:12:01 crc kubenswrapper[4917]: I0318 07:12:01.070649 4917 scope.go:117] "RemoveContainer" containerID="229ed9b75a6c6ab364d819b8ba387d93d101381df8c8e058d0975c3e44b2ee17" Mar 18 07:12:01 crc kubenswrapper[4917]: I0318 07:12:01.111423 4917 scope.go:117] "RemoveContainer" 
containerID="a6b7dc9e685c315ce6e8cbdd47d2f213c0a46ab7945b7ae81297a06be7d1900b" Mar 18 07:12:01 crc kubenswrapper[4917]: I0318 07:12:01.132890 4917 scope.go:117] "RemoveContainer" containerID="a6b74816997731ee88b4c0a3507aa5ccade9c70b7ede35bfa632b3f05414456c" Mar 18 07:12:01 crc kubenswrapper[4917]: I0318 07:12:01.149796 4917 scope.go:117] "RemoveContainer" containerID="b569ea190b5d3209285e816291e69308cddca6c166c4343e82c025cda3198375" Mar 18 07:12:02 crc kubenswrapper[4917]: I0318 07:12:02.218057 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:12:02 crc kubenswrapper[4917]: I0318 07:12:02.218333 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:12:02 crc kubenswrapper[4917]: I0318 07:12:02.296416 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:12:02 crc kubenswrapper[4917]: I0318 07:12:02.929472 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:12:02 crc kubenswrapper[4917]: I0318 07:12:02.930202 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:12:03 crc kubenswrapper[4917]: I0318 07:12:03.024299 4917 generic.go:334] "Generic (PLEG): container finished" podID="5dfd320a-b048-45a8-951e-0268f8931561" containerID="6cbb41ff5ad3df72c0af3439c73f1059c248c77578c7f84281cb1cee8a64008a" 
exitCode=0 Mar 18 07:12:03 crc kubenswrapper[4917]: I0318 07:12:03.024391 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563632-cpv84" event={"ID":"5dfd320a-b048-45a8-951e-0268f8931561","Type":"ContainerDied","Data":"6cbb41ff5ad3df72c0af3439c73f1059c248c77578c7f84281cb1cee8a64008a"} Mar 18 07:12:03 crc kubenswrapper[4917]: I0318 07:12:03.067381 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:12:03 crc kubenswrapper[4917]: I0318 07:12:03.127761 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6qtc"] Mar 18 07:12:04 crc kubenswrapper[4917]: I0318 07:12:04.363779 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563632-cpv84" Mar 18 07:12:04 crc kubenswrapper[4917]: I0318 07:12:04.509750 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppjfd\" (UniqueName: \"kubernetes.io/projected/5dfd320a-b048-45a8-951e-0268f8931561-kube-api-access-ppjfd\") pod \"5dfd320a-b048-45a8-951e-0268f8931561\" (UID: \"5dfd320a-b048-45a8-951e-0268f8931561\") " Mar 18 07:12:04 crc kubenswrapper[4917]: I0318 07:12:04.518404 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dfd320a-b048-45a8-951e-0268f8931561-kube-api-access-ppjfd" (OuterVolumeSpecName: "kube-api-access-ppjfd") pod "5dfd320a-b048-45a8-951e-0268f8931561" (UID: "5dfd320a-b048-45a8-951e-0268f8931561"). InnerVolumeSpecName "kube-api-access-ppjfd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:12:04 crc kubenswrapper[4917]: I0318 07:12:04.612347 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppjfd\" (UniqueName: \"kubernetes.io/projected/5dfd320a-b048-45a8-951e-0268f8931561-kube-api-access-ppjfd\") on node \"crc\" DevicePath \"\"" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.050043 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563632-cpv84" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.050041 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563632-cpv84" event={"ID":"5dfd320a-b048-45a8-951e-0268f8931561","Type":"ContainerDied","Data":"ec9f0d4d54807078d2560507d23fff3076d91d1cd327f7cff46d2c56c2bc0c2d"} Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.050137 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec9f0d4d54807078d2560507d23fff3076d91d1cd327f7cff46d2c56c2bc0c2d" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.050205 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h6qtc" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerName="registry-server" containerID="cri-o://bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe" gracePeriod=2 Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.447681 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563626-vlkxh"] Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.454532 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563626-vlkxh"] Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.560498 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.726660 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-utilities\") pod \"30105150-84f4-44e0-a05d-34e7eb5fab44\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.726723 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-catalog-content\") pod \"30105150-84f4-44e0-a05d-34e7eb5fab44\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.726815 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqqnh\" (UniqueName: \"kubernetes.io/projected/30105150-84f4-44e0-a05d-34e7eb5fab44-kube-api-access-sqqnh\") pod \"30105150-84f4-44e0-a05d-34e7eb5fab44\" (UID: \"30105150-84f4-44e0-a05d-34e7eb5fab44\") " Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.728618 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-utilities" (OuterVolumeSpecName: "utilities") pod "30105150-84f4-44e0-a05d-34e7eb5fab44" (UID: "30105150-84f4-44e0-a05d-34e7eb5fab44"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.734499 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30105150-84f4-44e0-a05d-34e7eb5fab44-kube-api-access-sqqnh" (OuterVolumeSpecName: "kube-api-access-sqqnh") pod "30105150-84f4-44e0-a05d-34e7eb5fab44" (UID: "30105150-84f4-44e0-a05d-34e7eb5fab44"). InnerVolumeSpecName "kube-api-access-sqqnh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.786837 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1f16f1b-109d-44a6-9ac8-a954e43c41c5" path="/var/lib/kubelet/pods/c1f16f1b-109d-44a6-9ac8-a954e43c41c5/volumes" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.796814 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "30105150-84f4-44e0-a05d-34e7eb5fab44" (UID: "30105150-84f4-44e0-a05d-34e7eb5fab44"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.829522 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.829655 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30105150-84f4-44e0-a05d-34e7eb5fab44-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:12:05 crc kubenswrapper[4917]: I0318 07:12:05.829980 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqqnh\" (UniqueName: \"kubernetes.io/projected/30105150-84f4-44e0-a05d-34e7eb5fab44-kube-api-access-sqqnh\") on node \"crc\" DevicePath \"\"" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.064926 4917 generic.go:334] "Generic (PLEG): container finished" podID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerID="bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe" exitCode=0 Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.064990 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6qtc" 
event={"ID":"30105150-84f4-44e0-a05d-34e7eb5fab44","Type":"ContainerDied","Data":"bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe"} Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.064998 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6qtc" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.065040 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6qtc" event={"ID":"30105150-84f4-44e0-a05d-34e7eb5fab44","Type":"ContainerDied","Data":"3b752e10d2b8ec24b4ee1d9eb4c2bc4ab3b87ea2e97972e61c7ee07c732399aa"} Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.065100 4917 scope.go:117] "RemoveContainer" containerID="bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.092572 4917 scope.go:117] "RemoveContainer" containerID="778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.121207 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6qtc"] Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.132097 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h6qtc"] Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.149437 4917 scope.go:117] "RemoveContainer" containerID="0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.177199 4917 scope.go:117] "RemoveContainer" containerID="bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe" Mar 18 07:12:06 crc kubenswrapper[4917]: E0318 07:12:06.177776 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe\": container 
with ID starting with bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe not found: ID does not exist" containerID="bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.177833 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe"} err="failed to get container status \"bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe\": rpc error: code = NotFound desc = could not find container \"bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe\": container with ID starting with bfce88ae09601329505581cd62f8258806067a69cf6d57d6a8c4beee1e0a3dbe not found: ID does not exist" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.177866 4917 scope.go:117] "RemoveContainer" containerID="778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e" Mar 18 07:12:06 crc kubenswrapper[4917]: E0318 07:12:06.178191 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e\": container with ID starting with 778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e not found: ID does not exist" containerID="778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.178229 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e"} err="failed to get container status \"778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e\": rpc error: code = NotFound desc = could not find container \"778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e\": container with ID starting with 778e83efcee7fbc8c2cc0de75502d5805a4b79372b032939d7ec5795bc964c8e not 
found: ID does not exist" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.178256 4917 scope.go:117] "RemoveContainer" containerID="0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3" Mar 18 07:12:06 crc kubenswrapper[4917]: E0318 07:12:06.178886 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3\": container with ID starting with 0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3 not found: ID does not exist" containerID="0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3" Mar 18 07:12:06 crc kubenswrapper[4917]: I0318 07:12:06.178928 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3"} err="failed to get container status \"0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3\": rpc error: code = NotFound desc = could not find container \"0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3\": container with ID starting with 0318785c83fc54972eaf6b4017e8aa16e9c506ee51a2d78a53690cc147484fb3 not found: ID does not exist" Mar 18 07:12:07 crc kubenswrapper[4917]: I0318 07:12:07.785652 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" path="/var/lib/kubelet/pods/30105150-84f4-44e0-a05d-34e7eb5fab44/volumes" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.893343 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8lgn2"] Mar 18 07:12:20 crc kubenswrapper[4917]: E0318 07:12:20.894525 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerName="extract-content" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.894554 4917 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerName="extract-content" Mar 18 07:12:20 crc kubenswrapper[4917]: E0318 07:12:20.894619 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerName="registry-server" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.894638 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerName="registry-server" Mar 18 07:12:20 crc kubenswrapper[4917]: E0318 07:12:20.894668 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerName="extract-utilities" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.894686 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerName="extract-utilities" Mar 18 07:12:20 crc kubenswrapper[4917]: E0318 07:12:20.894714 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dfd320a-b048-45a8-951e-0268f8931561" containerName="oc" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.894732 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dfd320a-b048-45a8-951e-0268f8931561" containerName="oc" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.895076 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dfd320a-b048-45a8-951e-0268f8931561" containerName="oc" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.895121 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="30105150-84f4-44e0-a05d-34e7eb5fab44" containerName="registry-server" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.898877 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.916442 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lgn2"] Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.982519 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxxdv\" (UniqueName: \"kubernetes.io/projected/c48c5622-c99d-4642-815f-03efbae95815-kube-api-access-cxxdv\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.982616 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-catalog-content\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:20 crc kubenswrapper[4917]: I0318 07:12:20.982656 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-utilities\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:21 crc kubenswrapper[4917]: I0318 07:12:21.083486 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxxdv\" (UniqueName: \"kubernetes.io/projected/c48c5622-c99d-4642-815f-03efbae95815-kube-api-access-cxxdv\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:21 crc kubenswrapper[4917]: I0318 07:12:21.083959 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-catalog-content\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:21 crc kubenswrapper[4917]: I0318 07:12:21.084514 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-catalog-content\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:21 crc kubenswrapper[4917]: I0318 07:12:21.084555 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-utilities\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:21 crc kubenswrapper[4917]: I0318 07:12:21.084855 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-utilities\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:21 crc kubenswrapper[4917]: I0318 07:12:21.118732 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxxdv\" (UniqueName: \"kubernetes.io/projected/c48c5622-c99d-4642-815f-03efbae95815-kube-api-access-cxxdv\") pod \"redhat-marketplace-8lgn2\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:21 crc kubenswrapper[4917]: I0318 07:12:21.229353 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:21 crc kubenswrapper[4917]: I0318 07:12:21.718382 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lgn2"] Mar 18 07:12:22 crc kubenswrapper[4917]: I0318 07:12:22.229356 4917 generic.go:334] "Generic (PLEG): container finished" podID="c48c5622-c99d-4642-815f-03efbae95815" containerID="9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4" exitCode=0 Mar 18 07:12:22 crc kubenswrapper[4917]: I0318 07:12:22.229869 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lgn2" event={"ID":"c48c5622-c99d-4642-815f-03efbae95815","Type":"ContainerDied","Data":"9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4"} Mar 18 07:12:22 crc kubenswrapper[4917]: I0318 07:12:22.229986 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lgn2" event={"ID":"c48c5622-c99d-4642-815f-03efbae95815","Type":"ContainerStarted","Data":"ef82afba060519c55e0e2a3c7e824f01465c13078568052e4102e18311026836"} Mar 18 07:12:23 crc kubenswrapper[4917]: I0318 07:12:23.245282 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lgn2" event={"ID":"c48c5622-c99d-4642-815f-03efbae95815","Type":"ContainerStarted","Data":"79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84"} Mar 18 07:12:24 crc kubenswrapper[4917]: I0318 07:12:24.262427 4917 generic.go:334] "Generic (PLEG): container finished" podID="c48c5622-c99d-4642-815f-03efbae95815" containerID="79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84" exitCode=0 Mar 18 07:12:24 crc kubenswrapper[4917]: I0318 07:12:24.262512 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lgn2" 
event={"ID":"c48c5622-c99d-4642-815f-03efbae95815","Type":"ContainerDied","Data":"79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84"} Mar 18 07:12:25 crc kubenswrapper[4917]: I0318 07:12:25.274362 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lgn2" event={"ID":"c48c5622-c99d-4642-815f-03efbae95815","Type":"ContainerStarted","Data":"bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0"} Mar 18 07:12:25 crc kubenswrapper[4917]: I0318 07:12:25.310507 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8lgn2" podStartSLOduration=2.843972424 podStartE2EDuration="5.310481949s" podCreationTimestamp="2026-03-18 07:12:20 +0000 UTC" firstStartedPulling="2026-03-18 07:12:22.231543232 +0000 UTC m=+1527.172697956" lastFinishedPulling="2026-03-18 07:12:24.698052727 +0000 UTC m=+1529.639207481" observedRunningTime="2026-03-18 07:12:25.299356205 +0000 UTC m=+1530.240510939" watchObservedRunningTime="2026-03-18 07:12:25.310481949 +0000 UTC m=+1530.251636703" Mar 18 07:12:31 crc kubenswrapper[4917]: I0318 07:12:31.229865 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:31 crc kubenswrapper[4917]: I0318 07:12:31.230481 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:31 crc kubenswrapper[4917]: I0318 07:12:31.284636 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:31 crc kubenswrapper[4917]: I0318 07:12:31.417854 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:31 crc kubenswrapper[4917]: I0318 07:12:31.554775 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-8lgn2"] Mar 18 07:12:32 crc kubenswrapper[4917]: I0318 07:12:32.929237 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:12:32 crc kubenswrapper[4917]: I0318 07:12:32.929329 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:12:33 crc kubenswrapper[4917]: I0318 07:12:33.387671 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8lgn2" podUID="c48c5622-c99d-4642-815f-03efbae95815" containerName="registry-server" containerID="cri-o://bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0" gracePeriod=2 Mar 18 07:12:33 crc kubenswrapper[4917]: I0318 07:12:33.979686 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.097094 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxxdv\" (UniqueName: \"kubernetes.io/projected/c48c5622-c99d-4642-815f-03efbae95815-kube-api-access-cxxdv\") pod \"c48c5622-c99d-4642-815f-03efbae95815\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.097143 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-catalog-content\") pod \"c48c5622-c99d-4642-815f-03efbae95815\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.097238 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-utilities\") pod \"c48c5622-c99d-4642-815f-03efbae95815\" (UID: \"c48c5622-c99d-4642-815f-03efbae95815\") " Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.098368 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-utilities" (OuterVolumeSpecName: "utilities") pod "c48c5622-c99d-4642-815f-03efbae95815" (UID: "c48c5622-c99d-4642-815f-03efbae95815"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.103183 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c48c5622-c99d-4642-815f-03efbae95815-kube-api-access-cxxdv" (OuterVolumeSpecName: "kube-api-access-cxxdv") pod "c48c5622-c99d-4642-815f-03efbae95815" (UID: "c48c5622-c99d-4642-815f-03efbae95815"). InnerVolumeSpecName "kube-api-access-cxxdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.127138 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c48c5622-c99d-4642-815f-03efbae95815" (UID: "c48c5622-c99d-4642-815f-03efbae95815"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.199677 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.199776 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxxdv\" (UniqueName: \"kubernetes.io/projected/c48c5622-c99d-4642-815f-03efbae95815-kube-api-access-cxxdv\") on node \"crc\" DevicePath \"\"" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.199840 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c48c5622-c99d-4642-815f-03efbae95815-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.404248 4917 generic.go:334] "Generic (PLEG): container finished" podID="c48c5622-c99d-4642-815f-03efbae95815" containerID="bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0" exitCode=0 Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.404340 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8lgn2" event={"ID":"c48c5622-c99d-4642-815f-03efbae95815","Type":"ContainerDied","Data":"bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0"} Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.404440 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-8lgn2" event={"ID":"c48c5622-c99d-4642-815f-03efbae95815","Type":"ContainerDied","Data":"ef82afba060519c55e0e2a3c7e824f01465c13078568052e4102e18311026836"} Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.404456 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8lgn2" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.404482 4917 scope.go:117] "RemoveContainer" containerID="bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.441281 4917 scope.go:117] "RemoveContainer" containerID="79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.476506 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lgn2"] Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.494758 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8lgn2"] Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.504076 4917 scope.go:117] "RemoveContainer" containerID="9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.529645 4917 scope.go:117] "RemoveContainer" containerID="bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0" Mar 18 07:12:34 crc kubenswrapper[4917]: E0318 07:12:34.531342 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0\": container with ID starting with bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0 not found: ID does not exist" containerID="bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.531399 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0"} err="failed to get container status \"bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0\": rpc error: code = NotFound desc = could not find container \"bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0\": container with ID starting with bbf79886ecaeb1aa080b4027f6157176fa86631a391fe89ed214fc520df755f0 not found: ID does not exist" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.531432 4917 scope.go:117] "RemoveContainer" containerID="79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84" Mar 18 07:12:34 crc kubenswrapper[4917]: E0318 07:12:34.532164 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84\": container with ID starting with 79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84 not found: ID does not exist" containerID="79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.532198 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84"} err="failed to get container status \"79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84\": rpc error: code = NotFound desc = could not find container \"79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84\": container with ID starting with 79945b2d31580931cbb66edcdb5c769bd0f5b13543c6ad685bbfe8cc4220eb84 not found: ID does not exist" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.532337 4917 scope.go:117] "RemoveContainer" containerID="9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4" Mar 18 07:12:34 crc kubenswrapper[4917]: E0318 
07:12:34.532805 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4\": container with ID starting with 9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4 not found: ID does not exist" containerID="9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4" Mar 18 07:12:34 crc kubenswrapper[4917]: I0318 07:12:34.532838 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4"} err="failed to get container status \"9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4\": rpc error: code = NotFound desc = could not find container \"9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4\": container with ID starting with 9ef04b6bb39cbc19290ce1e5a7330dfc13ae0cc00623d56e8c0e7d37f903e4f4 not found: ID does not exist" Mar 18 07:12:35 crc kubenswrapper[4917]: I0318 07:12:35.788536 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c48c5622-c99d-4642-815f-03efbae95815" path="/var/lib/kubelet/pods/c48c5622-c99d-4642-815f-03efbae95815/volumes" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.282656 4917 scope.go:117] "RemoveContainer" containerID="041ca039bbadb98cfa9331d8184c76abf835c6ed1fca960800be969281f8d5cf" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.339672 4917 scope.go:117] "RemoveContainer" containerID="752c81b1b01a876106a6228f6985ce5bc96cbced050904b608ad646b342f8660" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.373408 4917 scope.go:117] "RemoveContainer" containerID="16287dc8eac10673799a08556004c46778aa8c41292bde808ec3d9bab0a363e5" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.404480 4917 scope.go:117] "RemoveContainer" containerID="50901c43b0a391f980b67cc6a8fed1a750a0bb81ba22d72473b2eb1c0bc04435" Mar 18 07:13:01 crc 
kubenswrapper[4917]: I0318 07:13:01.431187 4917 scope.go:117] "RemoveContainer" containerID="d627fc6ae8a31ea609ee3334c418704d6c2c5ee092724d1282be71d8044d788b" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.455108 4917 scope.go:117] "RemoveContainer" containerID="bf71597a3ae9eb5e1990277fba0cd9d1929f0579a69880e7a39dc9d56f71c9e7" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.474158 4917 scope.go:117] "RemoveContainer" containerID="f5a7a9e3add3a8e667ae87dcb116cbb2c525acdb1a13b33992c0b97a900e43cd" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.505501 4917 scope.go:117] "RemoveContainer" containerID="95a4a889f876df802ea135e6c177b8ee43bb49af49b9ca4d3a6a07e00df978be" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.545649 4917 scope.go:117] "RemoveContainer" containerID="2658026f893c6e07fb2028dc78145eb95678a1a39faeea5221b841ebdc338c12" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.581687 4917 scope.go:117] "RemoveContainer" containerID="3d4a6c920818ecac494b79ab66ff2137f371b2371675750c4dca016c56372311" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.603280 4917 scope.go:117] "RemoveContainer" containerID="d3940946653763a23bb02d86f1beb1ca5e1898736d5352253b4ad9d1636ec614" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.649517 4917 scope.go:117] "RemoveContainer" containerID="7ac1a2d7faea106e84f19c26e3b566ab583c3f762d3854030133338c142969fb" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.671934 4917 scope.go:117] "RemoveContainer" containerID="77bd9dbc3e9fe7eaafb31487aa7fefec3a629522eea305344a3a6d2d662a7f94" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.717133 4917 scope.go:117] "RemoveContainer" containerID="11408c301fda5c885632b28cf6089dd76b4a73e7d406e2ed4352533d71d8e7d9" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.745662 4917 scope.go:117] "RemoveContainer" containerID="396f23d5b17e5e096dc0c629319f3621212491bb20b57e78e011422165082e9d" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 
07:13:01.765059 4917 scope.go:117] "RemoveContainer" containerID="86cf9a99fc294f76759213f08ff613566913303572b79c76fb1e75d3604f58f3" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.799386 4917 scope.go:117] "RemoveContainer" containerID="f12fd537a1df880673bc6d68c046c68708eca4a3e38f151f3ad0288bc357683d" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.835601 4917 scope.go:117] "RemoveContainer" containerID="ef99ff0c8bb9a55c09de7e1fee465d5b783cfe8b54162cde0f1b11477802c4a9" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.880315 4917 scope.go:117] "RemoveContainer" containerID="bec7bb8157826d333559c4ee554e54a0b97bbeb1ae1a10fa37e602246aa02d77" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.904539 4917 scope.go:117] "RemoveContainer" containerID="82b032445208a2c8a4834db5814b65f14434c9626d0d60b8ce4d9d30da355c1b" Mar 18 07:13:01 crc kubenswrapper[4917]: I0318 07:13:01.918406 4917 scope.go:117] "RemoveContainer" containerID="42786c541794c50119f11de4d6878499b506c8fd49a01bf9b4c397fce1a43ced" Mar 18 07:13:02 crc kubenswrapper[4917]: I0318 07:13:02.928904 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:13:02 crc kubenswrapper[4917]: I0318 07:13:02.929427 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:13:02 crc kubenswrapper[4917]: I0318 07:13:02.929487 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:13:02 
crc kubenswrapper[4917]: I0318 07:13:02.930240 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dcff2e5f0e4be43f74115b69504e568cfe4f21b6f21979f45d6bebc1faf21caf"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:13:02 crc kubenswrapper[4917]: I0318 07:13:02.930304 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://dcff2e5f0e4be43f74115b69504e568cfe4f21b6f21979f45d6bebc1faf21caf" gracePeriod=600 Mar 18 07:13:03 crc kubenswrapper[4917]: I0318 07:13:03.715493 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="dcff2e5f0e4be43f74115b69504e568cfe4f21b6f21979f45d6bebc1faf21caf" exitCode=0 Mar 18 07:13:03 crc kubenswrapper[4917]: I0318 07:13:03.715537 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"dcff2e5f0e4be43f74115b69504e568cfe4f21b6f21979f45d6bebc1faf21caf"} Mar 18 07:13:03 crc kubenswrapper[4917]: I0318 07:13:03.715848 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4"} Mar 18 07:13:03 crc kubenswrapper[4917]: I0318 07:13:03.715871 4917 scope.go:117] "RemoveContainer" containerID="ed205158cfd88c24a2618c8398681343eec6e1ff531ca763ff821abed75c51f1" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.150690 4917 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563634-87q2d"] Mar 18 07:14:00 crc kubenswrapper[4917]: E0318 07:14:00.154493 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48c5622-c99d-4642-815f-03efbae95815" containerName="extract-content" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.154696 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48c5622-c99d-4642-815f-03efbae95815" containerName="extract-content" Mar 18 07:14:00 crc kubenswrapper[4917]: E0318 07:14:00.154859 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48c5622-c99d-4642-815f-03efbae95815" containerName="registry-server" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.154980 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48c5622-c99d-4642-815f-03efbae95815" containerName="registry-server" Mar 18 07:14:00 crc kubenswrapper[4917]: E0318 07:14:00.155144 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c48c5622-c99d-4642-815f-03efbae95815" containerName="extract-utilities" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.155266 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c48c5622-c99d-4642-815f-03efbae95815" containerName="extract-utilities" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.156028 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c48c5622-c99d-4642-815f-03efbae95815" containerName="registry-server" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.157034 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563634-87q2d" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.160807 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.161106 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.161388 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.162423 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563634-87q2d"] Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.250723 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wsk5\" (UniqueName: \"kubernetes.io/projected/e589dd70-066e-4ade-9042-78bef37bba3f-kube-api-access-9wsk5\") pod \"auto-csr-approver-29563634-87q2d\" (UID: \"e589dd70-066e-4ade-9042-78bef37bba3f\") " pod="openshift-infra/auto-csr-approver-29563634-87q2d" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.352990 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wsk5\" (UniqueName: \"kubernetes.io/projected/e589dd70-066e-4ade-9042-78bef37bba3f-kube-api-access-9wsk5\") pod \"auto-csr-approver-29563634-87q2d\" (UID: \"e589dd70-066e-4ade-9042-78bef37bba3f\") " pod="openshift-infra/auto-csr-approver-29563634-87q2d" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.389789 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wsk5\" (UniqueName: \"kubernetes.io/projected/e589dd70-066e-4ade-9042-78bef37bba3f-kube-api-access-9wsk5\") pod \"auto-csr-approver-29563634-87q2d\" (UID: \"e589dd70-066e-4ade-9042-78bef37bba3f\") " 
pod="openshift-infra/auto-csr-approver-29563634-87q2d" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.479554 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563634-87q2d" Mar 18 07:14:00 crc kubenswrapper[4917]: I0318 07:14:00.957551 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563634-87q2d"] Mar 18 07:14:01 crc kubenswrapper[4917]: I0318 07:14:01.287983 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563634-87q2d" event={"ID":"e589dd70-066e-4ade-9042-78bef37bba3f","Type":"ContainerStarted","Data":"975bca8e342dedf7902143cd3f383eefcfa4f48166760ee2d274086c849e01e5"} Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.264194 4917 scope.go:117] "RemoveContainer" containerID="34cb7d14ec735ff0e328ee9c042c8e5f4b080009d5c97efbb47af3b0fca71c21" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.290001 4917 scope.go:117] "RemoveContainer" containerID="9a35ebbfeb0fef433b98cf0eda6e6f539c140c481a83d869a292012ac839622b" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.335196 4917 scope.go:117] "RemoveContainer" containerID="259eaeef31695e4a95c6ead3dec4e94a3b00c523ac3406e2f36ae99dc209ff02" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.361887 4917 scope.go:117] "RemoveContainer" containerID="46dba83aa670356fa4004906b5fa0253b13f834d3150c8b025c8282d92229617" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.397460 4917 scope.go:117] "RemoveContainer" containerID="ffdaf1affb8cdd37abdd624f362cd39b00ca3fc290879d1fa10e21ae85d99228" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.417783 4917 scope.go:117] "RemoveContainer" containerID="d6e8064caadde781912b56059dd14c3925340d5dcf0edfb08f0a245a4f8a6412" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.445369 4917 scope.go:117] "RemoveContainer" 
containerID="207b783340dbb078486c8bbf78cc7f1b5f6a610500418d725d472abd712d7974" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.468753 4917 scope.go:117] "RemoveContainer" containerID="d0540dd64f9371f312f25eb071dea28339a516150812d0bc566670670bc1cd75" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.511480 4917 scope.go:117] "RemoveContainer" containerID="e7762f0ffafe3ca8573f7fd63a2d74e00c6e853e9b80b754389deaea5ffccc20" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.527960 4917 scope.go:117] "RemoveContainer" containerID="9b936e59ea438fb9339e8163868ab0d80ef3aa21be40b1d3519fc252bb7cee7c" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.559803 4917 scope.go:117] "RemoveContainer" containerID="b7dd7e7dcd0ead924d5151b493b692b8cda6c133769ce8b19017ed20a1493c30" Mar 18 07:14:02 crc kubenswrapper[4917]: I0318 07:14:02.587623 4917 scope.go:117] "RemoveContainer" containerID="245ba55e672fdb718ef99d4f553f1eb46be5e64e2ad362be6d8902aa8556ed41" Mar 18 07:14:03 crc kubenswrapper[4917]: I0318 07:14:03.313572 4917 generic.go:334] "Generic (PLEG): container finished" podID="e589dd70-066e-4ade-9042-78bef37bba3f" containerID="d25c9d6395268fe5eeac63114c4b35806b45e14b2c6f9aee8a8b0e487fbb528d" exitCode=0 Mar 18 07:14:03 crc kubenswrapper[4917]: I0318 07:14:03.313693 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563634-87q2d" event={"ID":"e589dd70-066e-4ade-9042-78bef37bba3f","Type":"ContainerDied","Data":"d25c9d6395268fe5eeac63114c4b35806b45e14b2c6f9aee8a8b0e487fbb528d"} Mar 18 07:14:04 crc kubenswrapper[4917]: I0318 07:14:04.591297 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563634-87q2d" Mar 18 07:14:04 crc kubenswrapper[4917]: I0318 07:14:04.718543 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wsk5\" (UniqueName: \"kubernetes.io/projected/e589dd70-066e-4ade-9042-78bef37bba3f-kube-api-access-9wsk5\") pod \"e589dd70-066e-4ade-9042-78bef37bba3f\" (UID: \"e589dd70-066e-4ade-9042-78bef37bba3f\") " Mar 18 07:14:04 crc kubenswrapper[4917]: I0318 07:14:04.726885 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e589dd70-066e-4ade-9042-78bef37bba3f-kube-api-access-9wsk5" (OuterVolumeSpecName: "kube-api-access-9wsk5") pod "e589dd70-066e-4ade-9042-78bef37bba3f" (UID: "e589dd70-066e-4ade-9042-78bef37bba3f"). InnerVolumeSpecName "kube-api-access-9wsk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:14:04 crc kubenswrapper[4917]: I0318 07:14:04.820446 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wsk5\" (UniqueName: \"kubernetes.io/projected/e589dd70-066e-4ade-9042-78bef37bba3f-kube-api-access-9wsk5\") on node \"crc\" DevicePath \"\"" Mar 18 07:14:05 crc kubenswrapper[4917]: I0318 07:14:05.327680 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563634-87q2d" event={"ID":"e589dd70-066e-4ade-9042-78bef37bba3f","Type":"ContainerDied","Data":"975bca8e342dedf7902143cd3f383eefcfa4f48166760ee2d274086c849e01e5"} Mar 18 07:14:05 crc kubenswrapper[4917]: I0318 07:14:05.327727 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="975bca8e342dedf7902143cd3f383eefcfa4f48166760ee2d274086c849e01e5" Mar 18 07:14:05 crc kubenswrapper[4917]: I0318 07:14:05.327817 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563634-87q2d" Mar 18 07:14:05 crc kubenswrapper[4917]: I0318 07:14:05.663841 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563628-44zvp"] Mar 18 07:14:05 crc kubenswrapper[4917]: I0318 07:14:05.670382 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563628-44zvp"] Mar 18 07:14:05 crc kubenswrapper[4917]: I0318 07:14:05.781701 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87c2b17b-d754-406a-98e6-466bf76cf5a0" path="/var/lib/kubelet/pods/87c2b17b-d754-406a-98e6-466bf76cf5a0/volumes" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.172981 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv"] Mar 18 07:15:00 crc kubenswrapper[4917]: E0318 07:15:00.174441 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e589dd70-066e-4ade-9042-78bef37bba3f" containerName="oc" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.174462 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e589dd70-066e-4ade-9042-78bef37bba3f" containerName="oc" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.174684 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e589dd70-066e-4ade-9042-78bef37bba3f" containerName="oc" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.175563 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.177766 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.177813 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.186648 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv"] Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.367213 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e92583-8234-44fb-9d55-eedaaaf5bc93-config-volume\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.367534 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42mkv\" (UniqueName: \"kubernetes.io/projected/45e92583-8234-44fb-9d55-eedaaaf5bc93-kube-api-access-42mkv\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.367704 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e92583-8234-44fb-9d55-eedaaaf5bc93-secret-volume\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.469342 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e92583-8234-44fb-9d55-eedaaaf5bc93-secret-volume\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.469401 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e92583-8234-44fb-9d55-eedaaaf5bc93-config-volume\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.469459 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42mkv\" (UniqueName: \"kubernetes.io/projected/45e92583-8234-44fb-9d55-eedaaaf5bc93-kube-api-access-42mkv\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.472150 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e92583-8234-44fb-9d55-eedaaaf5bc93-config-volume\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.483806 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/45e92583-8234-44fb-9d55-eedaaaf5bc93-secret-volume\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.500543 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42mkv\" (UniqueName: \"kubernetes.io/projected/45e92583-8234-44fb-9d55-eedaaaf5bc93-kube-api-access-42mkv\") pod \"collect-profiles-29563635-gspnv\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.511860 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.745089 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv"] Mar 18 07:15:00 crc kubenswrapper[4917]: I0318 07:15:00.815248 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" event={"ID":"45e92583-8234-44fb-9d55-eedaaaf5bc93","Type":"ContainerStarted","Data":"622a0f47cff393946392a2c5884d664138a7bb460c6351dc63b98c2c86bc6c8c"} Mar 18 07:15:01 crc kubenswrapper[4917]: I0318 07:15:01.822913 4917 generic.go:334] "Generic (PLEG): container finished" podID="45e92583-8234-44fb-9d55-eedaaaf5bc93" containerID="22270db9537b0b78ef3ed19d802b4a00b2fda30aee694f1a251e366cf3bab844" exitCode=0 Mar 18 07:15:01 crc kubenswrapper[4917]: I0318 07:15:01.823005 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" 
event={"ID":"45e92583-8234-44fb-9d55-eedaaaf5bc93","Type":"ContainerDied","Data":"22270db9537b0b78ef3ed19d802b4a00b2fda30aee694f1a251e366cf3bab844"} Mar 18 07:15:02 crc kubenswrapper[4917]: I0318 07:15:02.695434 4917 scope.go:117] "RemoveContainer" containerID="fac743df19fc27567027fac378bd664127918b928ea3a9d066d90306e3e5e18a" Mar 18 07:15:02 crc kubenswrapper[4917]: I0318 07:15:02.723447 4917 scope.go:117] "RemoveContainer" containerID="2cbe01ea23a91faf50474f6bd4a4594d76a5f55fad33ee844a2f70a508be810a" Mar 18 07:15:02 crc kubenswrapper[4917]: I0318 07:15:02.788313 4917 scope.go:117] "RemoveContainer" containerID="8e200a7db258824eecca5378a51a1a017254e035828c262f2cbb5e58066808d0" Mar 18 07:15:02 crc kubenswrapper[4917]: I0318 07:15:02.834785 4917 scope.go:117] "RemoveContainer" containerID="5654a91cf8a897ae0481e23ff9270d8a66879056e36416c6479086df34c2176d" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.132193 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.309967 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e92583-8234-44fb-9d55-eedaaaf5bc93-config-volume\") pod \"45e92583-8234-44fb-9d55-eedaaaf5bc93\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.310927 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e92583-8234-44fb-9d55-eedaaaf5bc93-secret-volume\") pod \"45e92583-8234-44fb-9d55-eedaaaf5bc93\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.311119 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42mkv\" (UniqueName: 
\"kubernetes.io/projected/45e92583-8234-44fb-9d55-eedaaaf5bc93-kube-api-access-42mkv\") pod \"45e92583-8234-44fb-9d55-eedaaaf5bc93\" (UID: \"45e92583-8234-44fb-9d55-eedaaaf5bc93\") " Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.311467 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e92583-8234-44fb-9d55-eedaaaf5bc93-config-volume" (OuterVolumeSpecName: "config-volume") pod "45e92583-8234-44fb-9d55-eedaaaf5bc93" (UID: "45e92583-8234-44fb-9d55-eedaaaf5bc93"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.311821 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45e92583-8234-44fb-9d55-eedaaaf5bc93-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.317049 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e92583-8234-44fb-9d55-eedaaaf5bc93-kube-api-access-42mkv" (OuterVolumeSpecName: "kube-api-access-42mkv") pod "45e92583-8234-44fb-9d55-eedaaaf5bc93" (UID: "45e92583-8234-44fb-9d55-eedaaaf5bc93"). InnerVolumeSpecName "kube-api-access-42mkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.317404 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e92583-8234-44fb-9d55-eedaaaf5bc93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "45e92583-8234-44fb-9d55-eedaaaf5bc93" (UID: "45e92583-8234-44fb-9d55-eedaaaf5bc93"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.413490 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42mkv\" (UniqueName: \"kubernetes.io/projected/45e92583-8234-44fb-9d55-eedaaaf5bc93-kube-api-access-42mkv\") on node \"crc\" DevicePath \"\"" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.413561 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45e92583-8234-44fb-9d55-eedaaaf5bc93-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.845762 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" event={"ID":"45e92583-8234-44fb-9d55-eedaaaf5bc93","Type":"ContainerDied","Data":"622a0f47cff393946392a2c5884d664138a7bb460c6351dc63b98c2c86bc6c8c"} Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.845843 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv" Mar 18 07:15:03 crc kubenswrapper[4917]: I0318 07:15:03.845852 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="622a0f47cff393946392a2c5884d664138a7bb460c6351dc63b98c2c86bc6c8c" Mar 18 07:15:32 crc kubenswrapper[4917]: I0318 07:15:32.929210 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:15:32 crc kubenswrapper[4917]: I0318 07:15:32.930844 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.151464 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563636-zjtcd"] Mar 18 07:16:00 crc kubenswrapper[4917]: E0318 07:16:00.152401 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e92583-8234-44fb-9d55-eedaaaf5bc93" containerName="collect-profiles" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.152418 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e92583-8234-44fb-9d55-eedaaaf5bc93" containerName="collect-profiles" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.152624 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e92583-8234-44fb-9d55-eedaaaf5bc93" containerName="collect-profiles" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.153181 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563636-zjtcd" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.155918 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.155961 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.156647 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.163525 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563636-zjtcd"] Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.276738 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2tqr\" (UniqueName: \"kubernetes.io/projected/1e9e9790-2203-473b-a66a-d101887d5a81-kube-api-access-n2tqr\") pod \"auto-csr-approver-29563636-zjtcd\" (UID: \"1e9e9790-2203-473b-a66a-d101887d5a81\") " pod="openshift-infra/auto-csr-approver-29563636-zjtcd" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.378821 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2tqr\" (UniqueName: \"kubernetes.io/projected/1e9e9790-2203-473b-a66a-d101887d5a81-kube-api-access-n2tqr\") pod \"auto-csr-approver-29563636-zjtcd\" (UID: \"1e9e9790-2203-473b-a66a-d101887d5a81\") " pod="openshift-infra/auto-csr-approver-29563636-zjtcd" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.402025 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2tqr\" (UniqueName: \"kubernetes.io/projected/1e9e9790-2203-473b-a66a-d101887d5a81-kube-api-access-n2tqr\") pod \"auto-csr-approver-29563636-zjtcd\" (UID: \"1e9e9790-2203-473b-a66a-d101887d5a81\") " 
pod="openshift-infra/auto-csr-approver-29563636-zjtcd" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.472150 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563636-zjtcd" Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.940860 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563636-zjtcd"] Mar 18 07:16:00 crc kubenswrapper[4917]: I0318 07:16:00.952663 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:16:01 crc kubenswrapper[4917]: I0318 07:16:01.345704 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563636-zjtcd" event={"ID":"1e9e9790-2203-473b-a66a-d101887d5a81","Type":"ContainerStarted","Data":"ef8ad9f0cd3ffc81ce93930307ae94e98ec667ff4d5eac4779d24861f9c8fcb9"} Mar 18 07:16:02 crc kubenswrapper[4917]: I0318 07:16:02.929195 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:16:02 crc kubenswrapper[4917]: I0318 07:16:02.929699 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:16:02 crc kubenswrapper[4917]: I0318 07:16:02.960178 4917 scope.go:117] "RemoveContainer" containerID="ea146ca86a61e4a78f1a0bb0d88a01ef5621fff4b0ee2505b6b7913116f8dc17" Mar 18 07:16:03 crc kubenswrapper[4917]: I0318 07:16:03.363132 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="1e9e9790-2203-473b-a66a-d101887d5a81" containerID="1fb0855b08c2c5b380cd3ec7be98072701122f563ef5aa88de138ec8ec544fe8" exitCode=0 Mar 18 07:16:03 crc kubenswrapper[4917]: I0318 07:16:03.363198 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563636-zjtcd" event={"ID":"1e9e9790-2203-473b-a66a-d101887d5a81","Type":"ContainerDied","Data":"1fb0855b08c2c5b380cd3ec7be98072701122f563ef5aa88de138ec8ec544fe8"} Mar 18 07:16:04 crc kubenswrapper[4917]: I0318 07:16:04.671909 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563636-zjtcd" Mar 18 07:16:04 crc kubenswrapper[4917]: I0318 07:16:04.743343 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2tqr\" (UniqueName: \"kubernetes.io/projected/1e9e9790-2203-473b-a66a-d101887d5a81-kube-api-access-n2tqr\") pod \"1e9e9790-2203-473b-a66a-d101887d5a81\" (UID: \"1e9e9790-2203-473b-a66a-d101887d5a81\") " Mar 18 07:16:04 crc kubenswrapper[4917]: I0318 07:16:04.751962 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9e9790-2203-473b-a66a-d101887d5a81-kube-api-access-n2tqr" (OuterVolumeSpecName: "kube-api-access-n2tqr") pod "1e9e9790-2203-473b-a66a-d101887d5a81" (UID: "1e9e9790-2203-473b-a66a-d101887d5a81"). InnerVolumeSpecName "kube-api-access-n2tqr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:16:04 crc kubenswrapper[4917]: I0318 07:16:04.844623 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2tqr\" (UniqueName: \"kubernetes.io/projected/1e9e9790-2203-473b-a66a-d101887d5a81-kube-api-access-n2tqr\") on node \"crc\" DevicePath \"\"" Mar 18 07:16:05 crc kubenswrapper[4917]: I0318 07:16:05.380325 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563636-zjtcd" event={"ID":"1e9e9790-2203-473b-a66a-d101887d5a81","Type":"ContainerDied","Data":"ef8ad9f0cd3ffc81ce93930307ae94e98ec667ff4d5eac4779d24861f9c8fcb9"} Mar 18 07:16:05 crc kubenswrapper[4917]: I0318 07:16:05.380391 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef8ad9f0cd3ffc81ce93930307ae94e98ec667ff4d5eac4779d24861f9c8fcb9" Mar 18 07:16:05 crc kubenswrapper[4917]: I0318 07:16:05.380469 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563636-zjtcd" Mar 18 07:16:05 crc kubenswrapper[4917]: I0318 07:16:05.749814 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563630-lm6sk"] Mar 18 07:16:05 crc kubenswrapper[4917]: I0318 07:16:05.756870 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563630-lm6sk"] Mar 18 07:16:05 crc kubenswrapper[4917]: I0318 07:16:05.781471 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50ce1300-1bea-4223-84b2-e2ca681962a6" path="/var/lib/kubelet/pods/50ce1300-1bea-4223-84b2-e2ca681962a6/volumes" Mar 18 07:16:32 crc kubenswrapper[4917]: I0318 07:16:32.928547 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 07:16:32 crc kubenswrapper[4917]: I0318 07:16:32.929285 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:16:32 crc kubenswrapper[4917]: I0318 07:16:32.929349 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:16:32 crc kubenswrapper[4917]: I0318 07:16:32.930133 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:16:32 crc kubenswrapper[4917]: I0318 07:16:32.930221 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" gracePeriod=600 Mar 18 07:16:33 crc kubenswrapper[4917]: E0318 07:16:33.072096 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:16:33 crc kubenswrapper[4917]: 
I0318 07:16:33.666363 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" exitCode=0 Mar 18 07:16:33 crc kubenswrapper[4917]: I0318 07:16:33.666415 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4"} Mar 18 07:16:33 crc kubenswrapper[4917]: I0318 07:16:33.666503 4917 scope.go:117] "RemoveContainer" containerID="dcff2e5f0e4be43f74115b69504e568cfe4f21b6f21979f45d6bebc1faf21caf" Mar 18 07:16:33 crc kubenswrapper[4917]: I0318 07:16:33.667251 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:16:33 crc kubenswrapper[4917]: E0318 07:16:33.667669 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:16:46 crc kubenswrapper[4917]: I0318 07:16:46.773019 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:16:46 crc kubenswrapper[4917]: E0318 07:16:46.775062 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:17:00 crc kubenswrapper[4917]: I0318 07:17:00.772546 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:17:00 crc kubenswrapper[4917]: E0318 07:17:00.773698 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:17:03 crc kubenswrapper[4917]: I0318 07:17:03.035489 4917 scope.go:117] "RemoveContainer" containerID="c962aa87c069585684597428a945085dba9c26dd06ff54afb7f098c0ddd1c7b9" Mar 18 07:17:12 crc kubenswrapper[4917]: I0318 07:17:12.773024 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:17:12 crc kubenswrapper[4917]: E0318 07:17:12.774066 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:17:24 crc kubenswrapper[4917]: I0318 07:17:24.774726 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:17:24 crc kubenswrapper[4917]: E0318 07:17:24.775814 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:17:36 crc kubenswrapper[4917]: I0318 07:17:36.772874 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:17:36 crc kubenswrapper[4917]: E0318 07:17:36.773912 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:17:48 crc kubenswrapper[4917]: I0318 07:17:48.773037 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:17:48 crc kubenswrapper[4917]: E0318 07:17:48.774163 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.158145 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563638-6qdf5"] Mar 18 07:18:00 crc kubenswrapper[4917]: E0318 07:18:00.159042 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9e9790-2203-473b-a66a-d101887d5a81" 
containerName="oc" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.159058 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9e9790-2203-473b-a66a-d101887d5a81" containerName="oc" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.159302 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9e9790-2203-473b-a66a-d101887d5a81" containerName="oc" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.160043 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563638-6qdf5" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.162480 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.163103 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.163454 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.172744 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563638-6qdf5"] Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.261250 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9f8\" (UniqueName: \"kubernetes.io/projected/2a894cd3-9073-473b-8295-364683bae145-kube-api-access-rr9f8\") pod \"auto-csr-approver-29563638-6qdf5\" (UID: \"2a894cd3-9073-473b-8295-364683bae145\") " pod="openshift-infra/auto-csr-approver-29563638-6qdf5" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.362373 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9f8\" (UniqueName: \"kubernetes.io/projected/2a894cd3-9073-473b-8295-364683bae145-kube-api-access-rr9f8\") pod 
\"auto-csr-approver-29563638-6qdf5\" (UID: \"2a894cd3-9073-473b-8295-364683bae145\") " pod="openshift-infra/auto-csr-approver-29563638-6qdf5" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.393816 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9f8\" (UniqueName: \"kubernetes.io/projected/2a894cd3-9073-473b-8295-364683bae145-kube-api-access-rr9f8\") pod \"auto-csr-approver-29563638-6qdf5\" (UID: \"2a894cd3-9073-473b-8295-364683bae145\") " pod="openshift-infra/auto-csr-approver-29563638-6qdf5" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.489881 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563638-6qdf5" Mar 18 07:18:00 crc kubenswrapper[4917]: I0318 07:18:00.813188 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563638-6qdf5"] Mar 18 07:18:01 crc kubenswrapper[4917]: I0318 07:18:01.515647 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563638-6qdf5" event={"ID":"2a894cd3-9073-473b-8295-364683bae145","Type":"ContainerStarted","Data":"be5c7970e2edffd2d8181610c5a25c7208da79b5194a1940edcb3399c130d4a9"} Mar 18 07:18:02 crc kubenswrapper[4917]: I0318 07:18:02.531225 4917 generic.go:334] "Generic (PLEG): container finished" podID="2a894cd3-9073-473b-8295-364683bae145" containerID="a949f014f7fad32d98dced8f464f959772d56d06f45109469d7215f67391273e" exitCode=0 Mar 18 07:18:02 crc kubenswrapper[4917]: I0318 07:18:02.531346 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563638-6qdf5" event={"ID":"2a894cd3-9073-473b-8295-364683bae145","Type":"ContainerDied","Data":"a949f014f7fad32d98dced8f464f959772d56d06f45109469d7215f67391273e"} Mar 18 07:18:02 crc kubenswrapper[4917]: I0318 07:18:02.773672 4917 scope.go:117] "RemoveContainer" 
containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:18:02 crc kubenswrapper[4917]: E0318 07:18:02.774088 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:18:03 crc kubenswrapper[4917]: I0318 07:18:03.925171 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563638-6qdf5" Mar 18 07:18:04 crc kubenswrapper[4917]: I0318 07:18:04.021417 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr9f8\" (UniqueName: \"kubernetes.io/projected/2a894cd3-9073-473b-8295-364683bae145-kube-api-access-rr9f8\") pod \"2a894cd3-9073-473b-8295-364683bae145\" (UID: \"2a894cd3-9073-473b-8295-364683bae145\") " Mar 18 07:18:04 crc kubenswrapper[4917]: I0318 07:18:04.030024 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a894cd3-9073-473b-8295-364683bae145-kube-api-access-rr9f8" (OuterVolumeSpecName: "kube-api-access-rr9f8") pod "2a894cd3-9073-473b-8295-364683bae145" (UID: "2a894cd3-9073-473b-8295-364683bae145"). InnerVolumeSpecName "kube-api-access-rr9f8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:18:04 crc kubenswrapper[4917]: I0318 07:18:04.124080 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr9f8\" (UniqueName: \"kubernetes.io/projected/2a894cd3-9073-473b-8295-364683bae145-kube-api-access-rr9f8\") on node \"crc\" DevicePath \"\"" Mar 18 07:18:04 crc kubenswrapper[4917]: I0318 07:18:04.550447 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563638-6qdf5" event={"ID":"2a894cd3-9073-473b-8295-364683bae145","Type":"ContainerDied","Data":"be5c7970e2edffd2d8181610c5a25c7208da79b5194a1940edcb3399c130d4a9"} Mar 18 07:18:04 crc kubenswrapper[4917]: I0318 07:18:04.550766 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be5c7970e2edffd2d8181610c5a25c7208da79b5194a1940edcb3399c130d4a9" Mar 18 07:18:04 crc kubenswrapper[4917]: I0318 07:18:04.550575 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563638-6qdf5" Mar 18 07:18:05 crc kubenswrapper[4917]: I0318 07:18:05.006278 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563632-cpv84"] Mar 18 07:18:05 crc kubenswrapper[4917]: I0318 07:18:05.013066 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563632-cpv84"] Mar 18 07:18:05 crc kubenswrapper[4917]: I0318 07:18:05.786907 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dfd320a-b048-45a8-951e-0268f8931561" path="/var/lib/kubelet/pods/5dfd320a-b048-45a8-951e-0268f8931561/volumes" Mar 18 07:18:14 crc kubenswrapper[4917]: I0318 07:18:14.773049 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:18:14 crc kubenswrapper[4917]: E0318 07:18:14.773708 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:18:28 crc kubenswrapper[4917]: I0318 07:18:28.772957 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:18:28 crc kubenswrapper[4917]: E0318 07:18:28.773603 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:18:41 crc kubenswrapper[4917]: I0318 07:18:41.773382 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:18:41 crc kubenswrapper[4917]: E0318 07:18:41.774422 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:18:54 crc kubenswrapper[4917]: I0318 07:18:54.774021 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:18:54 crc kubenswrapper[4917]: E0318 07:18:54.775157 4917 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:19:03 crc kubenswrapper[4917]: I0318 07:19:03.141084 4917 scope.go:117] "RemoveContainer" containerID="6cbb41ff5ad3df72c0af3439c73f1059c248c77578c7f84281cb1cee8a64008a" Mar 18 07:19:07 crc kubenswrapper[4917]: I0318 07:19:07.773323 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:19:07 crc kubenswrapper[4917]: E0318 07:19:07.774370 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:19:21 crc kubenswrapper[4917]: I0318 07:19:21.773063 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:19:21 crc kubenswrapper[4917]: E0318 07:19:21.774082 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:19:36 crc kubenswrapper[4917]: I0318 07:19:36.774148 4917 scope.go:117] 
"RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:19:36 crc kubenswrapper[4917]: E0318 07:19:36.775468 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:19:50 crc kubenswrapper[4917]: I0318 07:19:50.773274 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:19:50 crc kubenswrapper[4917]: E0318 07:19:50.774256 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.456006 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nghf6"] Mar 18 07:19:54 crc kubenswrapper[4917]: E0318 07:19:54.456746 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a894cd3-9073-473b-8295-364683bae145" containerName="oc" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.456763 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a894cd3-9073-473b-8295-364683bae145" containerName="oc" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.456987 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a894cd3-9073-473b-8295-364683bae145" containerName="oc" Mar 18 
07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.458261 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.476836 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nghf6"] Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.538160 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-catalog-content\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.538232 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-utilities\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.538321 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grpcd\" (UniqueName: \"kubernetes.io/projected/bc82b574-4d3e-4c81-9426-64c236a4d2a2-kube-api-access-grpcd\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.639372 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-catalog-content\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " 
pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.639669 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-utilities\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.639780 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grpcd\" (UniqueName: \"kubernetes.io/projected/bc82b574-4d3e-4c81-9426-64c236a4d2a2-kube-api-access-grpcd\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.640055 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-catalog-content\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.640093 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-utilities\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.667693 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grpcd\" (UniqueName: \"kubernetes.io/projected/bc82b574-4d3e-4c81-9426-64c236a4d2a2-kube-api-access-grpcd\") pod \"certified-operators-nghf6\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " 
pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:54 crc kubenswrapper[4917]: I0318 07:19:54.796752 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:19:55 crc kubenswrapper[4917]: I0318 07:19:55.089376 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nghf6"] Mar 18 07:19:55 crc kubenswrapper[4917]: I0318 07:19:55.588902 4917 generic.go:334] "Generic (PLEG): container finished" podID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerID="bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60" exitCode=0 Mar 18 07:19:55 crc kubenswrapper[4917]: I0318 07:19:55.588981 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nghf6" event={"ID":"bc82b574-4d3e-4c81-9426-64c236a4d2a2","Type":"ContainerDied","Data":"bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60"} Mar 18 07:19:55 crc kubenswrapper[4917]: I0318 07:19:55.589028 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nghf6" event={"ID":"bc82b574-4d3e-4c81-9426-64c236a4d2a2","Type":"ContainerStarted","Data":"3100954f849bb59c788d4cbadd2360128139f93d6a1bcff947ff8fe67abf9bb1"} Mar 18 07:19:56 crc kubenswrapper[4917]: I0318 07:19:56.611822 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nghf6" event={"ID":"bc82b574-4d3e-4c81-9426-64c236a4d2a2","Type":"ContainerStarted","Data":"4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835"} Mar 18 07:19:57 crc kubenswrapper[4917]: I0318 07:19:57.625232 4917 generic.go:334] "Generic (PLEG): container finished" podID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerID="4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835" exitCode=0 Mar 18 07:19:57 crc kubenswrapper[4917]: I0318 07:19:57.625375 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-nghf6" event={"ID":"bc82b574-4d3e-4c81-9426-64c236a4d2a2","Type":"ContainerDied","Data":"4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835"} Mar 18 07:19:58 crc kubenswrapper[4917]: I0318 07:19:58.637015 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nghf6" event={"ID":"bc82b574-4d3e-4c81-9426-64c236a4d2a2","Type":"ContainerStarted","Data":"bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f"} Mar 18 07:19:58 crc kubenswrapper[4917]: I0318 07:19:58.669221 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nghf6" podStartSLOduration=2.23281382 podStartE2EDuration="4.669204054s" podCreationTimestamp="2026-03-18 07:19:54 +0000 UTC" firstStartedPulling="2026-03-18 07:19:55.592217432 +0000 UTC m=+1980.533372146" lastFinishedPulling="2026-03-18 07:19:58.028607666 +0000 UTC m=+1982.969762380" observedRunningTime="2026-03-18 07:19:58.66655071 +0000 UTC m=+1983.607705434" watchObservedRunningTime="2026-03-18 07:19:58.669204054 +0000 UTC m=+1983.610358768" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.140208 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563640-bxcc2"] Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.141903 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563640-bxcc2" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.145156 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.145401 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.148919 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.158187 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563640-bxcc2"] Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.224872 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8tz\" (UniqueName: \"kubernetes.io/projected/88900076-a48e-4175-ac37-454811fcf9d5-kube-api-access-mw8tz\") pod \"auto-csr-approver-29563640-bxcc2\" (UID: \"88900076-a48e-4175-ac37-454811fcf9d5\") " pod="openshift-infra/auto-csr-approver-29563640-bxcc2" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.326560 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw8tz\" (UniqueName: \"kubernetes.io/projected/88900076-a48e-4175-ac37-454811fcf9d5-kube-api-access-mw8tz\") pod \"auto-csr-approver-29563640-bxcc2\" (UID: \"88900076-a48e-4175-ac37-454811fcf9d5\") " pod="openshift-infra/auto-csr-approver-29563640-bxcc2" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.350111 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw8tz\" (UniqueName: \"kubernetes.io/projected/88900076-a48e-4175-ac37-454811fcf9d5-kube-api-access-mw8tz\") pod \"auto-csr-approver-29563640-bxcc2\" (UID: \"88900076-a48e-4175-ac37-454811fcf9d5\") " 
pod="openshift-infra/auto-csr-approver-29563640-bxcc2" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.464013 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563640-bxcc2" Mar 18 07:20:00 crc kubenswrapper[4917]: I0318 07:20:00.919152 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563640-bxcc2"] Mar 18 07:20:00 crc kubenswrapper[4917]: W0318 07:20:00.921844 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod88900076_a48e_4175_ac37_454811fcf9d5.slice/crio-404e175b84e234692a87aec9ec7b818a5fdb3b3b2d69555522c46f36d2d0e410 WatchSource:0}: Error finding container 404e175b84e234692a87aec9ec7b818a5fdb3b3b2d69555522c46f36d2d0e410: Status 404 returned error can't find the container with id 404e175b84e234692a87aec9ec7b818a5fdb3b3b2d69555522c46f36d2d0e410 Mar 18 07:20:01 crc kubenswrapper[4917]: I0318 07:20:01.659060 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563640-bxcc2" event={"ID":"88900076-a48e-4175-ac37-454811fcf9d5","Type":"ContainerStarted","Data":"404e175b84e234692a87aec9ec7b818a5fdb3b3b2d69555522c46f36d2d0e410"} Mar 18 07:20:03 crc kubenswrapper[4917]: I0318 07:20:03.680338 4917 generic.go:334] "Generic (PLEG): container finished" podID="88900076-a48e-4175-ac37-454811fcf9d5" containerID="9b22930d56c4fc93b9b6ec793fd51cadda55b8b0a69e89bc8596b2ca30ae0bc3" exitCode=0 Mar 18 07:20:03 crc kubenswrapper[4917]: I0318 07:20:03.680435 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563640-bxcc2" event={"ID":"88900076-a48e-4175-ac37-454811fcf9d5","Type":"ContainerDied","Data":"9b22930d56c4fc93b9b6ec793fd51cadda55b8b0a69e89bc8596b2ca30ae0bc3"} Mar 18 07:20:04 crc kubenswrapper[4917]: I0318 07:20:04.772941 4917 scope.go:117] "RemoveContainer" 
containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:20:04 crc kubenswrapper[4917]: E0318 07:20:04.773414 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:20:04 crc kubenswrapper[4917]: I0318 07:20:04.798004 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:20:04 crc kubenswrapper[4917]: I0318 07:20:04.798080 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:20:04 crc kubenswrapper[4917]: I0318 07:20:04.891881 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.028854 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563640-bxcc2" Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.200118 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw8tz\" (UniqueName: \"kubernetes.io/projected/88900076-a48e-4175-ac37-454811fcf9d5-kube-api-access-mw8tz\") pod \"88900076-a48e-4175-ac37-454811fcf9d5\" (UID: \"88900076-a48e-4175-ac37-454811fcf9d5\") " Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.206240 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88900076-a48e-4175-ac37-454811fcf9d5-kube-api-access-mw8tz" (OuterVolumeSpecName: "kube-api-access-mw8tz") pod "88900076-a48e-4175-ac37-454811fcf9d5" (UID: "88900076-a48e-4175-ac37-454811fcf9d5"). InnerVolumeSpecName "kube-api-access-mw8tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.302099 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw8tz\" (UniqueName: \"kubernetes.io/projected/88900076-a48e-4175-ac37-454811fcf9d5-kube-api-access-mw8tz\") on node \"crc\" DevicePath \"\"" Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.701195 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563640-bxcc2" Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.701247 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563640-bxcc2" event={"ID":"88900076-a48e-4175-ac37-454811fcf9d5","Type":"ContainerDied","Data":"404e175b84e234692a87aec9ec7b818a5fdb3b3b2d69555522c46f36d2d0e410"} Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.701284 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="404e175b84e234692a87aec9ec7b818a5fdb3b3b2d69555522c46f36d2d0e410" Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.758939 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:20:05 crc kubenswrapper[4917]: I0318 07:20:05.845173 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nghf6"] Mar 18 07:20:06 crc kubenswrapper[4917]: I0318 07:20:06.119632 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563634-87q2d"] Mar 18 07:20:06 crc kubenswrapper[4917]: I0318 07:20:06.124520 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563634-87q2d"] Mar 18 07:20:07 crc kubenswrapper[4917]: I0318 07:20:07.716486 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nghf6" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerName="registry-server" containerID="cri-o://bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f" gracePeriod=2 Mar 18 07:20:07 crc kubenswrapper[4917]: I0318 07:20:07.783178 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e589dd70-066e-4ade-9042-78bef37bba3f" path="/var/lib/kubelet/pods/e589dd70-066e-4ade-9042-78bef37bba3f/volumes" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 
07:20:08.171195 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.253448 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-utilities\") pod \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.253577 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grpcd\" (UniqueName: \"kubernetes.io/projected/bc82b574-4d3e-4c81-9426-64c236a4d2a2-kube-api-access-grpcd\") pod \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.253653 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-catalog-content\") pod \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\" (UID: \"bc82b574-4d3e-4c81-9426-64c236a4d2a2\") " Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.254613 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-utilities" (OuterVolumeSpecName: "utilities") pod "bc82b574-4d3e-4c81-9426-64c236a4d2a2" (UID: "bc82b574-4d3e-4c81-9426-64c236a4d2a2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.260076 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc82b574-4d3e-4c81-9426-64c236a4d2a2-kube-api-access-grpcd" (OuterVolumeSpecName: "kube-api-access-grpcd") pod "bc82b574-4d3e-4c81-9426-64c236a4d2a2" (UID: "bc82b574-4d3e-4c81-9426-64c236a4d2a2"). InnerVolumeSpecName "kube-api-access-grpcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.337007 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc82b574-4d3e-4c81-9426-64c236a4d2a2" (UID: "bc82b574-4d3e-4c81-9426-64c236a4d2a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.356287 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grpcd\" (UniqueName: \"kubernetes.io/projected/bc82b574-4d3e-4c81-9426-64c236a4d2a2-kube-api-access-grpcd\") on node \"crc\" DevicePath \"\"" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.356358 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.356384 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc82b574-4d3e-4c81-9426-64c236a4d2a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.732070 4917 generic.go:334] "Generic (PLEG): container finished" podID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" 
containerID="bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f" exitCode=0 Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.732134 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nghf6" event={"ID":"bc82b574-4d3e-4c81-9426-64c236a4d2a2","Type":"ContainerDied","Data":"bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f"} Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.732178 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nghf6" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.732251 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nghf6" event={"ID":"bc82b574-4d3e-4c81-9426-64c236a4d2a2","Type":"ContainerDied","Data":"3100954f849bb59c788d4cbadd2360128139f93d6a1bcff947ff8fe67abf9bb1"} Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.732296 4917 scope.go:117] "RemoveContainer" containerID="bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.759041 4917 scope.go:117] "RemoveContainer" containerID="4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.780575 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nghf6"] Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.790831 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nghf6"] Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.806001 4917 scope.go:117] "RemoveContainer" containerID="bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.821138 4917 scope.go:117] "RemoveContainer" containerID="bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f" Mar 18 
07:20:08 crc kubenswrapper[4917]: E0318 07:20:08.821482 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f\": container with ID starting with bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f not found: ID does not exist" containerID="bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.821511 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f"} err="failed to get container status \"bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f\": rpc error: code = NotFound desc = could not find container \"bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f\": container with ID starting with bdf733000ecc0a8e45123e73009af72cfa0f57fa8c2ceec894345f1e24736f5f not found: ID does not exist" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.821532 4917 scope.go:117] "RemoveContainer" containerID="4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835" Mar 18 07:20:08 crc kubenswrapper[4917]: E0318 07:20:08.821739 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835\": container with ID starting with 4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835 not found: ID does not exist" containerID="4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.821761 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835"} err="failed to get container status 
\"4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835\": rpc error: code = NotFound desc = could not find container \"4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835\": container with ID starting with 4c29fb13273d3da3d316a2d81e143ee35125a21a360ca78082028677aea75835 not found: ID does not exist" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.821774 4917 scope.go:117] "RemoveContainer" containerID="bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60" Mar 18 07:20:08 crc kubenswrapper[4917]: E0318 07:20:08.821989 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60\": container with ID starting with bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60 not found: ID does not exist" containerID="bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60" Mar 18 07:20:08 crc kubenswrapper[4917]: I0318 07:20:08.822043 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60"} err="failed to get container status \"bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60\": rpc error: code = NotFound desc = could not find container \"bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60\": container with ID starting with bfb8a777b1f14735d887f4daec87aa2d852075cdf708183ea431963820042a60 not found: ID does not exist" Mar 18 07:20:09 crc kubenswrapper[4917]: I0318 07:20:09.784731 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" path="/var/lib/kubelet/pods/bc82b574-4d3e-4c81-9426-64c236a4d2a2/volumes" Mar 18 07:20:16 crc kubenswrapper[4917]: I0318 07:20:16.773183 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 
07:20:16 crc kubenswrapper[4917]: E0318 07:20:16.774536 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.445091 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dvjhm"] Mar 18 07:20:22 crc kubenswrapper[4917]: E0318 07:20:22.446084 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerName="registry-server" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.446103 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerName="registry-server" Mar 18 07:20:22 crc kubenswrapper[4917]: E0318 07:20:22.446126 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerName="extract-utilities" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.446135 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerName="extract-utilities" Mar 18 07:20:22 crc kubenswrapper[4917]: E0318 07:20:22.446150 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88900076-a48e-4175-ac37-454811fcf9d5" containerName="oc" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.446157 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="88900076-a48e-4175-ac37-454811fcf9d5" containerName="oc" Mar 18 07:20:22 crc kubenswrapper[4917]: E0318 07:20:22.446172 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" 
containerName="extract-content" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.446179 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerName="extract-content" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.446320 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc82b574-4d3e-4c81-9426-64c236a4d2a2" containerName="registry-server" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.446340 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="88900076-a48e-4175-ac37-454811fcf9d5" containerName="oc" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.447351 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.468129 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvjhm"] Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.575698 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-utilities\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.575738 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-catalog-content\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.575814 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nbnr\" (UniqueName: 
\"kubernetes.io/projected/66b6f83d-7c4d-4176-b859-2f6a2a04931d-kube-api-access-8nbnr\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.677113 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nbnr\" (UniqueName: \"kubernetes.io/projected/66b6f83d-7c4d-4176-b859-2f6a2a04931d-kube-api-access-8nbnr\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.677241 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-utilities\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.677277 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-catalog-content\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.677769 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-utilities\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.677949 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-catalog-content\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.695014 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nbnr\" (UniqueName: \"kubernetes.io/projected/66b6f83d-7c4d-4176-b859-2f6a2a04931d-kube-api-access-8nbnr\") pod \"redhat-operators-dvjhm\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:22 crc kubenswrapper[4917]: I0318 07:20:22.794558 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:23 crc kubenswrapper[4917]: I0318 07:20:23.218806 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dvjhm"] Mar 18 07:20:23 crc kubenswrapper[4917]: I0318 07:20:23.863651 4917 generic.go:334] "Generic (PLEG): container finished" podID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerID="cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde" exitCode=0 Mar 18 07:20:23 crc kubenswrapper[4917]: I0318 07:20:23.863730 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvjhm" event={"ID":"66b6f83d-7c4d-4176-b859-2f6a2a04931d","Type":"ContainerDied","Data":"cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde"} Mar 18 07:20:23 crc kubenswrapper[4917]: I0318 07:20:23.863990 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvjhm" event={"ID":"66b6f83d-7c4d-4176-b859-2f6a2a04931d","Type":"ContainerStarted","Data":"a0f29210b8e97d0721ef5a136fd95be05d6fedc332896d94aab3e5d58179773a"} Mar 18 07:20:24 crc kubenswrapper[4917]: I0318 07:20:24.872640 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-dvjhm" event={"ID":"66b6f83d-7c4d-4176-b859-2f6a2a04931d","Type":"ContainerStarted","Data":"fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8"} Mar 18 07:20:25 crc kubenswrapper[4917]: I0318 07:20:25.885233 4917 generic.go:334] "Generic (PLEG): container finished" podID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerID="fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8" exitCode=0 Mar 18 07:20:25 crc kubenswrapper[4917]: I0318 07:20:25.885358 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvjhm" event={"ID":"66b6f83d-7c4d-4176-b859-2f6a2a04931d","Type":"ContainerDied","Data":"fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8"} Mar 18 07:20:26 crc kubenswrapper[4917]: I0318 07:20:26.894682 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvjhm" event={"ID":"66b6f83d-7c4d-4176-b859-2f6a2a04931d","Type":"ContainerStarted","Data":"46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7"} Mar 18 07:20:26 crc kubenswrapper[4917]: I0318 07:20:26.927667 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dvjhm" podStartSLOduration=2.3998127240000002 podStartE2EDuration="4.927646601s" podCreationTimestamp="2026-03-18 07:20:22 +0000 UTC" firstStartedPulling="2026-03-18 07:20:23.866146254 +0000 UTC m=+2008.807300968" lastFinishedPulling="2026-03-18 07:20:26.393980131 +0000 UTC m=+2011.335134845" observedRunningTime="2026-03-18 07:20:26.925678073 +0000 UTC m=+2011.866832797" watchObservedRunningTime="2026-03-18 07:20:26.927646601 +0000 UTC m=+2011.868801325" Mar 18 07:20:28 crc kubenswrapper[4917]: I0318 07:20:28.772822 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:20:28 crc kubenswrapper[4917]: E0318 07:20:28.773198 4917 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:20:32 crc kubenswrapper[4917]: I0318 07:20:32.795391 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:32 crc kubenswrapper[4917]: I0318 07:20:32.795980 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:33 crc kubenswrapper[4917]: I0318 07:20:33.858560 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dvjhm" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="registry-server" probeResult="failure" output=< Mar 18 07:20:33 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 07:20:33 crc kubenswrapper[4917]: > Mar 18 07:20:40 crc kubenswrapper[4917]: I0318 07:20:40.773223 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:20:40 crc kubenswrapper[4917]: E0318 07:20:40.775669 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:20:42 crc kubenswrapper[4917]: I0318 07:20:42.869022 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:42 crc kubenswrapper[4917]: I0318 07:20:42.947783 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:43 crc kubenswrapper[4917]: I0318 07:20:43.114543 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvjhm"] Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.042871 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dvjhm" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="registry-server" containerID="cri-o://46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7" gracePeriod=2 Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.435417 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.515973 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nbnr\" (UniqueName: \"kubernetes.io/projected/66b6f83d-7c4d-4176-b859-2f6a2a04931d-kube-api-access-8nbnr\") pod \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.516043 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-utilities\") pod \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.516071 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-catalog-content\") pod 
\"66b6f83d-7c4d-4176-b859-2f6a2a04931d\" (UID: \"66b6f83d-7c4d-4176-b859-2f6a2a04931d\") " Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.517148 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-utilities" (OuterVolumeSpecName: "utilities") pod "66b6f83d-7c4d-4176-b859-2f6a2a04931d" (UID: "66b6f83d-7c4d-4176-b859-2f6a2a04931d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.521930 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66b6f83d-7c4d-4176-b859-2f6a2a04931d-kube-api-access-8nbnr" (OuterVolumeSpecName: "kube-api-access-8nbnr") pod "66b6f83d-7c4d-4176-b859-2f6a2a04931d" (UID: "66b6f83d-7c4d-4176-b859-2f6a2a04931d"). InnerVolumeSpecName "kube-api-access-8nbnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.617428 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nbnr\" (UniqueName: \"kubernetes.io/projected/66b6f83d-7c4d-4176-b859-2f6a2a04931d-kube-api-access-8nbnr\") on node \"crc\" DevicePath \"\"" Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.617464 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.660932 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66b6f83d-7c4d-4176-b859-2f6a2a04931d" (UID: "66b6f83d-7c4d-4176-b859-2f6a2a04931d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:20:44 crc kubenswrapper[4917]: I0318 07:20:44.718563 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66b6f83d-7c4d-4176-b859-2f6a2a04931d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.055894 4917 generic.go:334] "Generic (PLEG): container finished" podID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerID="46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7" exitCode=0 Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.056025 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dvjhm" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.056021 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvjhm" event={"ID":"66b6f83d-7c4d-4176-b859-2f6a2a04931d","Type":"ContainerDied","Data":"46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7"} Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.056502 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dvjhm" event={"ID":"66b6f83d-7c4d-4176-b859-2f6a2a04931d","Type":"ContainerDied","Data":"a0f29210b8e97d0721ef5a136fd95be05d6fedc332896d94aab3e5d58179773a"} Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.056525 4917 scope.go:117] "RemoveContainer" containerID="46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.103257 4917 scope.go:117] "RemoveContainer" containerID="fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.106928 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dvjhm"] Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 
07:20:45.114736 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dvjhm"] Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.136082 4917 scope.go:117] "RemoveContainer" containerID="cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.165445 4917 scope.go:117] "RemoveContainer" containerID="46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7" Mar 18 07:20:45 crc kubenswrapper[4917]: E0318 07:20:45.166058 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7\": container with ID starting with 46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7 not found: ID does not exist" containerID="46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.166109 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7"} err="failed to get container status \"46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7\": rpc error: code = NotFound desc = could not find container \"46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7\": container with ID starting with 46afcb73ee910096a3a562e8423c39a7ef656bb1e9e3335c8124e2b357b60ec7 not found: ID does not exist" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.166142 4917 scope.go:117] "RemoveContainer" containerID="fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8" Mar 18 07:20:45 crc kubenswrapper[4917]: E0318 07:20:45.166806 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8\": container with ID 
starting with fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8 not found: ID does not exist" containerID="fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.166848 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8"} err="failed to get container status \"fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8\": rpc error: code = NotFound desc = could not find container \"fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8\": container with ID starting with fc65fd7d35994417032308bb3b096275dd63149848bbff7f2f9ec8f286f788a8 not found: ID does not exist" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.166876 4917 scope.go:117] "RemoveContainer" containerID="cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde" Mar 18 07:20:45 crc kubenswrapper[4917]: E0318 07:20:45.167239 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde\": container with ID starting with cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde not found: ID does not exist" containerID="cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.167273 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde"} err="failed to get container status \"cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde\": rpc error: code = NotFound desc = could not find container \"cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde\": container with ID starting with cd5fe7b90a7584305df74fe1f6859c42707fdb6fc2d08405588772cd21cf0dde not found: 
ID does not exist" Mar 18 07:20:45 crc kubenswrapper[4917]: I0318 07:20:45.784999 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" path="/var/lib/kubelet/pods/66b6f83d-7c4d-4176-b859-2f6a2a04931d/volumes" Mar 18 07:20:55 crc kubenswrapper[4917]: I0318 07:20:55.782407 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:20:55 crc kubenswrapper[4917]: E0318 07:20:55.783254 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:21:03 crc kubenswrapper[4917]: I0318 07:21:03.250236 4917 scope.go:117] "RemoveContainer" containerID="d25c9d6395268fe5eeac63114c4b35806b45e14b2c6f9aee8a8b0e487fbb528d" Mar 18 07:21:10 crc kubenswrapper[4917]: I0318 07:21:10.774084 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:21:10 crc kubenswrapper[4917]: E0318 07:21:10.775028 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:21:22 crc kubenswrapper[4917]: I0318 07:21:22.773354 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:21:22 crc 
kubenswrapper[4917]: E0318 07:21:22.774525 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:21:37 crc kubenswrapper[4917]: I0318 07:21:37.772720 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:21:38 crc kubenswrapper[4917]: I0318 07:21:38.528066 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"7d99a0a5de9ab844119f5f07dbd4c53c691d8276a2994404375328e295cae9e4"} Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.168333 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563642-2pljf"] Mar 18 07:22:00 crc kubenswrapper[4917]: E0318 07:22:00.169560 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="extract-utilities" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.169614 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="extract-utilities" Mar 18 07:22:00 crc kubenswrapper[4917]: E0318 07:22:00.169641 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="extract-content" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.169653 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="extract-content" Mar 18 07:22:00 crc kubenswrapper[4917]: 
E0318 07:22:00.169675 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="registry-server" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.169687 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="registry-server" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.169939 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="66b6f83d-7c4d-4176-b859-2f6a2a04931d" containerName="registry-server" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.170797 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563642-2pljf" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.174149 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.174160 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.178845 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.186264 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563642-2pljf"] Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.284578 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxzwq\" (UniqueName: \"kubernetes.io/projected/6077eb1c-ebc6-4b83-9809-4cf12ae02512-kube-api-access-cxzwq\") pod \"auto-csr-approver-29563642-2pljf\" (UID: \"6077eb1c-ebc6-4b83-9809-4cf12ae02512\") " pod="openshift-infra/auto-csr-approver-29563642-2pljf" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.386427 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cxzwq\" (UniqueName: \"kubernetes.io/projected/6077eb1c-ebc6-4b83-9809-4cf12ae02512-kube-api-access-cxzwq\") pod \"auto-csr-approver-29563642-2pljf\" (UID: \"6077eb1c-ebc6-4b83-9809-4cf12ae02512\") " pod="openshift-infra/auto-csr-approver-29563642-2pljf" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.411148 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxzwq\" (UniqueName: \"kubernetes.io/projected/6077eb1c-ebc6-4b83-9809-4cf12ae02512-kube-api-access-cxzwq\") pod \"auto-csr-approver-29563642-2pljf\" (UID: \"6077eb1c-ebc6-4b83-9809-4cf12ae02512\") " pod="openshift-infra/auto-csr-approver-29563642-2pljf" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.505972 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563642-2pljf" Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.960360 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563642-2pljf"] Mar 18 07:22:00 crc kubenswrapper[4917]: I0318 07:22:00.972559 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:22:01 crc kubenswrapper[4917]: I0318 07:22:01.723239 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563642-2pljf" event={"ID":"6077eb1c-ebc6-4b83-9809-4cf12ae02512","Type":"ContainerStarted","Data":"cd601a53bfce42cf058e4fd42195ecb7ae9a2bbf07eeeb1a151cf421c220a1c5"} Mar 18 07:22:02 crc kubenswrapper[4917]: I0318 07:22:02.737365 4917 generic.go:334] "Generic (PLEG): container finished" podID="6077eb1c-ebc6-4b83-9809-4cf12ae02512" containerID="c6d70710a8d91c08b313986d14db660a66f60d43138375eee9fa01d8e5440fed" exitCode=0 Mar 18 07:22:02 crc kubenswrapper[4917]: I0318 07:22:02.737445 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563642-2pljf" event={"ID":"6077eb1c-ebc6-4b83-9809-4cf12ae02512","Type":"ContainerDied","Data":"c6d70710a8d91c08b313986d14db660a66f60d43138375eee9fa01d8e5440fed"} Mar 18 07:22:04 crc kubenswrapper[4917]: I0318 07:22:04.080170 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563642-2pljf" Mar 18 07:22:04 crc kubenswrapper[4917]: I0318 07:22:04.145750 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxzwq\" (UniqueName: \"kubernetes.io/projected/6077eb1c-ebc6-4b83-9809-4cf12ae02512-kube-api-access-cxzwq\") pod \"6077eb1c-ebc6-4b83-9809-4cf12ae02512\" (UID: \"6077eb1c-ebc6-4b83-9809-4cf12ae02512\") " Mar 18 07:22:04 crc kubenswrapper[4917]: I0318 07:22:04.157047 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6077eb1c-ebc6-4b83-9809-4cf12ae02512-kube-api-access-cxzwq" (OuterVolumeSpecName: "kube-api-access-cxzwq") pod "6077eb1c-ebc6-4b83-9809-4cf12ae02512" (UID: "6077eb1c-ebc6-4b83-9809-4cf12ae02512"). InnerVolumeSpecName "kube-api-access-cxzwq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:22:04 crc kubenswrapper[4917]: I0318 07:22:04.247616 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxzwq\" (UniqueName: \"kubernetes.io/projected/6077eb1c-ebc6-4b83-9809-4cf12ae02512-kube-api-access-cxzwq\") on node \"crc\" DevicePath \"\"" Mar 18 07:22:04 crc kubenswrapper[4917]: I0318 07:22:04.770936 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563642-2pljf" event={"ID":"6077eb1c-ebc6-4b83-9809-4cf12ae02512","Type":"ContainerDied","Data":"cd601a53bfce42cf058e4fd42195ecb7ae9a2bbf07eeeb1a151cf421c220a1c5"} Mar 18 07:22:04 crc kubenswrapper[4917]: I0318 07:22:04.770997 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd601a53bfce42cf058e4fd42195ecb7ae9a2bbf07eeeb1a151cf421c220a1c5" Mar 18 07:22:04 crc kubenswrapper[4917]: I0318 07:22:04.771028 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563642-2pljf" Mar 18 07:22:05 crc kubenswrapper[4917]: I0318 07:22:05.160965 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563636-zjtcd"] Mar 18 07:22:05 crc kubenswrapper[4917]: I0318 07:22:05.170832 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563636-zjtcd"] Mar 18 07:22:05 crc kubenswrapper[4917]: I0318 07:22:05.788686 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e9e9790-2203-473b-a66a-d101887d5a81" path="/var/lib/kubelet/pods/1e9e9790-2203-473b-a66a-d101887d5a81/volumes" Mar 18 07:22:57 crc kubenswrapper[4917]: I0318 07:22:57.940334 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtvf"] Mar 18 07:22:57 crc kubenswrapper[4917]: E0318 07:22:57.942269 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6077eb1c-ebc6-4b83-9809-4cf12ae02512" containerName="oc" Mar 18 07:22:57 crc kubenswrapper[4917]: I0318 07:22:57.942295 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6077eb1c-ebc6-4b83-9809-4cf12ae02512" containerName="oc" Mar 18 07:22:57 crc kubenswrapper[4917]: I0318 07:22:57.942967 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6077eb1c-ebc6-4b83-9809-4cf12ae02512" containerName="oc" Mar 18 07:22:57 crc kubenswrapper[4917]: I0318 07:22:57.944164 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:57 crc kubenswrapper[4917]: I0318 07:22:57.955369 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtvf"] Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.030831 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qnqh\" (UniqueName: \"kubernetes.io/projected/d1d1c329-59d3-4fd7-acaa-7c48cd692856-kube-api-access-7qnqh\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.030915 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-utilities\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.031193 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-catalog-content\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " 
pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.132383 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-catalog-content\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.132825 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qnqh\" (UniqueName: \"kubernetes.io/projected/d1d1c329-59d3-4fd7-acaa-7c48cd692856-kube-api-access-7qnqh\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.132867 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-utilities\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.132987 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-catalog-content\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.133550 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-utilities\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " pod="openshift-marketplace/redhat-marketplace-nqtvf" 
Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.157260 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qnqh\" (UniqueName: \"kubernetes.io/projected/d1d1c329-59d3-4fd7-acaa-7c48cd692856-kube-api-access-7qnqh\") pod \"redhat-marketplace-nqtvf\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.266923 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:22:58 crc kubenswrapper[4917]: I0318 07:22:58.767474 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtvf"] Mar 18 07:22:59 crc kubenswrapper[4917]: I0318 07:22:59.253190 4917 generic.go:334] "Generic (PLEG): container finished" podID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerID="1812609e93b9e85bfb26d3a195f7b005d8fdd72eb836a98ce420229d37a68bd6" exitCode=0 Mar 18 07:22:59 crc kubenswrapper[4917]: I0318 07:22:59.253244 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtvf" event={"ID":"d1d1c329-59d3-4fd7-acaa-7c48cd692856","Type":"ContainerDied","Data":"1812609e93b9e85bfb26d3a195f7b005d8fdd72eb836a98ce420229d37a68bd6"} Mar 18 07:22:59 crc kubenswrapper[4917]: I0318 07:22:59.253289 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtvf" event={"ID":"d1d1c329-59d3-4fd7-acaa-7c48cd692856","Type":"ContainerStarted","Data":"0a4df99a2ba707cfb7da01c7e30f78197dafd7729c67afe577609d50da8d2082"} Mar 18 07:23:00 crc kubenswrapper[4917]: I0318 07:23:00.267031 4917 generic.go:334] "Generic (PLEG): container finished" podID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerID="b5aeb999e3ed9599dc07fd4800eb171bf0d62827ca6081b71459099e73c22038" exitCode=0 Mar 18 07:23:00 crc kubenswrapper[4917]: I0318 07:23:00.267117 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtvf" event={"ID":"d1d1c329-59d3-4fd7-acaa-7c48cd692856","Type":"ContainerDied","Data":"b5aeb999e3ed9599dc07fd4800eb171bf0d62827ca6081b71459099e73c22038"} Mar 18 07:23:01 crc kubenswrapper[4917]: I0318 07:23:01.277165 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtvf" event={"ID":"d1d1c329-59d3-4fd7-acaa-7c48cd692856","Type":"ContainerStarted","Data":"f653e9d5a5c47f8c2f422b956248b95e9f49ca4943ef42e5761220267cea9646"} Mar 18 07:23:01 crc kubenswrapper[4917]: I0318 07:23:01.297859 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nqtvf" podStartSLOduration=2.790663838 podStartE2EDuration="4.297843765s" podCreationTimestamp="2026-03-18 07:22:57 +0000 UTC" firstStartedPulling="2026-03-18 07:22:59.255657405 +0000 UTC m=+2164.196812159" lastFinishedPulling="2026-03-18 07:23:00.762837362 +0000 UTC m=+2165.703992086" observedRunningTime="2026-03-18 07:23:01.295558089 +0000 UTC m=+2166.236712843" watchObservedRunningTime="2026-03-18 07:23:01.297843765 +0000 UTC m=+2166.238998479" Mar 18 07:23:03 crc kubenswrapper[4917]: I0318 07:23:03.376668 4917 scope.go:117] "RemoveContainer" containerID="1fb0855b08c2c5b380cd3ec7be98072701122f563ef5aa88de138ec8ec544fe8" Mar 18 07:23:08 crc kubenswrapper[4917]: I0318 07:23:08.267891 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:23:08 crc kubenswrapper[4917]: I0318 07:23:08.268170 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:23:08 crc kubenswrapper[4917]: I0318 07:23:08.339040 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:23:08 crc kubenswrapper[4917]: 
I0318 07:23:08.413210 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:23:08 crc kubenswrapper[4917]: I0318 07:23:08.587876 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtvf"] Mar 18 07:23:10 crc kubenswrapper[4917]: I0318 07:23:10.358885 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nqtvf" podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerName="registry-server" containerID="cri-o://f653e9d5a5c47f8c2f422b956248b95e9f49ca4943ef42e5761220267cea9646" gracePeriod=2 Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.368391 4917 generic.go:334] "Generic (PLEG): container finished" podID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerID="f653e9d5a5c47f8c2f422b956248b95e9f49ca4943ef42e5761220267cea9646" exitCode=0 Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.368484 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtvf" event={"ID":"d1d1c329-59d3-4fd7-acaa-7c48cd692856","Type":"ContainerDied","Data":"f653e9d5a5c47f8c2f422b956248b95e9f49ca4943ef42e5761220267cea9646"} Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.368867 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqtvf" event={"ID":"d1d1c329-59d3-4fd7-acaa-7c48cd692856","Type":"ContainerDied","Data":"0a4df99a2ba707cfb7da01c7e30f78197dafd7729c67afe577609d50da8d2082"} Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.368896 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a4df99a2ba707cfb7da01c7e30f78197dafd7729c67afe577609d50da8d2082" Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.387115 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.538458 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-utilities\") pod \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.538613 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-catalog-content\") pod \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.538730 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qnqh\" (UniqueName: \"kubernetes.io/projected/d1d1c329-59d3-4fd7-acaa-7c48cd692856-kube-api-access-7qnqh\") pod \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\" (UID: \"d1d1c329-59d3-4fd7-acaa-7c48cd692856\") " Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.540189 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-utilities" (OuterVolumeSpecName: "utilities") pod "d1d1c329-59d3-4fd7-acaa-7c48cd692856" (UID: "d1d1c329-59d3-4fd7-acaa-7c48cd692856"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.547029 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1d1c329-59d3-4fd7-acaa-7c48cd692856-kube-api-access-7qnqh" (OuterVolumeSpecName: "kube-api-access-7qnqh") pod "d1d1c329-59d3-4fd7-acaa-7c48cd692856" (UID: "d1d1c329-59d3-4fd7-acaa-7c48cd692856"). InnerVolumeSpecName "kube-api-access-7qnqh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.567918 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1d1c329-59d3-4fd7-acaa-7c48cd692856" (UID: "d1d1c329-59d3-4fd7-acaa-7c48cd692856"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.641080 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qnqh\" (UniqueName: \"kubernetes.io/projected/d1d1c329-59d3-4fd7-acaa-7c48cd692856-kube-api-access-7qnqh\") on node \"crc\" DevicePath \"\"" Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.641135 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:23:11 crc kubenswrapper[4917]: I0318 07:23:11.641144 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1d1c329-59d3-4fd7-acaa-7c48cd692856-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:23:12 crc kubenswrapper[4917]: I0318 07:23:12.378513 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqtvf" Mar 18 07:23:12 crc kubenswrapper[4917]: I0318 07:23:12.407191 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtvf"] Mar 18 07:23:12 crc kubenswrapper[4917]: I0318 07:23:12.414139 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqtvf"] Mar 18 07:23:13 crc kubenswrapper[4917]: I0318 07:23:13.790795 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" path="/var/lib/kubelet/pods/d1d1c329-59d3-4fd7-acaa-7c48cd692856/volumes" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.283163 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gjjfx"] Mar 18 07:23:43 crc kubenswrapper[4917]: E0318 07:23:43.284117 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerName="extract-content" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.284130 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerName="extract-content" Mar 18 07:23:43 crc kubenswrapper[4917]: E0318 07:23:43.284143 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerName="extract-utilities" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.284150 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerName="extract-utilities" Mar 18 07:23:43 crc kubenswrapper[4917]: E0318 07:23:43.284166 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerName="registry-server" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.284173 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerName="registry-server" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.284294 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1d1c329-59d3-4fd7-acaa-7c48cd692856" containerName="registry-server" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.285480 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.316420 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjjfx"] Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.356989 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-catalog-content\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.357153 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-utilities\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.357274 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b2z2\" (UniqueName: \"kubernetes.io/projected/88854de1-d536-4704-98c1-860169acf296-kube-api-access-9b2z2\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.458551 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-catalog-content\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.459120 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-utilities\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.459283 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-catalog-content\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.460031 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-utilities\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.460387 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b2z2\" (UniqueName: \"kubernetes.io/projected/88854de1-d536-4704-98c1-860169acf296-kube-api-access-9b2z2\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.497901 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9b2z2\" (UniqueName: \"kubernetes.io/projected/88854de1-d536-4704-98c1-860169acf296-kube-api-access-9b2z2\") pod \"community-operators-gjjfx\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.622651 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:43 crc kubenswrapper[4917]: I0318 07:23:43.883570 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gjjfx"] Mar 18 07:23:44 crc kubenswrapper[4917]: I0318 07:23:44.730534 4917 generic.go:334] "Generic (PLEG): container finished" podID="88854de1-d536-4704-98c1-860169acf296" containerID="d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836" exitCode=0 Mar 18 07:23:44 crc kubenswrapper[4917]: I0318 07:23:44.730711 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjjfx" event={"ID":"88854de1-d536-4704-98c1-860169acf296","Type":"ContainerDied","Data":"d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836"} Mar 18 07:23:44 crc kubenswrapper[4917]: I0318 07:23:44.730937 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjjfx" event={"ID":"88854de1-d536-4704-98c1-860169acf296","Type":"ContainerStarted","Data":"3fe917f665a876705687a09bb6d01d330a47c8246bd1acb795d3da11b776b172"} Mar 18 07:23:45 crc kubenswrapper[4917]: I0318 07:23:45.741265 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjjfx" event={"ID":"88854de1-d536-4704-98c1-860169acf296","Type":"ContainerStarted","Data":"7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412"} Mar 18 07:23:46 crc kubenswrapper[4917]: I0318 07:23:46.787460 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="88854de1-d536-4704-98c1-860169acf296" containerID="7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412" exitCode=0 Mar 18 07:23:46 crc kubenswrapper[4917]: I0318 07:23:46.787528 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjjfx" event={"ID":"88854de1-d536-4704-98c1-860169acf296","Type":"ContainerDied","Data":"7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412"} Mar 18 07:23:48 crc kubenswrapper[4917]: I0318 07:23:48.813659 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjjfx" event={"ID":"88854de1-d536-4704-98c1-860169acf296","Type":"ContainerStarted","Data":"495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b"} Mar 18 07:23:48 crc kubenswrapper[4917]: I0318 07:23:48.838930 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gjjfx" podStartSLOduration=2.163582085 podStartE2EDuration="5.838900386s" podCreationTimestamp="2026-03-18 07:23:43 +0000 UTC" firstStartedPulling="2026-03-18 07:23:44.732706066 +0000 UTC m=+2209.673860810" lastFinishedPulling="2026-03-18 07:23:48.408024397 +0000 UTC m=+2213.349179111" observedRunningTime="2026-03-18 07:23:48.833508275 +0000 UTC m=+2213.774663009" watchObservedRunningTime="2026-03-18 07:23:48.838900386 +0000 UTC m=+2213.780055140" Mar 18 07:23:53 crc kubenswrapper[4917]: I0318 07:23:53.623267 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:53 crc kubenswrapper[4917]: I0318 07:23:53.623606 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:53 crc kubenswrapper[4917]: I0318 07:23:53.691054 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gjjfx" Mar 18 
07:23:53 crc kubenswrapper[4917]: I0318 07:23:53.936629 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:53 crc kubenswrapper[4917]: I0318 07:23:53.997458 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gjjfx"] Mar 18 07:23:55 crc kubenswrapper[4917]: I0318 07:23:55.881659 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gjjfx" podUID="88854de1-d536-4704-98c1-860169acf296" containerName="registry-server" containerID="cri-o://495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b" gracePeriod=2 Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.482402 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.676746 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-utilities\") pod \"88854de1-d536-4704-98c1-860169acf296\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.676838 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b2z2\" (UniqueName: \"kubernetes.io/projected/88854de1-d536-4704-98c1-860169acf296-kube-api-access-9b2z2\") pod \"88854de1-d536-4704-98c1-860169acf296\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.676925 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-catalog-content\") pod \"88854de1-d536-4704-98c1-860169acf296\" (UID: \"88854de1-d536-4704-98c1-860169acf296\") " Mar 
18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.677503 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-utilities" (OuterVolumeSpecName: "utilities") pod "88854de1-d536-4704-98c1-860169acf296" (UID: "88854de1-d536-4704-98c1-860169acf296"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.686055 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88854de1-d536-4704-98c1-860169acf296-kube-api-access-9b2z2" (OuterVolumeSpecName: "kube-api-access-9b2z2") pod "88854de1-d536-4704-98c1-860169acf296" (UID: "88854de1-d536-4704-98c1-860169acf296"). InnerVolumeSpecName "kube-api-access-9b2z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.731438 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88854de1-d536-4704-98c1-860169acf296" (UID: "88854de1-d536-4704-98c1-860169acf296"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.779176 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.779223 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b2z2\" (UniqueName: \"kubernetes.io/projected/88854de1-d536-4704-98c1-860169acf296-kube-api-access-9b2z2\") on node \"crc\" DevicePath \"\"" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.779244 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88854de1-d536-4704-98c1-860169acf296-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.892738 4917 generic.go:334] "Generic (PLEG): container finished" podID="88854de1-d536-4704-98c1-860169acf296" containerID="495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b" exitCode=0 Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.892785 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjjfx" event={"ID":"88854de1-d536-4704-98c1-860169acf296","Type":"ContainerDied","Data":"495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b"} Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.892853 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gjjfx" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.893152 4917 scope.go:117] "RemoveContainer" containerID="495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.893132 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gjjfx" event={"ID":"88854de1-d536-4704-98c1-860169acf296","Type":"ContainerDied","Data":"3fe917f665a876705687a09bb6d01d330a47c8246bd1acb795d3da11b776b172"} Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.918640 4917 scope.go:117] "RemoveContainer" containerID="7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.961805 4917 scope.go:117] "RemoveContainer" containerID="d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.966669 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gjjfx"] Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.974165 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gjjfx"] Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.990744 4917 scope.go:117] "RemoveContainer" containerID="495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b" Mar 18 07:23:56 crc kubenswrapper[4917]: E0318 07:23:56.991252 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b\": container with ID starting with 495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b not found: ID does not exist" containerID="495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.991317 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b"} err="failed to get container status \"495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b\": rpc error: code = NotFound desc = could not find container \"495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b\": container with ID starting with 495a49852c070201f1a7c8197e409e7af2a0254da2cd35c23cc2c174dcd25b9b not found: ID does not exist" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.991360 4917 scope.go:117] "RemoveContainer" containerID="7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412" Mar 18 07:23:56 crc kubenswrapper[4917]: E0318 07:23:56.991815 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412\": container with ID starting with 7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412 not found: ID does not exist" containerID="7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.991844 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412"} err="failed to get container status \"7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412\": rpc error: code = NotFound desc = could not find container \"7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412\": container with ID starting with 7438a565d3f14d5cb10d84b047c60d5a219a42feea0b64e7c324460d4a41f412 not found: ID does not exist" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.991863 4917 scope.go:117] "RemoveContainer" containerID="d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836" Mar 18 07:23:56 crc kubenswrapper[4917]: E0318 
07:23:56.992141 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836\": container with ID starting with d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836 not found: ID does not exist" containerID="d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836" Mar 18 07:23:56 crc kubenswrapper[4917]: I0318 07:23:56.992185 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836"} err="failed to get container status \"d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836\": rpc error: code = NotFound desc = could not find container \"d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836\": container with ID starting with d669ba02e32745e66463739652360276962ba0d05443124a9f74feeeb8a15836 not found: ID does not exist" Mar 18 07:23:57 crc kubenswrapper[4917]: I0318 07:23:57.789325 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88854de1-d536-4704-98c1-860169acf296" path="/var/lib/kubelet/pods/88854de1-d536-4704-98c1-860169acf296/volumes" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.157451 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563644-q5rm7"] Mar 18 07:24:00 crc kubenswrapper[4917]: E0318 07:24:00.158081 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88854de1-d536-4704-98c1-860169acf296" containerName="extract-content" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.158101 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="88854de1-d536-4704-98c1-860169acf296" containerName="extract-content" Mar 18 07:24:00 crc kubenswrapper[4917]: E0318 07:24:00.158123 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="88854de1-d536-4704-98c1-860169acf296" containerName="registry-server" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.158131 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="88854de1-d536-4704-98c1-860169acf296" containerName="registry-server" Mar 18 07:24:00 crc kubenswrapper[4917]: E0318 07:24:00.158142 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88854de1-d536-4704-98c1-860169acf296" containerName="extract-utilities" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.158150 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="88854de1-d536-4704-98c1-860169acf296" containerName="extract-utilities" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.158311 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="88854de1-d536-4704-98c1-860169acf296" containerName="registry-server" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.158960 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563644-q5rm7" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.161718 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.162147 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.162455 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.167522 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563644-q5rm7"] Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.336856 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6wsm\" (UniqueName: 
\"kubernetes.io/projected/e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb-kube-api-access-l6wsm\") pod \"auto-csr-approver-29563644-q5rm7\" (UID: \"e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb\") " pod="openshift-infra/auto-csr-approver-29563644-q5rm7" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.438451 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6wsm\" (UniqueName: \"kubernetes.io/projected/e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb-kube-api-access-l6wsm\") pod \"auto-csr-approver-29563644-q5rm7\" (UID: \"e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb\") " pod="openshift-infra/auto-csr-approver-29563644-q5rm7" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.472999 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6wsm\" (UniqueName: \"kubernetes.io/projected/e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb-kube-api-access-l6wsm\") pod \"auto-csr-approver-29563644-q5rm7\" (UID: \"e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb\") " pod="openshift-infra/auto-csr-approver-29563644-q5rm7" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.496852 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563644-q5rm7" Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.780128 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563644-q5rm7"] Mar 18 07:24:00 crc kubenswrapper[4917]: I0318 07:24:00.938050 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563644-q5rm7" event={"ID":"e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb","Type":"ContainerStarted","Data":"fac843db9e1bc08f1c270bb4b936d0c5163025fabbc4ad0a4fe9ea5beea5bc38"} Mar 18 07:24:02 crc kubenswrapper[4917]: I0318 07:24:02.928735 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:24:02 crc kubenswrapper[4917]: I0318 07:24:02.929200 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:24:02 crc kubenswrapper[4917]: I0318 07:24:02.985663 4917 generic.go:334] "Generic (PLEG): container finished" podID="e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb" containerID="f17c04c11aeb92d4011ec40e91150515c24c40f23fd80a2a2d950a8d10bb11e1" exitCode=0 Mar 18 07:24:02 crc kubenswrapper[4917]: I0318 07:24:02.985742 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563644-q5rm7" event={"ID":"e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb","Type":"ContainerDied","Data":"f17c04c11aeb92d4011ec40e91150515c24c40f23fd80a2a2d950a8d10bb11e1"} Mar 18 07:24:04 crc kubenswrapper[4917]: I0318 07:24:04.309334 4917 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563644-q5rm7" Mar 18 07:24:04 crc kubenswrapper[4917]: I0318 07:24:04.503146 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6wsm\" (UniqueName: \"kubernetes.io/projected/e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb-kube-api-access-l6wsm\") pod \"e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb\" (UID: \"e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb\") " Mar 18 07:24:04 crc kubenswrapper[4917]: I0318 07:24:04.510003 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb-kube-api-access-l6wsm" (OuterVolumeSpecName: "kube-api-access-l6wsm") pod "e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb" (UID: "e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb"). InnerVolumeSpecName "kube-api-access-l6wsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:24:04 crc kubenswrapper[4917]: I0318 07:24:04.604900 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6wsm\" (UniqueName: \"kubernetes.io/projected/e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb-kube-api-access-l6wsm\") on node \"crc\" DevicePath \"\"" Mar 18 07:24:05 crc kubenswrapper[4917]: I0318 07:24:05.003795 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563644-q5rm7" event={"ID":"e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb","Type":"ContainerDied","Data":"fac843db9e1bc08f1c270bb4b936d0c5163025fabbc4ad0a4fe9ea5beea5bc38"} Mar 18 07:24:05 crc kubenswrapper[4917]: I0318 07:24:05.004410 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fac843db9e1bc08f1c270bb4b936d0c5163025fabbc4ad0a4fe9ea5beea5bc38" Mar 18 07:24:05 crc kubenswrapper[4917]: I0318 07:24:05.003999 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563644-q5rm7" Mar 18 07:24:05 crc kubenswrapper[4917]: I0318 07:24:05.394266 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563638-6qdf5"] Mar 18 07:24:05 crc kubenswrapper[4917]: I0318 07:24:05.401452 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563638-6qdf5"] Mar 18 07:24:05 crc kubenswrapper[4917]: I0318 07:24:05.792050 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a894cd3-9073-473b-8295-364683bae145" path="/var/lib/kubelet/pods/2a894cd3-9073-473b-8295-364683bae145/volumes" Mar 18 07:24:32 crc kubenswrapper[4917]: I0318 07:24:32.928794 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:24:32 crc kubenswrapper[4917]: I0318 07:24:32.929497 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:25:02 crc kubenswrapper[4917]: I0318 07:25:02.929299 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:25:02 crc kubenswrapper[4917]: I0318 07:25:02.930035 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:25:02 crc kubenswrapper[4917]: I0318 07:25:02.930106 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:25:02 crc kubenswrapper[4917]: I0318 07:25:02.930932 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d99a0a5de9ab844119f5f07dbd4c53c691d8276a2994404375328e295cae9e4"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:25:02 crc kubenswrapper[4917]: I0318 07:25:02.931001 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://7d99a0a5de9ab844119f5f07dbd4c53c691d8276a2994404375328e295cae9e4" gracePeriod=600 Mar 18 07:25:03 crc kubenswrapper[4917]: I0318 07:25:03.508356 4917 scope.go:117] "RemoveContainer" containerID="a949f014f7fad32d98dced8f464f959772d56d06f45109469d7215f67391273e" Mar 18 07:25:03 crc kubenswrapper[4917]: I0318 07:25:03.553340 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="7d99a0a5de9ab844119f5f07dbd4c53c691d8276a2994404375328e295cae9e4" exitCode=0 Mar 18 07:25:03 crc kubenswrapper[4917]: I0318 07:25:03.553699 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"7d99a0a5de9ab844119f5f07dbd4c53c691d8276a2994404375328e295cae9e4"} Mar 18 07:25:03 crc kubenswrapper[4917]: I0318 07:25:03.553798 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5"} Mar 18 07:25:03 crc kubenswrapper[4917]: I0318 07:25:03.553833 4917 scope.go:117] "RemoveContainer" containerID="d0d5d36d14e5becf7e53a394ba32241c21d9df3314cc1f68a2ec8cf772c18fc4" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.150779 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563646-v25q6"] Mar 18 07:26:00 crc kubenswrapper[4917]: E0318 07:26:00.152409 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb" containerName="oc" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.152430 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb" containerName="oc" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.152631 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb" containerName="oc" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.153302 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563646-v25q6" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.156734 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.156959 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.157705 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.174152 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563646-v25q6"] Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.262987 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2s4m\" (UniqueName: \"kubernetes.io/projected/ad58cffc-ac00-49cc-a81d-8e6c0c182d83-kube-api-access-p2s4m\") pod \"auto-csr-approver-29563646-v25q6\" (UID: \"ad58cffc-ac00-49cc-a81d-8e6c0c182d83\") " pod="openshift-infra/auto-csr-approver-29563646-v25q6" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.364079 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2s4m\" (UniqueName: \"kubernetes.io/projected/ad58cffc-ac00-49cc-a81d-8e6c0c182d83-kube-api-access-p2s4m\") pod \"auto-csr-approver-29563646-v25q6\" (UID: \"ad58cffc-ac00-49cc-a81d-8e6c0c182d83\") " pod="openshift-infra/auto-csr-approver-29563646-v25q6" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.395373 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2s4m\" (UniqueName: \"kubernetes.io/projected/ad58cffc-ac00-49cc-a81d-8e6c0c182d83-kube-api-access-p2s4m\") pod \"auto-csr-approver-29563646-v25q6\" (UID: \"ad58cffc-ac00-49cc-a81d-8e6c0c182d83\") " 
pod="openshift-infra/auto-csr-approver-29563646-v25q6" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.474665 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563646-v25q6" Mar 18 07:26:00 crc kubenswrapper[4917]: I0318 07:26:00.753345 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563646-v25q6"] Mar 18 07:26:01 crc kubenswrapper[4917]: I0318 07:26:01.164952 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563646-v25q6" event={"ID":"ad58cffc-ac00-49cc-a81d-8e6c0c182d83","Type":"ContainerStarted","Data":"bfdc4790e16140ce1c35a1a97a2a5c9669e00a79346f9348506e854689430fa9"} Mar 18 07:26:02 crc kubenswrapper[4917]: I0318 07:26:02.175000 4917 generic.go:334] "Generic (PLEG): container finished" podID="ad58cffc-ac00-49cc-a81d-8e6c0c182d83" containerID="68b811cb22f471a85a7cecd6e2f6ec10b6f09e25471eec5b8f528e88c309521b" exitCode=0 Mar 18 07:26:02 crc kubenswrapper[4917]: I0318 07:26:02.175092 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563646-v25q6" event={"ID":"ad58cffc-ac00-49cc-a81d-8e6c0c182d83","Type":"ContainerDied","Data":"68b811cb22f471a85a7cecd6e2f6ec10b6f09e25471eec5b8f528e88c309521b"} Mar 18 07:26:03 crc kubenswrapper[4917]: I0318 07:26:03.564680 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563646-v25q6" Mar 18 07:26:03 crc kubenswrapper[4917]: I0318 07:26:03.713026 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2s4m\" (UniqueName: \"kubernetes.io/projected/ad58cffc-ac00-49cc-a81d-8e6c0c182d83-kube-api-access-p2s4m\") pod \"ad58cffc-ac00-49cc-a81d-8e6c0c182d83\" (UID: \"ad58cffc-ac00-49cc-a81d-8e6c0c182d83\") " Mar 18 07:26:03 crc kubenswrapper[4917]: I0318 07:26:03.723384 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad58cffc-ac00-49cc-a81d-8e6c0c182d83-kube-api-access-p2s4m" (OuterVolumeSpecName: "kube-api-access-p2s4m") pod "ad58cffc-ac00-49cc-a81d-8e6c0c182d83" (UID: "ad58cffc-ac00-49cc-a81d-8e6c0c182d83"). InnerVolumeSpecName "kube-api-access-p2s4m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:26:03 crc kubenswrapper[4917]: I0318 07:26:03.814792 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2s4m\" (UniqueName: \"kubernetes.io/projected/ad58cffc-ac00-49cc-a81d-8e6c0c182d83-kube-api-access-p2s4m\") on node \"crc\" DevicePath \"\"" Mar 18 07:26:04 crc kubenswrapper[4917]: I0318 07:26:04.199117 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563646-v25q6" event={"ID":"ad58cffc-ac00-49cc-a81d-8e6c0c182d83","Type":"ContainerDied","Data":"bfdc4790e16140ce1c35a1a97a2a5c9669e00a79346f9348506e854689430fa9"} Mar 18 07:26:04 crc kubenswrapper[4917]: I0318 07:26:04.199190 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563646-v25q6" Mar 18 07:26:04 crc kubenswrapper[4917]: I0318 07:26:04.199204 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfdc4790e16140ce1c35a1a97a2a5c9669e00a79346f9348506e854689430fa9" Mar 18 07:26:04 crc kubenswrapper[4917]: I0318 07:26:04.669389 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563640-bxcc2"] Mar 18 07:26:04 crc kubenswrapper[4917]: I0318 07:26:04.680173 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563640-bxcc2"] Mar 18 07:26:05 crc kubenswrapper[4917]: I0318 07:26:05.792393 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88900076-a48e-4175-ac37-454811fcf9d5" path="/var/lib/kubelet/pods/88900076-a48e-4175-ac37-454811fcf9d5/volumes" Mar 18 07:27:03 crc kubenswrapper[4917]: I0318 07:27:03.637500 4917 scope.go:117] "RemoveContainer" containerID="9b22930d56c4fc93b9b6ec793fd51cadda55b8b0a69e89bc8596b2ca30ae0bc3" Mar 18 07:27:32 crc kubenswrapper[4917]: I0318 07:27:32.928963 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:27:32 crc kubenswrapper[4917]: I0318 07:27:32.929632 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.151689 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563648-mltwf"] Mar 18 
07:28:00 crc kubenswrapper[4917]: E0318 07:28:00.152722 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad58cffc-ac00-49cc-a81d-8e6c0c182d83" containerName="oc" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.152744 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad58cffc-ac00-49cc-a81d-8e6c0c182d83" containerName="oc" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.152949 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad58cffc-ac00-49cc-a81d-8e6c0c182d83" containerName="oc" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.153485 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563648-mltwf" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.156141 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.157833 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.159715 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.163307 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563648-mltwf"] Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.231268 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j88r5\" (UniqueName: \"kubernetes.io/projected/4d5d68c9-31a7-4f1a-b628-fe2de8939add-kube-api-access-j88r5\") pod \"auto-csr-approver-29563648-mltwf\" (UID: \"4d5d68c9-31a7-4f1a-b628-fe2de8939add\") " pod="openshift-infra/auto-csr-approver-29563648-mltwf" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.332279 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j88r5\" (UniqueName: \"kubernetes.io/projected/4d5d68c9-31a7-4f1a-b628-fe2de8939add-kube-api-access-j88r5\") pod \"auto-csr-approver-29563648-mltwf\" (UID: \"4d5d68c9-31a7-4f1a-b628-fe2de8939add\") " pod="openshift-infra/auto-csr-approver-29563648-mltwf" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.358077 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j88r5\" (UniqueName: \"kubernetes.io/projected/4d5d68c9-31a7-4f1a-b628-fe2de8939add-kube-api-access-j88r5\") pod \"auto-csr-approver-29563648-mltwf\" (UID: \"4d5d68c9-31a7-4f1a-b628-fe2de8939add\") " pod="openshift-infra/auto-csr-approver-29563648-mltwf" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.483123 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563648-mltwf" Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.951480 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563648-mltwf"] Mar 18 07:28:00 crc kubenswrapper[4917]: I0318 07:28:00.967975 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:28:01 crc kubenswrapper[4917]: I0318 07:28:01.377101 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563648-mltwf" event={"ID":"4d5d68c9-31a7-4f1a-b628-fe2de8939add","Type":"ContainerStarted","Data":"fd55732e400f7a470fb430927d650eadabde91b174d57d1c7cd0c979d38e814e"} Mar 18 07:28:02 crc kubenswrapper[4917]: I0318 07:28:02.387053 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563648-mltwf" event={"ID":"4d5d68c9-31a7-4f1a-b628-fe2de8939add","Type":"ContainerStarted","Data":"993d6796ba4a603416c8c3eaa99aa50cce746b3c456d2474ab9661d06e3be950"} Mar 18 07:28:02 crc kubenswrapper[4917]: I0318 07:28:02.411489 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563648-mltwf" podStartSLOduration=1.523233007 podStartE2EDuration="2.41147169s" podCreationTimestamp="2026-03-18 07:28:00 +0000 UTC" firstStartedPulling="2026-03-18 07:28:00.967548354 +0000 UTC m=+2465.908703068" lastFinishedPulling="2026-03-18 07:28:01.855786997 +0000 UTC m=+2466.796941751" observedRunningTime="2026-03-18 07:28:02.403761736 +0000 UTC m=+2467.344916450" watchObservedRunningTime="2026-03-18 07:28:02.41147169 +0000 UTC m=+2467.352626404" Mar 18 07:28:02 crc kubenswrapper[4917]: I0318 07:28:02.929130 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:28:02 crc kubenswrapper[4917]: I0318 07:28:02.929218 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:28:03 crc kubenswrapper[4917]: I0318 07:28:03.396551 4917 generic.go:334] "Generic (PLEG): container finished" podID="4d5d68c9-31a7-4f1a-b628-fe2de8939add" containerID="993d6796ba4a603416c8c3eaa99aa50cce746b3c456d2474ab9661d06e3be950" exitCode=0 Mar 18 07:28:03 crc kubenswrapper[4917]: I0318 07:28:03.396664 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563648-mltwf" event={"ID":"4d5d68c9-31a7-4f1a-b628-fe2de8939add","Type":"ContainerDied","Data":"993d6796ba4a603416c8c3eaa99aa50cce746b3c456d2474ab9661d06e3be950"} Mar 18 07:28:04 crc kubenswrapper[4917]: I0318 07:28:04.811536 4917 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563648-mltwf" Mar 18 07:28:04 crc kubenswrapper[4917]: I0318 07:28:04.909143 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j88r5\" (UniqueName: \"kubernetes.io/projected/4d5d68c9-31a7-4f1a-b628-fe2de8939add-kube-api-access-j88r5\") pod \"4d5d68c9-31a7-4f1a-b628-fe2de8939add\" (UID: \"4d5d68c9-31a7-4f1a-b628-fe2de8939add\") " Mar 18 07:28:04 crc kubenswrapper[4917]: I0318 07:28:04.916442 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d5d68c9-31a7-4f1a-b628-fe2de8939add-kube-api-access-j88r5" (OuterVolumeSpecName: "kube-api-access-j88r5") pod "4d5d68c9-31a7-4f1a-b628-fe2de8939add" (UID: "4d5d68c9-31a7-4f1a-b628-fe2de8939add"). InnerVolumeSpecName "kube-api-access-j88r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:28:05 crc kubenswrapper[4917]: I0318 07:28:05.012082 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j88r5\" (UniqueName: \"kubernetes.io/projected/4d5d68c9-31a7-4f1a-b628-fe2de8939add-kube-api-access-j88r5\") on node \"crc\" DevicePath \"\"" Mar 18 07:28:05 crc kubenswrapper[4917]: I0318 07:28:05.425479 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563648-mltwf" event={"ID":"4d5d68c9-31a7-4f1a-b628-fe2de8939add","Type":"ContainerDied","Data":"fd55732e400f7a470fb430927d650eadabde91b174d57d1c7cd0c979d38e814e"} Mar 18 07:28:05 crc kubenswrapper[4917]: I0318 07:28:05.425536 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd55732e400f7a470fb430927d650eadabde91b174d57d1c7cd0c979d38e814e" Mar 18 07:28:05 crc kubenswrapper[4917]: I0318 07:28:05.425626 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563648-mltwf" Mar 18 07:28:05 crc kubenswrapper[4917]: I0318 07:28:05.501627 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563642-2pljf"] Mar 18 07:28:05 crc kubenswrapper[4917]: I0318 07:28:05.511901 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563642-2pljf"] Mar 18 07:28:05 crc kubenswrapper[4917]: I0318 07:28:05.783635 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6077eb1c-ebc6-4b83-9809-4cf12ae02512" path="/var/lib/kubelet/pods/6077eb1c-ebc6-4b83-9809-4cf12ae02512/volumes" Mar 18 07:28:32 crc kubenswrapper[4917]: I0318 07:28:32.929750 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:28:32 crc kubenswrapper[4917]: I0318 07:28:32.930427 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:28:32 crc kubenswrapper[4917]: I0318 07:28:32.930493 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:28:32 crc kubenswrapper[4917]: I0318 07:28:32.931462 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:28:32 crc kubenswrapper[4917]: I0318 07:28:32.931562 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" gracePeriod=600 Mar 18 07:28:33 crc kubenswrapper[4917]: E0318 07:28:33.062678 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:28:33 crc kubenswrapper[4917]: I0318 07:28:33.706447 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" exitCode=0 Mar 18 07:28:33 crc kubenswrapper[4917]: I0318 07:28:33.706543 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5"} Mar 18 07:28:33 crc kubenswrapper[4917]: I0318 07:28:33.706647 4917 scope.go:117] "RemoveContainer" containerID="7d99a0a5de9ab844119f5f07dbd4c53c691d8276a2994404375328e295cae9e4" Mar 18 07:28:33 crc kubenswrapper[4917]: I0318 07:28:33.707337 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:28:33 crc kubenswrapper[4917]: E0318 07:28:33.707913 4917 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:28:48 crc kubenswrapper[4917]: I0318 07:28:48.773118 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:28:48 crc kubenswrapper[4917]: E0318 07:28:48.774131 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:29:03 crc kubenswrapper[4917]: I0318 07:29:03.756318 4917 scope.go:117] "RemoveContainer" containerID="b5aeb999e3ed9599dc07fd4800eb171bf0d62827ca6081b71459099e73c22038" Mar 18 07:29:03 crc kubenswrapper[4917]: I0318 07:29:03.773823 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:29:03 crc kubenswrapper[4917]: E0318 07:29:03.774200 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:29:03 crc kubenswrapper[4917]: I0318 
07:29:03.785940 4917 scope.go:117] "RemoveContainer" containerID="c6d70710a8d91c08b313986d14db660a66f60d43138375eee9fa01d8e5440fed" Mar 18 07:29:03 crc kubenswrapper[4917]: I0318 07:29:03.838273 4917 scope.go:117] "RemoveContainer" containerID="f653e9d5a5c47f8c2f422b956248b95e9f49ca4943ef42e5761220267cea9646" Mar 18 07:29:03 crc kubenswrapper[4917]: I0318 07:29:03.860278 4917 scope.go:117] "RemoveContainer" containerID="1812609e93b9e85bfb26d3a195f7b005d8fdd72eb836a98ce420229d37a68bd6" Mar 18 07:29:16 crc kubenswrapper[4917]: I0318 07:29:16.774069 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:29:16 crc kubenswrapper[4917]: E0318 07:29:16.775285 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:29:29 crc kubenswrapper[4917]: I0318 07:29:29.773960 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:29:29 crc kubenswrapper[4917]: E0318 07:29:29.775251 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:29:42 crc kubenswrapper[4917]: I0318 07:29:42.775840 4917 scope.go:117] "RemoveContainer" 
containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:29:42 crc kubenswrapper[4917]: E0318 07:29:42.777281 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:29:53 crc kubenswrapper[4917]: I0318 07:29:53.773717 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:29:53 crc kubenswrapper[4917]: E0318 07:29:53.774724 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.147950 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563650-cqnbt"] Mar 18 07:30:00 crc kubenswrapper[4917]: E0318 07:30:00.151102 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d5d68c9-31a7-4f1a-b628-fe2de8939add" containerName="oc" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.151319 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d5d68c9-31a7-4f1a-b628-fe2de8939add" containerName="oc" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.151878 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d5d68c9-31a7-4f1a-b628-fe2de8939add" containerName="oc" Mar 18 07:30:00 crc 
kubenswrapper[4917]: I0318 07:30:00.153034 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563650-cqnbt" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.156935 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.157164 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.157418 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.161354 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5"] Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.162363 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.164858 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.165109 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.180956 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563650-cqnbt"] Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.186004 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5"] Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.341943 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bwtc\" (UniqueName: \"kubernetes.io/projected/587c640f-3eb7-484d-acfa-7af4200547ae-kube-api-access-2bwtc\") pod \"auto-csr-approver-29563650-cqnbt\" (UID: \"587c640f-3eb7-484d-acfa-7af4200547ae\") " pod="openshift-infra/auto-csr-approver-29563650-cqnbt" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.342240 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-secret-volume\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.342395 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkvs\" (UniqueName: 
\"kubernetes.io/projected/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-kube-api-access-tnkvs\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.342544 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-config-volume\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.443963 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bwtc\" (UniqueName: \"kubernetes.io/projected/587c640f-3eb7-484d-acfa-7af4200547ae-kube-api-access-2bwtc\") pod \"auto-csr-approver-29563650-cqnbt\" (UID: \"587c640f-3eb7-484d-acfa-7af4200547ae\") " pod="openshift-infra/auto-csr-approver-29563650-cqnbt" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.444026 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-secret-volume\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.444096 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnkvs\" (UniqueName: \"kubernetes.io/projected/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-kube-api-access-tnkvs\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc 
kubenswrapper[4917]: I0318 07:30:00.444147 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-config-volume\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.445404 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-config-volume\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.452148 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-secret-volume\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.463294 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnkvs\" (UniqueName: \"kubernetes.io/projected/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-kube-api-access-tnkvs\") pod \"collect-profiles-29563650-dpfc5\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.469332 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bwtc\" (UniqueName: \"kubernetes.io/projected/587c640f-3eb7-484d-acfa-7af4200547ae-kube-api-access-2bwtc\") pod \"auto-csr-approver-29563650-cqnbt\" (UID: 
\"587c640f-3eb7-484d-acfa-7af4200547ae\") " pod="openshift-infra/auto-csr-approver-29563650-cqnbt" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.481132 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563650-cqnbt" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.493647 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.936353 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563650-cqnbt"] Mar 18 07:30:00 crc kubenswrapper[4917]: W0318 07:30:00.938766 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod587c640f_3eb7_484d_acfa_7af4200547ae.slice/crio-f1bfde6f3f04131fc0d1f935e9001dffdeaa9547e33c81f1226658c77dc63f51 WatchSource:0}: Error finding container f1bfde6f3f04131fc0d1f935e9001dffdeaa9547e33c81f1226658c77dc63f51: Status 404 returned error can't find the container with id f1bfde6f3f04131fc0d1f935e9001dffdeaa9547e33c81f1226658c77dc63f51 Mar 18 07:30:00 crc kubenswrapper[4917]: I0318 07:30:00.993489 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5"] Mar 18 07:30:01 crc kubenswrapper[4917]: I0318 07:30:01.546197 4917 generic.go:334] "Generic (PLEG): container finished" podID="18d5cfc7-f7f0-488a-86bc-f75136f39ec3" containerID="9d74cb4063b6f51d9055cc46838ad8e4c5077a5581a908195d13add58f6d0b14" exitCode=0 Mar 18 07:30:01 crc kubenswrapper[4917]: I0318 07:30:01.546258 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" 
event={"ID":"18d5cfc7-f7f0-488a-86bc-f75136f39ec3","Type":"ContainerDied","Data":"9d74cb4063b6f51d9055cc46838ad8e4c5077a5581a908195d13add58f6d0b14"} Mar 18 07:30:01 crc kubenswrapper[4917]: I0318 07:30:01.546856 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" event={"ID":"18d5cfc7-f7f0-488a-86bc-f75136f39ec3","Type":"ContainerStarted","Data":"34c7383d8f72fa920dfc87bbc4dd0b01de52b63b3407f1051bebc4fb6f3b6a16"} Mar 18 07:30:01 crc kubenswrapper[4917]: I0318 07:30:01.548396 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563650-cqnbt" event={"ID":"587c640f-3eb7-484d-acfa-7af4200547ae","Type":"ContainerStarted","Data":"f1bfde6f3f04131fc0d1f935e9001dffdeaa9547e33c81f1226658c77dc63f51"} Mar 18 07:30:02 crc kubenswrapper[4917]: I0318 07:30:02.878103 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:02 crc kubenswrapper[4917]: I0318 07:30:02.981607 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnkvs\" (UniqueName: \"kubernetes.io/projected/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-kube-api-access-tnkvs\") pod \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " Mar 18 07:30:02 crc kubenswrapper[4917]: I0318 07:30:02.981732 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-secret-volume\") pod \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " Mar 18 07:30:02 crc kubenswrapper[4917]: I0318 07:30:02.981778 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-config-volume\") pod \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\" (UID: \"18d5cfc7-f7f0-488a-86bc-f75136f39ec3\") " Mar 18 07:30:02 crc kubenswrapper[4917]: I0318 07:30:02.982658 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-config-volume" (OuterVolumeSpecName: "config-volume") pod "18d5cfc7-f7f0-488a-86bc-f75136f39ec3" (UID: "18d5cfc7-f7f0-488a-86bc-f75136f39ec3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:30:02 crc kubenswrapper[4917]: I0318 07:30:02.987873 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "18d5cfc7-f7f0-488a-86bc-f75136f39ec3" (UID: "18d5cfc7-f7f0-488a-86bc-f75136f39ec3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:30:02 crc kubenswrapper[4917]: I0318 07:30:02.988334 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-kube-api-access-tnkvs" (OuterVolumeSpecName: "kube-api-access-tnkvs") pod "18d5cfc7-f7f0-488a-86bc-f75136f39ec3" (UID: "18d5cfc7-f7f0-488a-86bc-f75136f39ec3"). InnerVolumeSpecName "kube-api-access-tnkvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.083197 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.083574 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.083611 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnkvs\" (UniqueName: \"kubernetes.io/projected/18d5cfc7-f7f0-488a-86bc-f75136f39ec3-kube-api-access-tnkvs\") on node \"crc\" DevicePath \"\"" Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.569995 4917 generic.go:334] "Generic (PLEG): container finished" podID="587c640f-3eb7-484d-acfa-7af4200547ae" containerID="fb531fc4d6a115562987e9b45470ba2b14f9338323eaf60d4384e45eeeab0d9e" exitCode=0 Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.570070 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563650-cqnbt" event={"ID":"587c640f-3eb7-484d-acfa-7af4200547ae","Type":"ContainerDied","Data":"fb531fc4d6a115562987e9b45470ba2b14f9338323eaf60d4384e45eeeab0d9e"} Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.572553 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" event={"ID":"18d5cfc7-f7f0-488a-86bc-f75136f39ec3","Type":"ContainerDied","Data":"34c7383d8f72fa920dfc87bbc4dd0b01de52b63b3407f1051bebc4fb6f3b6a16"} Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.572630 4917 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="34c7383d8f72fa920dfc87bbc4dd0b01de52b63b3407f1051bebc4fb6f3b6a16" Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.572669 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5" Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.987031 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c"] Mar 18 07:30:03 crc kubenswrapper[4917]: I0318 07:30:03.996920 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563605-jgd8c"] Mar 18 07:30:04 crc kubenswrapper[4917]: I0318 07:30:04.963015 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563650-cqnbt" Mar 18 07:30:05 crc kubenswrapper[4917]: I0318 07:30:05.117184 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bwtc\" (UniqueName: \"kubernetes.io/projected/587c640f-3eb7-484d-acfa-7af4200547ae-kube-api-access-2bwtc\") pod \"587c640f-3eb7-484d-acfa-7af4200547ae\" (UID: \"587c640f-3eb7-484d-acfa-7af4200547ae\") " Mar 18 07:30:05 crc kubenswrapper[4917]: I0318 07:30:05.126218 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587c640f-3eb7-484d-acfa-7af4200547ae-kube-api-access-2bwtc" (OuterVolumeSpecName: "kube-api-access-2bwtc") pod "587c640f-3eb7-484d-acfa-7af4200547ae" (UID: "587c640f-3eb7-484d-acfa-7af4200547ae"). InnerVolumeSpecName "kube-api-access-2bwtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:30:05 crc kubenswrapper[4917]: I0318 07:30:05.254308 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bwtc\" (UniqueName: \"kubernetes.io/projected/587c640f-3eb7-484d-acfa-7af4200547ae-kube-api-access-2bwtc\") on node \"crc\" DevicePath \"\"" Mar 18 07:30:05 crc kubenswrapper[4917]: I0318 07:30:05.595968 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563650-cqnbt" event={"ID":"587c640f-3eb7-484d-acfa-7af4200547ae","Type":"ContainerDied","Data":"f1bfde6f3f04131fc0d1f935e9001dffdeaa9547e33c81f1226658c77dc63f51"} Mar 18 07:30:05 crc kubenswrapper[4917]: I0318 07:30:05.596035 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563650-cqnbt" Mar 18 07:30:05 crc kubenswrapper[4917]: I0318 07:30:05.596037 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1bfde6f3f04131fc0d1f935e9001dffdeaa9547e33c81f1226658c77dc63f51" Mar 18 07:30:05 crc kubenswrapper[4917]: I0318 07:30:05.779785 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:30:05 crc kubenswrapper[4917]: E0318 07:30:05.780288 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:30:05 crc kubenswrapper[4917]: I0318 07:30:05.801683 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39975bd4-f79a-464b-a4cc-65220c2ee731" path="/var/lib/kubelet/pods/39975bd4-f79a-464b-a4cc-65220c2ee731/volumes" 
Mar 18 07:30:06 crc kubenswrapper[4917]: I0318 07:30:06.020019 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563644-q5rm7"] Mar 18 07:30:06 crc kubenswrapper[4917]: I0318 07:30:06.026504 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563644-q5rm7"] Mar 18 07:30:07 crc kubenswrapper[4917]: I0318 07:30:07.782625 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb" path="/var/lib/kubelet/pods/e7beff81-0cc3-4e2f-ab59-b8cd6b3237fb/volumes" Mar 18 07:30:16 crc kubenswrapper[4917]: I0318 07:30:16.772729 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:30:16 crc kubenswrapper[4917]: E0318 07:30:16.773994 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:30:31 crc kubenswrapper[4917]: I0318 07:30:31.774933 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:30:31 crc kubenswrapper[4917]: E0318 07:30:31.776310 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:30:44 crc kubenswrapper[4917]: I0318 
07:30:44.772620 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:30:44 crc kubenswrapper[4917]: E0318 07:30:44.773511 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:30:57 crc kubenswrapper[4917]: I0318 07:30:57.773410 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:30:57 crc kubenswrapper[4917]: E0318 07:30:57.775866 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:31:03 crc kubenswrapper[4917]: I0318 07:31:03.959093 4917 scope.go:117] "RemoveContainer" containerID="ec7f4e8b612cb727451561a1ec45ab77dbfd14e43853778c9dbfd5f6e0f8e66c" Mar 18 07:31:04 crc kubenswrapper[4917]: I0318 07:31:04.002528 4917 scope.go:117] "RemoveContainer" containerID="f17c04c11aeb92d4011ec40e91150515c24c40f23fd80a2a2d950a8d10bb11e1" Mar 18 07:31:10 crc kubenswrapper[4917]: I0318 07:31:10.773380 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:31:10 crc kubenswrapper[4917]: E0318 07:31:10.774340 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:31:22 crc kubenswrapper[4917]: I0318 07:31:22.772014 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:31:22 crc kubenswrapper[4917]: E0318 07:31:22.772723 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:31:37 crc kubenswrapper[4917]: I0318 07:31:37.773979 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:31:37 crc kubenswrapper[4917]: E0318 07:31:37.774994 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:31:49 crc kubenswrapper[4917]: I0318 07:31:49.773768 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:31:49 crc kubenswrapper[4917]: E0318 07:31:49.774843 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.195025 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563652-cvr68"] Mar 18 07:32:00 crc kubenswrapper[4917]: E0318 07:32:00.196673 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18d5cfc7-f7f0-488a-86bc-f75136f39ec3" containerName="collect-profiles" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.196706 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="18d5cfc7-f7f0-488a-86bc-f75136f39ec3" containerName="collect-profiles" Mar 18 07:32:00 crc kubenswrapper[4917]: E0318 07:32:00.196736 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="587c640f-3eb7-484d-acfa-7af4200547ae" containerName="oc" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.196753 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="587c640f-3eb7-484d-acfa-7af4200547ae" containerName="oc" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.197187 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="587c640f-3eb7-484d-acfa-7af4200547ae" containerName="oc" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.197227 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="18d5cfc7-f7f0-488a-86bc-f75136f39ec3" containerName="collect-profiles" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.198253 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563652-cvr68" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.201441 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.201542 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.203205 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.207913 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563652-cvr68"] Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.224137 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x5nn\" (UniqueName: \"kubernetes.io/projected/cde28581-741f-4fe7-a348-d73e844e317f-kube-api-access-6x5nn\") pod \"auto-csr-approver-29563652-cvr68\" (UID: \"cde28581-741f-4fe7-a348-d73e844e317f\") " pod="openshift-infra/auto-csr-approver-29563652-cvr68" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.325788 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x5nn\" (UniqueName: \"kubernetes.io/projected/cde28581-741f-4fe7-a348-d73e844e317f-kube-api-access-6x5nn\") pod \"auto-csr-approver-29563652-cvr68\" (UID: \"cde28581-741f-4fe7-a348-d73e844e317f\") " pod="openshift-infra/auto-csr-approver-29563652-cvr68" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.374442 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x5nn\" (UniqueName: \"kubernetes.io/projected/cde28581-741f-4fe7-a348-d73e844e317f-kube-api-access-6x5nn\") pod \"auto-csr-approver-29563652-cvr68\" (UID: \"cde28581-741f-4fe7-a348-d73e844e317f\") " 
pod="openshift-infra/auto-csr-approver-29563652-cvr68" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.523766 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563652-cvr68" Mar 18 07:32:00 crc kubenswrapper[4917]: I0318 07:32:00.772626 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:32:00 crc kubenswrapper[4917]: E0318 07:32:00.773363 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:32:01 crc kubenswrapper[4917]: I0318 07:32:01.022533 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563652-cvr68"] Mar 18 07:32:01 crc kubenswrapper[4917]: I0318 07:32:01.655769 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563652-cvr68" event={"ID":"cde28581-741f-4fe7-a348-d73e844e317f","Type":"ContainerStarted","Data":"c65a3fe1dbb3e77af2019d9d4878cdf21b39f98d083e6c1e0d68c14632ef0be6"} Mar 18 07:32:02 crc kubenswrapper[4917]: I0318 07:32:02.667321 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563652-cvr68" event={"ID":"cde28581-741f-4fe7-a348-d73e844e317f","Type":"ContainerStarted","Data":"6dfabd9ac15311c41a307e6003b821da2f1e64a3da73911d149eb90028ca3370"} Mar 18 07:32:02 crc kubenswrapper[4917]: I0318 07:32:02.697806 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563652-cvr68" podStartSLOduration=1.4822770809999999 
podStartE2EDuration="2.697778195s" podCreationTimestamp="2026-03-18 07:32:00 +0000 UTC" firstStartedPulling="2026-03-18 07:32:01.039266916 +0000 UTC m=+2705.980421670" lastFinishedPulling="2026-03-18 07:32:02.25476803 +0000 UTC m=+2707.195922784" observedRunningTime="2026-03-18 07:32:02.692461595 +0000 UTC m=+2707.633616389" watchObservedRunningTime="2026-03-18 07:32:02.697778195 +0000 UTC m=+2707.638932949" Mar 18 07:32:03 crc kubenswrapper[4917]: I0318 07:32:03.687347 4917 generic.go:334] "Generic (PLEG): container finished" podID="cde28581-741f-4fe7-a348-d73e844e317f" containerID="6dfabd9ac15311c41a307e6003b821da2f1e64a3da73911d149eb90028ca3370" exitCode=0 Mar 18 07:32:03 crc kubenswrapper[4917]: I0318 07:32:03.687418 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563652-cvr68" event={"ID":"cde28581-741f-4fe7-a348-d73e844e317f","Type":"ContainerDied","Data":"6dfabd9ac15311c41a307e6003b821da2f1e64a3da73911d149eb90028ca3370"} Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.054039 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563652-cvr68" Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.103530 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x5nn\" (UniqueName: \"kubernetes.io/projected/cde28581-741f-4fe7-a348-d73e844e317f-kube-api-access-6x5nn\") pod \"cde28581-741f-4fe7-a348-d73e844e317f\" (UID: \"cde28581-741f-4fe7-a348-d73e844e317f\") " Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.115944 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde28581-741f-4fe7-a348-d73e844e317f-kube-api-access-6x5nn" (OuterVolumeSpecName: "kube-api-access-6x5nn") pod "cde28581-741f-4fe7-a348-d73e844e317f" (UID: "cde28581-741f-4fe7-a348-d73e844e317f"). InnerVolumeSpecName "kube-api-access-6x5nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.205830 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x5nn\" (UniqueName: \"kubernetes.io/projected/cde28581-741f-4fe7-a348-d73e844e317f-kube-api-access-6x5nn\") on node \"crc\" DevicePath \"\"" Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.703218 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563652-cvr68" event={"ID":"cde28581-741f-4fe7-a348-d73e844e317f","Type":"ContainerDied","Data":"c65a3fe1dbb3e77af2019d9d4878cdf21b39f98d083e6c1e0d68c14632ef0be6"} Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.703262 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c65a3fe1dbb3e77af2019d9d4878cdf21b39f98d083e6c1e0d68c14632ef0be6" Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.703318 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563652-cvr68" Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.762207 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563646-v25q6"] Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.769151 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563646-v25q6"] Mar 18 07:32:05 crc kubenswrapper[4917]: E0318 07:32:05.771807 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcde28581_741f_4fe7_a348_d73e844e317f.slice/crio-c65a3fe1dbb3e77af2019d9d4878cdf21b39f98d083e6c1e0d68c14632ef0be6\": RecentStats: unable to find data in memory cache]" Mar 18 07:32:05 crc kubenswrapper[4917]: I0318 07:32:05.786363 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="ad58cffc-ac00-49cc-a81d-8e6c0c182d83" path="/var/lib/kubelet/pods/ad58cffc-ac00-49cc-a81d-8e6c0c182d83/volumes" Mar 18 07:32:11 crc kubenswrapper[4917]: I0318 07:32:11.773045 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:32:11 crc kubenswrapper[4917]: E0318 07:32:11.774063 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:32:23 crc kubenswrapper[4917]: I0318 07:32:23.773394 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:32:23 crc kubenswrapper[4917]: E0318 07:32:23.774658 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:32:38 crc kubenswrapper[4917]: I0318 07:32:38.772559 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:32:38 crc kubenswrapper[4917]: E0318 07:32:38.773703 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:32:50 crc kubenswrapper[4917]: I0318 07:32:50.772310 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:32:50 crc kubenswrapper[4917]: E0318 07:32:50.773302 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:32:58 crc kubenswrapper[4917]: I0318 07:32:58.851874 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5npnh"] Mar 18 07:32:58 crc kubenswrapper[4917]: E0318 07:32:58.853424 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde28581-741f-4fe7-a348-d73e844e317f" containerName="oc" Mar 18 07:32:58 crc kubenswrapper[4917]: I0318 07:32:58.853451 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde28581-741f-4fe7-a348-d73e844e317f" containerName="oc" Mar 18 07:32:58 crc kubenswrapper[4917]: I0318 07:32:58.853729 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde28581-741f-4fe7-a348-d73e844e317f" containerName="oc" Mar 18 07:32:58 crc kubenswrapper[4917]: I0318 07:32:58.855379 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:58 crc kubenswrapper[4917]: I0318 07:32:58.879545 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5npnh"] Mar 18 07:32:58 crc kubenswrapper[4917]: I0318 07:32:58.975194 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-catalog-content\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:58 crc kubenswrapper[4917]: I0318 07:32:58.975245 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-utilities\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:58 crc kubenswrapper[4917]: I0318 07:32:58.975313 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdflb\" (UniqueName: \"kubernetes.io/projected/f010dbd6-9e86-4c3a-b233-550e33272dad-kube-api-access-hdflb\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:59 crc kubenswrapper[4917]: I0318 07:32:59.076545 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdflb\" (UniqueName: \"kubernetes.io/projected/f010dbd6-9e86-4c3a-b233-550e33272dad-kube-api-access-hdflb\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:59 crc kubenswrapper[4917]: I0318 07:32:59.076828 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-catalog-content\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:59 crc kubenswrapper[4917]: I0318 07:32:59.076873 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-utilities\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:59 crc kubenswrapper[4917]: I0318 07:32:59.077630 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-utilities\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:59 crc kubenswrapper[4917]: I0318 07:32:59.079010 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-catalog-content\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:59 crc kubenswrapper[4917]: I0318 07:32:59.104898 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdflb\" (UniqueName: \"kubernetes.io/projected/f010dbd6-9e86-4c3a-b233-550e33272dad-kube-api-access-hdflb\") pod \"redhat-marketplace-5npnh\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:59 crc kubenswrapper[4917]: I0318 07:32:59.191960 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:32:59 crc kubenswrapper[4917]: I0318 07:32:59.646879 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5npnh"] Mar 18 07:33:00 crc kubenswrapper[4917]: I0318 07:33:00.204053 4917 generic.go:334] "Generic (PLEG): container finished" podID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerID="8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1" exitCode=0 Mar 18 07:33:00 crc kubenswrapper[4917]: I0318 07:33:00.204130 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5npnh" event={"ID":"f010dbd6-9e86-4c3a-b233-550e33272dad","Type":"ContainerDied","Data":"8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1"} Mar 18 07:33:00 crc kubenswrapper[4917]: I0318 07:33:00.204441 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5npnh" event={"ID":"f010dbd6-9e86-4c3a-b233-550e33272dad","Type":"ContainerStarted","Data":"21d1f85dad0490866b69617ec88f64e5caa4534371aad8b1a8b0893ead79848f"} Mar 18 07:33:01 crc kubenswrapper[4917]: I0318 07:33:01.215023 4917 generic.go:334] "Generic (PLEG): container finished" podID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerID="93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290" exitCode=0 Mar 18 07:33:01 crc kubenswrapper[4917]: I0318 07:33:01.215156 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5npnh" event={"ID":"f010dbd6-9e86-4c3a-b233-550e33272dad","Type":"ContainerDied","Data":"93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290"} Mar 18 07:33:01 crc kubenswrapper[4917]: I0318 07:33:01.217859 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:33:02 crc kubenswrapper[4917]: I0318 07:33:02.229790 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-5npnh" event={"ID":"f010dbd6-9e86-4c3a-b233-550e33272dad","Type":"ContainerStarted","Data":"8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa"} Mar 18 07:33:02 crc kubenswrapper[4917]: I0318 07:33:02.289242 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5npnh" podStartSLOduration=2.789471998 podStartE2EDuration="4.289211757s" podCreationTimestamp="2026-03-18 07:32:58 +0000 UTC" firstStartedPulling="2026-03-18 07:33:00.205300473 +0000 UTC m=+2765.146455187" lastFinishedPulling="2026-03-18 07:33:01.705040222 +0000 UTC m=+2766.646194946" observedRunningTime="2026-03-18 07:33:02.28152877 +0000 UTC m=+2767.222683544" watchObservedRunningTime="2026-03-18 07:33:02.289211757 +0000 UTC m=+2767.230366511" Mar 18 07:33:04 crc kubenswrapper[4917]: I0318 07:33:04.152868 4917 scope.go:117] "RemoveContainer" containerID="68b811cb22f471a85a7cecd6e2f6ec10b6f09e25471eec5b8f528e88c309521b" Mar 18 07:33:04 crc kubenswrapper[4917]: I0318 07:33:04.773436 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:33:04 crc kubenswrapper[4917]: E0318 07:33:04.773989 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:33:09 crc kubenswrapper[4917]: I0318 07:33:09.193168 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:33:09 crc kubenswrapper[4917]: I0318 07:33:09.193633 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:33:09 crc kubenswrapper[4917]: I0318 07:33:09.266123 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:33:09 crc kubenswrapper[4917]: I0318 07:33:09.379501 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:33:09 crc kubenswrapper[4917]: I0318 07:33:09.506304 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5npnh"] Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.316518 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5npnh" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerName="registry-server" containerID="cri-o://8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa" gracePeriod=2 Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.779087 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.872193 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-catalog-content\") pod \"f010dbd6-9e86-4c3a-b233-550e33272dad\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.872530 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdflb\" (UniqueName: \"kubernetes.io/projected/f010dbd6-9e86-4c3a-b233-550e33272dad-kube-api-access-hdflb\") pod \"f010dbd6-9e86-4c3a-b233-550e33272dad\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.872640 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-utilities\") pod \"f010dbd6-9e86-4c3a-b233-550e33272dad\" (UID: \"f010dbd6-9e86-4c3a-b233-550e33272dad\") " Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.874521 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-utilities" (OuterVolumeSpecName: "utilities") pod "f010dbd6-9e86-4c3a-b233-550e33272dad" (UID: "f010dbd6-9e86-4c3a-b233-550e33272dad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.881893 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f010dbd6-9e86-4c3a-b233-550e33272dad-kube-api-access-hdflb" (OuterVolumeSpecName: "kube-api-access-hdflb") pod "f010dbd6-9e86-4c3a-b233-550e33272dad" (UID: "f010dbd6-9e86-4c3a-b233-550e33272dad"). InnerVolumeSpecName "kube-api-access-hdflb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.901933 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f010dbd6-9e86-4c3a-b233-550e33272dad" (UID: "f010dbd6-9e86-4c3a-b233-550e33272dad"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.974245 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.974292 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f010dbd6-9e86-4c3a-b233-550e33272dad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:33:11 crc kubenswrapper[4917]: I0318 07:33:11.974312 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdflb\" (UniqueName: \"kubernetes.io/projected/f010dbd6-9e86-4c3a-b233-550e33272dad-kube-api-access-hdflb\") on node \"crc\" DevicePath \"\"" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.329801 4917 generic.go:334] "Generic (PLEG): container finished" podID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerID="8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa" exitCode=0 Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.329873 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5npnh" event={"ID":"f010dbd6-9e86-4c3a-b233-550e33272dad","Type":"ContainerDied","Data":"8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa"} Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.329922 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-5npnh" event={"ID":"f010dbd6-9e86-4c3a-b233-550e33272dad","Type":"ContainerDied","Data":"21d1f85dad0490866b69617ec88f64e5caa4534371aad8b1a8b0893ead79848f"} Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.329964 4917 scope.go:117] "RemoveContainer" containerID="8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.330182 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5npnh" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.354921 4917 scope.go:117] "RemoveContainer" containerID="93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.394435 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5npnh"] Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.395917 4917 scope.go:117] "RemoveContainer" containerID="8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.405663 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5npnh"] Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.429625 4917 scope.go:117] "RemoveContainer" containerID="8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa" Mar 18 07:33:12 crc kubenswrapper[4917]: E0318 07:33:12.430127 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa\": container with ID starting with 8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa not found: ID does not exist" containerID="8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.430353 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa"} err="failed to get container status \"8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa\": rpc error: code = NotFound desc = could not find container \"8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa\": container with ID starting with 8be11936cedb489e46a7bb473dfef65b10c7f1a46c618bd330b206ecf61b7aaa not found: ID does not exist" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.430386 4917 scope.go:117] "RemoveContainer" containerID="93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290" Mar 18 07:33:12 crc kubenswrapper[4917]: E0318 07:33:12.431160 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290\": container with ID starting with 93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290 not found: ID does not exist" containerID="93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.431347 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290"} err="failed to get container status \"93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290\": rpc error: code = NotFound desc = could not find container \"93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290\": container with ID starting with 93f47fa46bde68b1893e3891e986e078c77cc666a82c7bb9b7393ab857a7f290 not found: ID does not exist" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.431525 4917 scope.go:117] "RemoveContainer" containerID="8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1" Mar 18 07:33:12 crc kubenswrapper[4917]: E0318 
07:33:12.432266 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1\": container with ID starting with 8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1 not found: ID does not exist" containerID="8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1" Mar 18 07:33:12 crc kubenswrapper[4917]: I0318 07:33:12.432326 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1"} err="failed to get container status \"8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1\": rpc error: code = NotFound desc = could not find container \"8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1\": container with ID starting with 8b12b72230a466e350651034f468e8b08d5bc730728a0b964caaaec870c1cdd1 not found: ID does not exist" Mar 18 07:33:13 crc kubenswrapper[4917]: I0318 07:33:13.786189 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" path="/var/lib/kubelet/pods/f010dbd6-9e86-4c3a-b233-550e33272dad/volumes" Mar 18 07:33:15 crc kubenswrapper[4917]: I0318 07:33:15.779723 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:33:15 crc kubenswrapper[4917]: E0318 07:33:15.780182 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:33:29 crc kubenswrapper[4917]: I0318 07:33:29.773342 
4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:33:29 crc kubenswrapper[4917]: E0318 07:33:29.774166 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:33:40 crc kubenswrapper[4917]: I0318 07:33:40.773275 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:33:41 crc kubenswrapper[4917]: I0318 07:33:41.601469 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"4bbf9144e4b69a26dc4113283f082a54cd447636d065f4b3cdcdc308d081feb2"} Mar 18 07:33:43 crc kubenswrapper[4917]: I0318 07:33:43.967173 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzpvz"] Mar 18 07:33:43 crc kubenswrapper[4917]: E0318 07:33:43.967872 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerName="extract-utilities" Mar 18 07:33:43 crc kubenswrapper[4917]: I0318 07:33:43.967893 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerName="extract-utilities" Mar 18 07:33:43 crc kubenswrapper[4917]: E0318 07:33:43.967923 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerName="registry-server" Mar 18 07:33:43 crc kubenswrapper[4917]: I0318 07:33:43.967936 4917 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerName="registry-server" Mar 18 07:33:43 crc kubenswrapper[4917]: E0318 07:33:43.967975 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerName="extract-content" Mar 18 07:33:43 crc kubenswrapper[4917]: I0318 07:33:43.967988 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerName="extract-content" Mar 18 07:33:43 crc kubenswrapper[4917]: I0318 07:33:43.968238 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f010dbd6-9e86-4c3a-b233-550e33272dad" containerName="registry-server" Mar 18 07:33:43 crc kubenswrapper[4917]: I0318 07:33:43.969851 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.004357 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzpvz"] Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.078497 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-catalog-content\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.078559 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-utilities\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.078603 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rls9s\" (UniqueName: \"kubernetes.io/projected/650fcf06-9932-47b7-8d7e-c989d1297586-kube-api-access-rls9s\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.180116 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-catalog-content\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.180169 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-utilities\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.180191 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rls9s\" (UniqueName: \"kubernetes.io/projected/650fcf06-9932-47b7-8d7e-c989d1297586-kube-api-access-rls9s\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.180671 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-catalog-content\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.180703 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-utilities\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.207327 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rls9s\" (UniqueName: \"kubernetes.io/projected/650fcf06-9932-47b7-8d7e-c989d1297586-kube-api-access-rls9s\") pod \"community-operators-pzpvz\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.298860 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:44 crc kubenswrapper[4917]: I0318 07:33:44.720445 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzpvz"] Mar 18 07:33:44 crc kubenswrapper[4917]: W0318 07:33:44.726927 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod650fcf06_9932_47b7_8d7e_c989d1297586.slice/crio-dc18e45a75f1757b3ff23112becb5e7ae79eb9c5cc9882eeb47b640bd3f68581 WatchSource:0}: Error finding container dc18e45a75f1757b3ff23112becb5e7ae79eb9c5cc9882eeb47b640bd3f68581: Status 404 returned error can't find the container with id dc18e45a75f1757b3ff23112becb5e7ae79eb9c5cc9882eeb47b640bd3f68581 Mar 18 07:33:45 crc kubenswrapper[4917]: I0318 07:33:45.634828 4917 generic.go:334] "Generic (PLEG): container finished" podID="650fcf06-9932-47b7-8d7e-c989d1297586" containerID="a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6" exitCode=0 Mar 18 07:33:45 crc kubenswrapper[4917]: I0318 07:33:45.634912 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-pzpvz" event={"ID":"650fcf06-9932-47b7-8d7e-c989d1297586","Type":"ContainerDied","Data":"a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6"} Mar 18 07:33:45 crc kubenswrapper[4917]: I0318 07:33:45.635176 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpvz" event={"ID":"650fcf06-9932-47b7-8d7e-c989d1297586","Type":"ContainerStarted","Data":"dc18e45a75f1757b3ff23112becb5e7ae79eb9c5cc9882eeb47b640bd3f68581"} Mar 18 07:33:47 crc kubenswrapper[4917]: I0318 07:33:47.655624 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpvz" event={"ID":"650fcf06-9932-47b7-8d7e-c989d1297586","Type":"ContainerStarted","Data":"00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e"} Mar 18 07:33:48 crc kubenswrapper[4917]: I0318 07:33:48.667173 4917 generic.go:334] "Generic (PLEG): container finished" podID="650fcf06-9932-47b7-8d7e-c989d1297586" containerID="00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e" exitCode=0 Mar 18 07:33:48 crc kubenswrapper[4917]: I0318 07:33:48.667267 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpvz" event={"ID":"650fcf06-9932-47b7-8d7e-c989d1297586","Type":"ContainerDied","Data":"00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e"} Mar 18 07:33:49 crc kubenswrapper[4917]: I0318 07:33:49.677277 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpvz" event={"ID":"650fcf06-9932-47b7-8d7e-c989d1297586","Type":"ContainerStarted","Data":"c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228"} Mar 18 07:33:49 crc kubenswrapper[4917]: I0318 07:33:49.705430 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzpvz" podStartSLOduration=3.06456006 
podStartE2EDuration="6.705411531s" podCreationTimestamp="2026-03-18 07:33:43 +0000 UTC" firstStartedPulling="2026-03-18 07:33:45.636667562 +0000 UTC m=+2810.577822286" lastFinishedPulling="2026-03-18 07:33:49.277519013 +0000 UTC m=+2814.218673757" observedRunningTime="2026-03-18 07:33:49.695987968 +0000 UTC m=+2814.637142712" watchObservedRunningTime="2026-03-18 07:33:49.705411531 +0000 UTC m=+2814.646566255" Mar 18 07:33:54 crc kubenswrapper[4917]: I0318 07:33:54.299675 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:54 crc kubenswrapper[4917]: I0318 07:33:54.301906 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:54 crc kubenswrapper[4917]: I0318 07:33:54.364653 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:54 crc kubenswrapper[4917]: I0318 07:33:54.809819 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:54 crc kubenswrapper[4917]: I0318 07:33:54.871709 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzpvz"] Mar 18 07:33:56 crc kubenswrapper[4917]: I0318 07:33:56.756344 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzpvz" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" containerName="registry-server" containerID="cri-o://c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228" gracePeriod=2 Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.214543 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.316758 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-catalog-content\") pod \"650fcf06-9932-47b7-8d7e-c989d1297586\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.316894 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-utilities\") pod \"650fcf06-9932-47b7-8d7e-c989d1297586\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.317058 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rls9s\" (UniqueName: \"kubernetes.io/projected/650fcf06-9932-47b7-8d7e-c989d1297586-kube-api-access-rls9s\") pod \"650fcf06-9932-47b7-8d7e-c989d1297586\" (UID: \"650fcf06-9932-47b7-8d7e-c989d1297586\") " Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.318691 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-utilities" (OuterVolumeSpecName: "utilities") pod "650fcf06-9932-47b7-8d7e-c989d1297586" (UID: "650fcf06-9932-47b7-8d7e-c989d1297586"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.323423 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/650fcf06-9932-47b7-8d7e-c989d1297586-kube-api-access-rls9s" (OuterVolumeSpecName: "kube-api-access-rls9s") pod "650fcf06-9932-47b7-8d7e-c989d1297586" (UID: "650fcf06-9932-47b7-8d7e-c989d1297586"). InnerVolumeSpecName "kube-api-access-rls9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.419120 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.419173 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rls9s\" (UniqueName: \"kubernetes.io/projected/650fcf06-9932-47b7-8d7e-c989d1297586-kube-api-access-rls9s\") on node \"crc\" DevicePath \"\"" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.432484 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "650fcf06-9932-47b7-8d7e-c989d1297586" (UID: "650fcf06-9932-47b7-8d7e-c989d1297586"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.520406 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/650fcf06-9932-47b7-8d7e-c989d1297586-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.773250 4917 generic.go:334] "Generic (PLEG): container finished" podID="650fcf06-9932-47b7-8d7e-c989d1297586" containerID="c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228" exitCode=0 Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.773418 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzpvz" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.793277 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpvz" event={"ID":"650fcf06-9932-47b7-8d7e-c989d1297586","Type":"ContainerDied","Data":"c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228"} Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.793356 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpvz" event={"ID":"650fcf06-9932-47b7-8d7e-c989d1297586","Type":"ContainerDied","Data":"dc18e45a75f1757b3ff23112becb5e7ae79eb9c5cc9882eeb47b640bd3f68581"} Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.793400 4917 scope.go:117] "RemoveContainer" containerID="c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.827224 4917 scope.go:117] "RemoveContainer" containerID="00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.843891 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzpvz"] Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.857096 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzpvz"] Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.864134 4917 scope.go:117] "RemoveContainer" containerID="a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.886248 4917 scope.go:117] "RemoveContainer" containerID="c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228" Mar 18 07:33:57 crc kubenswrapper[4917]: E0318 07:33:57.887386 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228\": container with ID starting with c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228 not found: ID does not exist" containerID="c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.887448 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228"} err="failed to get container status \"c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228\": rpc error: code = NotFound desc = could not find container \"c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228\": container with ID starting with c0df1b553ecd874c07f1730dd8eed78b00cf71b5bb1203fc09b1d22a11808228 not found: ID does not exist" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.887484 4917 scope.go:117] "RemoveContainer" containerID="00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e" Mar 18 07:33:57 crc kubenswrapper[4917]: E0318 07:33:57.887890 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e\": container with ID starting with 00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e not found: ID does not exist" containerID="00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.887950 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e"} err="failed to get container status \"00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e\": rpc error: code = NotFound desc = could not find container \"00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e\": container with ID 
starting with 00f6287c42388c94bb9a1942ff1c1dd1bea4b9446e50f052f39b2f718cb7e92e not found: ID does not exist" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.888037 4917 scope.go:117] "RemoveContainer" containerID="a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6" Mar 18 07:33:57 crc kubenswrapper[4917]: E0318 07:33:57.888511 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6\": container with ID starting with a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6 not found: ID does not exist" containerID="a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6" Mar 18 07:33:57 crc kubenswrapper[4917]: I0318 07:33:57.888552 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6"} err="failed to get container status \"a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6\": rpc error: code = NotFound desc = could not find container \"a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6\": container with ID starting with a1731952c89980e9e8737ed6015a7c2a3ff852dbb66e2cfcf32b2a34a1ac5fd6 not found: ID does not exist" Mar 18 07:33:57 crc kubenswrapper[4917]: E0318 07:33:57.983289 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod650fcf06_9932_47b7_8d7e_c989d1297586.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod650fcf06_9932_47b7_8d7e_c989d1297586.slice/crio-dc18e45a75f1757b3ff23112becb5e7ae79eb9c5cc9882eeb47b640bd3f68581\": RecentStats: unable to find data in memory cache]" Mar 18 07:33:59 crc kubenswrapper[4917]: I0318 07:33:59.783538 4917 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" path="/var/lib/kubelet/pods/650fcf06-9932-47b7-8d7e-c989d1297586/volumes" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.164672 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563654-77hph"] Mar 18 07:34:00 crc kubenswrapper[4917]: E0318 07:34:00.165020 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" containerName="extract-utilities" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.165034 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" containerName="extract-utilities" Mar 18 07:34:00 crc kubenswrapper[4917]: E0318 07:34:00.165059 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" containerName="registry-server" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.165065 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" containerName="registry-server" Mar 18 07:34:00 crc kubenswrapper[4917]: E0318 07:34:00.165073 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" containerName="extract-content" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.165079 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" containerName="extract-content" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.165240 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="650fcf06-9932-47b7-8d7e-c989d1297586" containerName="registry-server" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.165789 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563654-77hph" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.168698 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.169104 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.169632 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.171529 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563654-77hph"] Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.267214 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hglt4\" (UniqueName: \"kubernetes.io/projected/f9beb779-3322-4afd-8130-3f9d3b8bcbc4-kube-api-access-hglt4\") pod \"auto-csr-approver-29563654-77hph\" (UID: \"f9beb779-3322-4afd-8130-3f9d3b8bcbc4\") " pod="openshift-infra/auto-csr-approver-29563654-77hph" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.369106 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hglt4\" (UniqueName: \"kubernetes.io/projected/f9beb779-3322-4afd-8130-3f9d3b8bcbc4-kube-api-access-hglt4\") pod \"auto-csr-approver-29563654-77hph\" (UID: \"f9beb779-3322-4afd-8130-3f9d3b8bcbc4\") " pod="openshift-infra/auto-csr-approver-29563654-77hph" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.401696 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hglt4\" (UniqueName: \"kubernetes.io/projected/f9beb779-3322-4afd-8130-3f9d3b8bcbc4-kube-api-access-hglt4\") pod \"auto-csr-approver-29563654-77hph\" (UID: \"f9beb779-3322-4afd-8130-3f9d3b8bcbc4\") " 
pod="openshift-infra/auto-csr-approver-29563654-77hph" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.497322 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563654-77hph" Mar 18 07:34:00 crc kubenswrapper[4917]: I0318 07:34:00.782299 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563654-77hph"] Mar 18 07:34:00 crc kubenswrapper[4917]: W0318 07:34:00.791649 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9beb779_3322_4afd_8130_3f9d3b8bcbc4.slice/crio-17673e5a6db4437ae4b434ceed190cad6ed42eef7eda867ebd0c91e0a9ba0e7d WatchSource:0}: Error finding container 17673e5a6db4437ae4b434ceed190cad6ed42eef7eda867ebd0c91e0a9ba0e7d: Status 404 returned error can't find the container with id 17673e5a6db4437ae4b434ceed190cad6ed42eef7eda867ebd0c91e0a9ba0e7d Mar 18 07:34:01 crc kubenswrapper[4917]: I0318 07:34:01.809556 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563654-77hph" event={"ID":"f9beb779-3322-4afd-8130-3f9d3b8bcbc4","Type":"ContainerStarted","Data":"17673e5a6db4437ae4b434ceed190cad6ed42eef7eda867ebd0c91e0a9ba0e7d"} Mar 18 07:34:02 crc kubenswrapper[4917]: I0318 07:34:02.821274 4917 generic.go:334] "Generic (PLEG): container finished" podID="f9beb779-3322-4afd-8130-3f9d3b8bcbc4" containerID="c43a239aa27f2f9114e07a7cbc0ca1fd8c59731ae0c252e453c018eb0ff75e31" exitCode=0 Mar 18 07:34:02 crc kubenswrapper[4917]: I0318 07:34:02.821364 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563654-77hph" event={"ID":"f9beb779-3322-4afd-8130-3f9d3b8bcbc4","Type":"ContainerDied","Data":"c43a239aa27f2f9114e07a7cbc0ca1fd8c59731ae0c252e453c018eb0ff75e31"} Mar 18 07:34:04 crc kubenswrapper[4917]: I0318 07:34:04.245070 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563654-77hph" Mar 18 07:34:04 crc kubenswrapper[4917]: I0318 07:34:04.332674 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hglt4\" (UniqueName: \"kubernetes.io/projected/f9beb779-3322-4afd-8130-3f9d3b8bcbc4-kube-api-access-hglt4\") pod \"f9beb779-3322-4afd-8130-3f9d3b8bcbc4\" (UID: \"f9beb779-3322-4afd-8130-3f9d3b8bcbc4\") " Mar 18 07:34:04 crc kubenswrapper[4917]: I0318 07:34:04.342375 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9beb779-3322-4afd-8130-3f9d3b8bcbc4-kube-api-access-hglt4" (OuterVolumeSpecName: "kube-api-access-hglt4") pod "f9beb779-3322-4afd-8130-3f9d3b8bcbc4" (UID: "f9beb779-3322-4afd-8130-3f9d3b8bcbc4"). InnerVolumeSpecName "kube-api-access-hglt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:34:04 crc kubenswrapper[4917]: I0318 07:34:04.434226 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hglt4\" (UniqueName: \"kubernetes.io/projected/f9beb779-3322-4afd-8130-3f9d3b8bcbc4-kube-api-access-hglt4\") on node \"crc\" DevicePath \"\"" Mar 18 07:34:04 crc kubenswrapper[4917]: I0318 07:34:04.878702 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563654-77hph" event={"ID":"f9beb779-3322-4afd-8130-3f9d3b8bcbc4","Type":"ContainerDied","Data":"17673e5a6db4437ae4b434ceed190cad6ed42eef7eda867ebd0c91e0a9ba0e7d"} Mar 18 07:34:04 crc kubenswrapper[4917]: I0318 07:34:04.878754 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17673e5a6db4437ae4b434ceed190cad6ed42eef7eda867ebd0c91e0a9ba0e7d" Mar 18 07:34:04 crc kubenswrapper[4917]: I0318 07:34:04.878824 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563654-77hph" Mar 18 07:34:05 crc kubenswrapper[4917]: I0318 07:34:05.330997 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563648-mltwf"] Mar 18 07:34:05 crc kubenswrapper[4917]: I0318 07:34:05.341674 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563648-mltwf"] Mar 18 07:34:05 crc kubenswrapper[4917]: I0318 07:34:05.798164 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d5d68c9-31a7-4f1a-b628-fe2de8939add" path="/var/lib/kubelet/pods/4d5d68c9-31a7-4f1a-b628-fe2de8939add/volumes" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.701100 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t6zfk"] Mar 18 07:34:47 crc kubenswrapper[4917]: E0318 07:34:47.702066 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9beb779-3322-4afd-8130-3f9d3b8bcbc4" containerName="oc" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.702097 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9beb779-3322-4afd-8130-3f9d3b8bcbc4" containerName="oc" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.702245 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9beb779-3322-4afd-8130-3f9d3b8bcbc4" containerName="oc" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.703215 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.729523 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6zfk"] Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.857807 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-utilities\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.858131 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzvpg\" (UniqueName: \"kubernetes.io/projected/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-kube-api-access-rzvpg\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.858340 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-catalog-content\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.959507 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzvpg\" (UniqueName: \"kubernetes.io/projected/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-kube-api-access-rzvpg\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.959570 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-catalog-content\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.959657 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-utilities\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.960011 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-catalog-content\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.960061 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-utilities\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:47 crc kubenswrapper[4917]: I0318 07:34:47.986776 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzvpg\" (UniqueName: \"kubernetes.io/projected/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-kube-api-access-rzvpg\") pod \"redhat-operators-t6zfk\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:48 crc kubenswrapper[4917]: I0318 07:34:48.043684 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:48 crc kubenswrapper[4917]: I0318 07:34:48.498971 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6zfk"] Mar 18 07:34:48 crc kubenswrapper[4917]: W0318 07:34:48.511842 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ef3b7e7_64c3_4d3a_a500_5e98794229e3.slice/crio-26fc5cd09e4246431e9227f27f2cccd62529e547205c96f649b3a7354ead78ab WatchSource:0}: Error finding container 26fc5cd09e4246431e9227f27f2cccd62529e547205c96f649b3a7354ead78ab: Status 404 returned error can't find the container with id 26fc5cd09e4246431e9227f27f2cccd62529e547205c96f649b3a7354ead78ab Mar 18 07:34:49 crc kubenswrapper[4917]: I0318 07:34:49.296106 4917 generic.go:334] "Generic (PLEG): container finished" podID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerID="a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3" exitCode=0 Mar 18 07:34:49 crc kubenswrapper[4917]: I0318 07:34:49.296166 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6zfk" event={"ID":"9ef3b7e7-64c3-4d3a-a500-5e98794229e3","Type":"ContainerDied","Data":"a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3"} Mar 18 07:34:49 crc kubenswrapper[4917]: I0318 07:34:49.296195 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6zfk" event={"ID":"9ef3b7e7-64c3-4d3a-a500-5e98794229e3","Type":"ContainerStarted","Data":"26fc5cd09e4246431e9227f27f2cccd62529e547205c96f649b3a7354ead78ab"} Mar 18 07:34:50 crc kubenswrapper[4917]: I0318 07:34:50.306469 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6zfk" 
event={"ID":"9ef3b7e7-64c3-4d3a-a500-5e98794229e3","Type":"ContainerStarted","Data":"670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940"} Mar 18 07:34:51 crc kubenswrapper[4917]: I0318 07:34:51.320420 4917 generic.go:334] "Generic (PLEG): container finished" podID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerID="670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940" exitCode=0 Mar 18 07:34:51 crc kubenswrapper[4917]: I0318 07:34:51.320909 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6zfk" event={"ID":"9ef3b7e7-64c3-4d3a-a500-5e98794229e3","Type":"ContainerDied","Data":"670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940"} Mar 18 07:34:52 crc kubenswrapper[4917]: I0318 07:34:52.335500 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6zfk" event={"ID":"9ef3b7e7-64c3-4d3a-a500-5e98794229e3","Type":"ContainerStarted","Data":"81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe"} Mar 18 07:34:52 crc kubenswrapper[4917]: I0318 07:34:52.368645 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t6zfk" podStartSLOduration=2.898123456 podStartE2EDuration="5.368620564s" podCreationTimestamp="2026-03-18 07:34:47 +0000 UTC" firstStartedPulling="2026-03-18 07:34:49.298867636 +0000 UTC m=+2874.240022350" lastFinishedPulling="2026-03-18 07:34:51.769364714 +0000 UTC m=+2876.710519458" observedRunningTime="2026-03-18 07:34:52.365794764 +0000 UTC m=+2877.306949508" watchObservedRunningTime="2026-03-18 07:34:52.368620564 +0000 UTC m=+2877.309775318" Mar 18 07:34:58 crc kubenswrapper[4917]: I0318 07:34:58.044449 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:58 crc kubenswrapper[4917]: I0318 07:34:58.045315 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:34:59 crc kubenswrapper[4917]: I0318 07:34:59.108047 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6zfk" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="registry-server" probeResult="failure" output=< Mar 18 07:34:59 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 07:34:59 crc kubenswrapper[4917]: > Mar 18 07:35:04 crc kubenswrapper[4917]: I0318 07:35:04.302858 4917 scope.go:117] "RemoveContainer" containerID="993d6796ba4a603416c8c3eaa99aa50cce746b3c456d2474ab9661d06e3be950" Mar 18 07:35:08 crc kubenswrapper[4917]: I0318 07:35:08.130143 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:35:08 crc kubenswrapper[4917]: I0318 07:35:08.199938 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:35:08 crc kubenswrapper[4917]: I0318 07:35:08.381093 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6zfk"] Mar 18 07:35:09 crc kubenswrapper[4917]: I0318 07:35:09.485055 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t6zfk" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="registry-server" containerID="cri-o://81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe" gracePeriod=2 Mar 18 07:35:09 crc kubenswrapper[4917]: I0318 07:35:09.990440 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.128023 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-utilities\") pod \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.128189 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-catalog-content\") pod \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.128239 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzvpg\" (UniqueName: \"kubernetes.io/projected/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-kube-api-access-rzvpg\") pod \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\" (UID: \"9ef3b7e7-64c3-4d3a-a500-5e98794229e3\") " Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.129835 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-utilities" (OuterVolumeSpecName: "utilities") pod "9ef3b7e7-64c3-4d3a-a500-5e98794229e3" (UID: "9ef3b7e7-64c3-4d3a-a500-5e98794229e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.138999 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-kube-api-access-rzvpg" (OuterVolumeSpecName: "kube-api-access-rzvpg") pod "9ef3b7e7-64c3-4d3a-a500-5e98794229e3" (UID: "9ef3b7e7-64c3-4d3a-a500-5e98794229e3"). InnerVolumeSpecName "kube-api-access-rzvpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.230786 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzvpg\" (UniqueName: \"kubernetes.io/projected/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-kube-api-access-rzvpg\") on node \"crc\" DevicePath \"\"" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.230836 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.362847 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ef3b7e7-64c3-4d3a-a500-5e98794229e3" (UID: "9ef3b7e7-64c3-4d3a-a500-5e98794229e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.434516 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ef3b7e7-64c3-4d3a-a500-5e98794229e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.496799 4917 generic.go:334] "Generic (PLEG): container finished" podID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerID="81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe" exitCode=0 Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.496875 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6zfk" event={"ID":"9ef3b7e7-64c3-4d3a-a500-5e98794229e3","Type":"ContainerDied","Data":"81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe"} Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.496927 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-t6zfk" event={"ID":"9ef3b7e7-64c3-4d3a-a500-5e98794229e3","Type":"ContainerDied","Data":"26fc5cd09e4246431e9227f27f2cccd62529e547205c96f649b3a7354ead78ab"} Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.496960 4917 scope.go:117] "RemoveContainer" containerID="81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.500274 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6zfk" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.536363 4917 scope.go:117] "RemoveContainer" containerID="670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.560630 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6zfk"] Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.567814 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t6zfk"] Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.579235 4917 scope.go:117] "RemoveContainer" containerID="a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.610850 4917 scope.go:117] "RemoveContainer" containerID="81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe" Mar 18 07:35:10 crc kubenswrapper[4917]: E0318 07:35:10.611321 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe\": container with ID starting with 81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe not found: ID does not exist" containerID="81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.611376 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe"} err="failed to get container status \"81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe\": rpc error: code = NotFound desc = could not find container \"81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe\": container with ID starting with 81553022cca48bbfcfb8553caae74cefb7ea021aad9f57a96a7455eebe7badbe not found: ID does not exist" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.611412 4917 scope.go:117] "RemoveContainer" containerID="670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940" Mar 18 07:35:10 crc kubenswrapper[4917]: E0318 07:35:10.611954 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940\": container with ID starting with 670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940 not found: ID does not exist" containerID="670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.612013 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940"} err="failed to get container status \"670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940\": rpc error: code = NotFound desc = could not find container \"670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940\": container with ID starting with 670d1f6b57348eb5c2cb9ad7c4d2e46c0c8969aea94826d5eba142ee8b75d940 not found: ID does not exist" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.612042 4917 scope.go:117] "RemoveContainer" containerID="a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3" Mar 18 07:35:10 crc kubenswrapper[4917]: E0318 
07:35:10.612619 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3\": container with ID starting with a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3 not found: ID does not exist" containerID="a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3" Mar 18 07:35:10 crc kubenswrapper[4917]: I0318 07:35:10.612776 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3"} err="failed to get container status \"a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3\": rpc error: code = NotFound desc = could not find container \"a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3\": container with ID starting with a482809ce1d9f8683f28a89793aa9f294ae52de62abe43c4407e289233a4b5c3 not found: ID does not exist" Mar 18 07:35:11 crc kubenswrapper[4917]: I0318 07:35:11.782946 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" path="/var/lib/kubelet/pods/9ef3b7e7-64c3-4d3a-a500-5e98794229e3/volumes" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.549641 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j754s"] Mar 18 07:35:34 crc kubenswrapper[4917]: E0318 07:35:34.550898 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="extract-content" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.550927 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="extract-content" Mar 18 07:35:34 crc kubenswrapper[4917]: E0318 07:35:34.550989 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="extract-utilities" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.551003 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="extract-utilities" Mar 18 07:35:34 crc kubenswrapper[4917]: E0318 07:35:34.551024 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="registry-server" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.551038 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="registry-server" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.551277 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef3b7e7-64c3-4d3a-a500-5e98794229e3" containerName="registry-server" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.553902 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.564897 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-764t2\" (UniqueName: \"kubernetes.io/projected/551511c8-971e-4ef3-b3fe-9b8063e55d41-kube-api-access-764t2\") pod \"certified-operators-j754s\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.565205 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-utilities\") pod \"certified-operators-j754s\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.565289 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-catalog-content\") pod \"certified-operators-j754s\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.570295 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j754s"] Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.666511 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-764t2\" (UniqueName: \"kubernetes.io/projected/551511c8-971e-4ef3-b3fe-9b8063e55d41-kube-api-access-764t2\") pod \"certified-operators-j754s\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.666608 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-utilities\") pod \"certified-operators-j754s\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.666704 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-catalog-content\") pod \"certified-operators-j754s\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.667405 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-catalog-content\") pod \"certified-operators-j754s\" (UID: 
\"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.667776 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-utilities\") pod \"certified-operators-j754s\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.691538 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-764t2\" (UniqueName: \"kubernetes.io/projected/551511c8-971e-4ef3-b3fe-9b8063e55d41-kube-api-access-764t2\") pod \"certified-operators-j754s\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:34 crc kubenswrapper[4917]: I0318 07:35:34.887890 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:35 crc kubenswrapper[4917]: I0318 07:35:35.160789 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j754s"] Mar 18 07:35:35 crc kubenswrapper[4917]: I0318 07:35:35.713805 4917 generic.go:334] "Generic (PLEG): container finished" podID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerID="c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88" exitCode=0 Mar 18 07:35:35 crc kubenswrapper[4917]: I0318 07:35:35.713856 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j754s" event={"ID":"551511c8-971e-4ef3-b3fe-9b8063e55d41","Type":"ContainerDied","Data":"c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88"} Mar 18 07:35:35 crc kubenswrapper[4917]: I0318 07:35:35.713882 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j754s" 
event={"ID":"551511c8-971e-4ef3-b3fe-9b8063e55d41","Type":"ContainerStarted","Data":"f0cd566a064ec18da8a7582e558e3f19556c8da8d7761094be99087ad28c777b"} Mar 18 07:35:37 crc kubenswrapper[4917]: I0318 07:35:37.730758 4917 generic.go:334] "Generic (PLEG): container finished" podID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerID="65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1" exitCode=0 Mar 18 07:35:37 crc kubenswrapper[4917]: I0318 07:35:37.730874 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j754s" event={"ID":"551511c8-971e-4ef3-b3fe-9b8063e55d41","Type":"ContainerDied","Data":"65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1"} Mar 18 07:35:38 crc kubenswrapper[4917]: I0318 07:35:38.740232 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j754s" event={"ID":"551511c8-971e-4ef3-b3fe-9b8063e55d41","Type":"ContainerStarted","Data":"6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f"} Mar 18 07:35:38 crc kubenswrapper[4917]: I0318 07:35:38.764823 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j754s" podStartSLOduration=2.262714477 podStartE2EDuration="4.764798893s" podCreationTimestamp="2026-03-18 07:35:34 +0000 UTC" firstStartedPulling="2026-03-18 07:35:35.715359498 +0000 UTC m=+2920.656514212" lastFinishedPulling="2026-03-18 07:35:38.217443874 +0000 UTC m=+2923.158598628" observedRunningTime="2026-03-18 07:35:38.760938167 +0000 UTC m=+2923.702092921" watchObservedRunningTime="2026-03-18 07:35:38.764798893 +0000 UTC m=+2923.705953607" Mar 18 07:35:44 crc kubenswrapper[4917]: I0318 07:35:44.888970 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:44 crc kubenswrapper[4917]: I0318 07:35:44.890667 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:44 crc kubenswrapper[4917]: I0318 07:35:44.945621 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:45 crc kubenswrapper[4917]: I0318 07:35:45.895157 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:45 crc kubenswrapper[4917]: I0318 07:35:45.965351 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j754s"] Mar 18 07:35:47 crc kubenswrapper[4917]: I0318 07:35:47.834320 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j754s" podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerName="registry-server" containerID="cri-o://6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f" gracePeriod=2 Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.299059 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.402302 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-764t2\" (UniqueName: \"kubernetes.io/projected/551511c8-971e-4ef3-b3fe-9b8063e55d41-kube-api-access-764t2\") pod \"551511c8-971e-4ef3-b3fe-9b8063e55d41\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.402734 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-catalog-content\") pod \"551511c8-971e-4ef3-b3fe-9b8063e55d41\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.403002 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-utilities\") pod \"551511c8-971e-4ef3-b3fe-9b8063e55d41\" (UID: \"551511c8-971e-4ef3-b3fe-9b8063e55d41\") " Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.404097 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-utilities" (OuterVolumeSpecName: "utilities") pod "551511c8-971e-4ef3-b3fe-9b8063e55d41" (UID: "551511c8-971e-4ef3-b3fe-9b8063e55d41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.410331 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/551511c8-971e-4ef3-b3fe-9b8063e55d41-kube-api-access-764t2" (OuterVolumeSpecName: "kube-api-access-764t2") pod "551511c8-971e-4ef3-b3fe-9b8063e55d41" (UID: "551511c8-971e-4ef3-b3fe-9b8063e55d41"). InnerVolumeSpecName "kube-api-access-764t2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.480249 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "551511c8-971e-4ef3-b3fe-9b8063e55d41" (UID: "551511c8-971e-4ef3-b3fe-9b8063e55d41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.504816 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.504841 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-764t2\" (UniqueName: \"kubernetes.io/projected/551511c8-971e-4ef3-b3fe-9b8063e55d41-kube-api-access-764t2\") on node \"crc\" DevicePath \"\"" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.504854 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/551511c8-971e-4ef3-b3fe-9b8063e55d41-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.851986 4917 generic.go:334] "Generic (PLEG): container finished" podID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerID="6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f" exitCode=0 Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.852076 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j754s" event={"ID":"551511c8-971e-4ef3-b3fe-9b8063e55d41","Type":"ContainerDied","Data":"6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f"} Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.852124 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-j754s" event={"ID":"551511c8-971e-4ef3-b3fe-9b8063e55d41","Type":"ContainerDied","Data":"f0cd566a064ec18da8a7582e558e3f19556c8da8d7761094be99087ad28c777b"} Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.852155 4917 scope.go:117] "RemoveContainer" containerID="6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.852625 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j754s" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.885440 4917 scope.go:117] "RemoveContainer" containerID="65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.905086 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j754s"] Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.914170 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j754s"] Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.927532 4917 scope.go:117] "RemoveContainer" containerID="c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.951716 4917 scope.go:117] "RemoveContainer" containerID="6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f" Mar 18 07:35:48 crc kubenswrapper[4917]: E0318 07:35:48.952216 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f\": container with ID starting with 6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f not found: ID does not exist" containerID="6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 
07:35:48.952271 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f"} err="failed to get container status \"6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f\": rpc error: code = NotFound desc = could not find container \"6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f\": container with ID starting with 6e445c438782e1b4aa6de051bf124750edb00a6bde0c27d763393d1b73812d3f not found: ID does not exist" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.952310 4917 scope.go:117] "RemoveContainer" containerID="65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1" Mar 18 07:35:48 crc kubenswrapper[4917]: E0318 07:35:48.952821 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1\": container with ID starting with 65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1 not found: ID does not exist" containerID="65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.952876 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1"} err="failed to get container status \"65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1\": rpc error: code = NotFound desc = could not find container \"65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1\": container with ID starting with 65cd71c8c91ab6b77d81f6b9cee0a94151838138789929aa800ee5d69268d0f1 not found: ID does not exist" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.952913 4917 scope.go:117] "RemoveContainer" containerID="c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88" Mar 18 07:35:48 crc 
kubenswrapper[4917]: E0318 07:35:48.953318 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88\": container with ID starting with c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88 not found: ID does not exist" containerID="c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88" Mar 18 07:35:48 crc kubenswrapper[4917]: I0318 07:35:48.953358 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88"} err="failed to get container status \"c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88\": rpc error: code = NotFound desc = could not find container \"c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88\": container with ID starting with c4f5dcce67cba8a8c0896ab77743ef798d4b506bb69f79539589a2e7c4f6fc88 not found: ID does not exist" Mar 18 07:35:49 crc kubenswrapper[4917]: I0318 07:35:49.790472 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" path="/var/lib/kubelet/pods/551511c8-971e-4ef3-b3fe-9b8063e55d41/volumes" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.170956 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563656-sww89"] Mar 18 07:36:00 crc kubenswrapper[4917]: E0318 07:36:00.172159 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerName="registry-server" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.172187 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerName="registry-server" Mar 18 07:36:00 crc kubenswrapper[4917]: E0318 07:36:00.172228 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerName="extract-utilities" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.172244 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerName="extract-utilities" Mar 18 07:36:00 crc kubenswrapper[4917]: E0318 07:36:00.172269 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerName="extract-content" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.172290 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerName="extract-content" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.172564 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="551511c8-971e-4ef3-b3fe-9b8063e55d41" containerName="registry-server" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.173503 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563656-sww89" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.176034 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.176656 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.176970 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.182863 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563656-sww89"] Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.298846 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkcb\" (UniqueName: 
\"kubernetes.io/projected/330910fc-368f-4182-b012-fe027fc8858e-kube-api-access-qnkcb\") pod \"auto-csr-approver-29563656-sww89\" (UID: \"330910fc-368f-4182-b012-fe027fc8858e\") " pod="openshift-infra/auto-csr-approver-29563656-sww89" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.401121 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkcb\" (UniqueName: \"kubernetes.io/projected/330910fc-368f-4182-b012-fe027fc8858e-kube-api-access-qnkcb\") pod \"auto-csr-approver-29563656-sww89\" (UID: \"330910fc-368f-4182-b012-fe027fc8858e\") " pod="openshift-infra/auto-csr-approver-29563656-sww89" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.423008 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkcb\" (UniqueName: \"kubernetes.io/projected/330910fc-368f-4182-b012-fe027fc8858e-kube-api-access-qnkcb\") pod \"auto-csr-approver-29563656-sww89\" (UID: \"330910fc-368f-4182-b012-fe027fc8858e\") " pod="openshift-infra/auto-csr-approver-29563656-sww89" Mar 18 07:36:00 crc kubenswrapper[4917]: I0318 07:36:00.508231 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563656-sww89" Mar 18 07:36:01 crc kubenswrapper[4917]: I0318 07:36:01.011951 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563656-sww89"] Mar 18 07:36:02 crc kubenswrapper[4917]: I0318 07:36:02.002141 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563656-sww89" event={"ID":"330910fc-368f-4182-b012-fe027fc8858e","Type":"ContainerStarted","Data":"2f6538e9ef904145a55873890ade1b8d083d5ae9c4faac1d2871effee0e345d9"} Mar 18 07:36:02 crc kubenswrapper[4917]: I0318 07:36:02.928911 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:36:02 crc kubenswrapper[4917]: I0318 07:36:02.929027 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:36:04 crc kubenswrapper[4917]: I0318 07:36:04.023656 4917 generic.go:334] "Generic (PLEG): container finished" podID="330910fc-368f-4182-b012-fe027fc8858e" containerID="994d7292ffd41f8cef54218f18e3937322e3db167af493cc27dc72c28fa43bf0" exitCode=0 Mar 18 07:36:04 crc kubenswrapper[4917]: I0318 07:36:04.023745 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563656-sww89" event={"ID":"330910fc-368f-4182-b012-fe027fc8858e","Type":"ContainerDied","Data":"994d7292ffd41f8cef54218f18e3937322e3db167af493cc27dc72c28fa43bf0"} Mar 18 07:36:05 crc kubenswrapper[4917]: I0318 07:36:05.412134 4917 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563656-sww89" Mar 18 07:36:05 crc kubenswrapper[4917]: I0318 07:36:05.583274 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnkcb\" (UniqueName: \"kubernetes.io/projected/330910fc-368f-4182-b012-fe027fc8858e-kube-api-access-qnkcb\") pod \"330910fc-368f-4182-b012-fe027fc8858e\" (UID: \"330910fc-368f-4182-b012-fe027fc8858e\") " Mar 18 07:36:05 crc kubenswrapper[4917]: I0318 07:36:05.594541 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330910fc-368f-4182-b012-fe027fc8858e-kube-api-access-qnkcb" (OuterVolumeSpecName: "kube-api-access-qnkcb") pod "330910fc-368f-4182-b012-fe027fc8858e" (UID: "330910fc-368f-4182-b012-fe027fc8858e"). InnerVolumeSpecName "kube-api-access-qnkcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:36:05 crc kubenswrapper[4917]: I0318 07:36:05.685689 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnkcb\" (UniqueName: \"kubernetes.io/projected/330910fc-368f-4182-b012-fe027fc8858e-kube-api-access-qnkcb\") on node \"crc\" DevicePath \"\"" Mar 18 07:36:06 crc kubenswrapper[4917]: I0318 07:36:06.046391 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563656-sww89" event={"ID":"330910fc-368f-4182-b012-fe027fc8858e","Type":"ContainerDied","Data":"2f6538e9ef904145a55873890ade1b8d083d5ae9c4faac1d2871effee0e345d9"} Mar 18 07:36:06 crc kubenswrapper[4917]: I0318 07:36:06.046934 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f6538e9ef904145a55873890ade1b8d083d5ae9c4faac1d2871effee0e345d9" Mar 18 07:36:06 crc kubenswrapper[4917]: I0318 07:36:06.046534 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563656-sww89" Mar 18 07:36:06 crc kubenswrapper[4917]: I0318 07:36:06.507055 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563650-cqnbt"] Mar 18 07:36:06 crc kubenswrapper[4917]: I0318 07:36:06.529837 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563650-cqnbt"] Mar 18 07:36:07 crc kubenswrapper[4917]: I0318 07:36:07.789149 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587c640f-3eb7-484d-acfa-7af4200547ae" path="/var/lib/kubelet/pods/587c640f-3eb7-484d-acfa-7af4200547ae/volumes" Mar 18 07:36:32 crc kubenswrapper[4917]: I0318 07:36:32.928564 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:36:32 crc kubenswrapper[4917]: I0318 07:36:32.929127 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:37:02 crc kubenswrapper[4917]: I0318 07:37:02.929105 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:37:02 crc kubenswrapper[4917]: I0318 07:37:02.929771 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:37:02 crc kubenswrapper[4917]: I0318 07:37:02.929846 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:37:02 crc kubenswrapper[4917]: I0318 07:37:02.930812 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bbf9144e4b69a26dc4113283f082a54cd447636d065f4b3cdcdc308d081feb2"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:37:02 crc kubenswrapper[4917]: I0318 07:37:02.931057 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://4bbf9144e4b69a26dc4113283f082a54cd447636d065f4b3cdcdc308d081feb2" gracePeriod=600 Mar 18 07:37:03 crc kubenswrapper[4917]: I0318 07:37:03.558117 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="4bbf9144e4b69a26dc4113283f082a54cd447636d065f4b3cdcdc308d081feb2" exitCode=0 Mar 18 07:37:03 crc kubenswrapper[4917]: I0318 07:37:03.558433 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"4bbf9144e4b69a26dc4113283f082a54cd447636d065f4b3cdcdc308d081feb2"} Mar 18 07:37:03 crc kubenswrapper[4917]: I0318 07:37:03.558536 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced"} Mar 18 07:37:03 crc kubenswrapper[4917]: I0318 07:37:03.558568 4917 scope.go:117] "RemoveContainer" containerID="56821a5681f8d5b01bf369bfc83905d198e257d87cd81f6d14b0538d51274fc5" Mar 18 07:37:04 crc kubenswrapper[4917]: I0318 07:37:04.435245 4917 scope.go:117] "RemoveContainer" containerID="fb531fc4d6a115562987e9b45470ba2b14f9338323eaf60d4384e45eeeab0d9e" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.147564 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563658-8fvsk"] Mar 18 07:38:00 crc kubenswrapper[4917]: E0318 07:38:00.148389 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330910fc-368f-4182-b012-fe027fc8858e" containerName="oc" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.148400 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="330910fc-368f-4182-b012-fe027fc8858e" containerName="oc" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.148531 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="330910fc-368f-4182-b012-fe027fc8858e" containerName="oc" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.149104 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563658-8fvsk" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.151807 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.159004 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.159685 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.163467 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563658-8fvsk"] Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.310963 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngc6q\" (UniqueName: \"kubernetes.io/projected/f3791801-7221-420a-84a6-75728ad54b63-kube-api-access-ngc6q\") pod \"auto-csr-approver-29563658-8fvsk\" (UID: \"f3791801-7221-420a-84a6-75728ad54b63\") " pod="openshift-infra/auto-csr-approver-29563658-8fvsk" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.412646 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngc6q\" (UniqueName: \"kubernetes.io/projected/f3791801-7221-420a-84a6-75728ad54b63-kube-api-access-ngc6q\") pod \"auto-csr-approver-29563658-8fvsk\" (UID: \"f3791801-7221-420a-84a6-75728ad54b63\") " pod="openshift-infra/auto-csr-approver-29563658-8fvsk" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.445292 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngc6q\" (UniqueName: \"kubernetes.io/projected/f3791801-7221-420a-84a6-75728ad54b63-kube-api-access-ngc6q\") pod \"auto-csr-approver-29563658-8fvsk\" (UID: \"f3791801-7221-420a-84a6-75728ad54b63\") " 
pod="openshift-infra/auto-csr-approver-29563658-8fvsk" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.466364 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563658-8fvsk" Mar 18 07:38:00 crc kubenswrapper[4917]: I0318 07:38:00.940359 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563658-8fvsk"] Mar 18 07:38:00 crc kubenswrapper[4917]: W0318 07:38:00.951341 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3791801_7221_420a_84a6_75728ad54b63.slice/crio-34a2b2d3115f1fe8c38538d035f7b0e8e6d1eebeb3df8dce1d3e705c74df74e2 WatchSource:0}: Error finding container 34a2b2d3115f1fe8c38538d035f7b0e8e6d1eebeb3df8dce1d3e705c74df74e2: Status 404 returned error can't find the container with id 34a2b2d3115f1fe8c38538d035f7b0e8e6d1eebeb3df8dce1d3e705c74df74e2 Mar 18 07:38:01 crc kubenswrapper[4917]: I0318 07:38:01.130128 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563658-8fvsk" event={"ID":"f3791801-7221-420a-84a6-75728ad54b63","Type":"ContainerStarted","Data":"34a2b2d3115f1fe8c38538d035f7b0e8e6d1eebeb3df8dce1d3e705c74df74e2"} Mar 18 07:38:03 crc kubenswrapper[4917]: I0318 07:38:03.151560 4917 generic.go:334] "Generic (PLEG): container finished" podID="f3791801-7221-420a-84a6-75728ad54b63" containerID="74ed507b63616512b1c664653ab221f289729d28080000637aa148de13c895c4" exitCode=0 Mar 18 07:38:03 crc kubenswrapper[4917]: I0318 07:38:03.151765 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563658-8fvsk" event={"ID":"f3791801-7221-420a-84a6-75728ad54b63","Type":"ContainerDied","Data":"74ed507b63616512b1c664653ab221f289729d28080000637aa148de13c895c4"} Mar 18 07:38:04 crc kubenswrapper[4917]: I0318 07:38:04.520732 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563658-8fvsk" Mar 18 07:38:04 crc kubenswrapper[4917]: I0318 07:38:04.686149 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngc6q\" (UniqueName: \"kubernetes.io/projected/f3791801-7221-420a-84a6-75728ad54b63-kube-api-access-ngc6q\") pod \"f3791801-7221-420a-84a6-75728ad54b63\" (UID: \"f3791801-7221-420a-84a6-75728ad54b63\") " Mar 18 07:38:04 crc kubenswrapper[4917]: I0318 07:38:04.695377 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3791801-7221-420a-84a6-75728ad54b63-kube-api-access-ngc6q" (OuterVolumeSpecName: "kube-api-access-ngc6q") pod "f3791801-7221-420a-84a6-75728ad54b63" (UID: "f3791801-7221-420a-84a6-75728ad54b63"). InnerVolumeSpecName "kube-api-access-ngc6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:38:04 crc kubenswrapper[4917]: I0318 07:38:04.788020 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngc6q\" (UniqueName: \"kubernetes.io/projected/f3791801-7221-420a-84a6-75728ad54b63-kube-api-access-ngc6q\") on node \"crc\" DevicePath \"\"" Mar 18 07:38:05 crc kubenswrapper[4917]: I0318 07:38:05.168363 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563658-8fvsk" event={"ID":"f3791801-7221-420a-84a6-75728ad54b63","Type":"ContainerDied","Data":"34a2b2d3115f1fe8c38538d035f7b0e8e6d1eebeb3df8dce1d3e705c74df74e2"} Mar 18 07:38:05 crc kubenswrapper[4917]: I0318 07:38:05.168406 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34a2b2d3115f1fe8c38538d035f7b0e8e6d1eebeb3df8dce1d3e705c74df74e2" Mar 18 07:38:05 crc kubenswrapper[4917]: I0318 07:38:05.168420 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563658-8fvsk" Mar 18 07:38:05 crc kubenswrapper[4917]: I0318 07:38:05.600887 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563652-cvr68"] Mar 18 07:38:05 crc kubenswrapper[4917]: I0318 07:38:05.607600 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563652-cvr68"] Mar 18 07:38:05 crc kubenswrapper[4917]: I0318 07:38:05.786842 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde28581-741f-4fe7-a348-d73e844e317f" path="/var/lib/kubelet/pods/cde28581-741f-4fe7-a348-d73e844e317f/volumes" Mar 18 07:39:04 crc kubenswrapper[4917]: I0318 07:39:04.561728 4917 scope.go:117] "RemoveContainer" containerID="6dfabd9ac15311c41a307e6003b821da2f1e64a3da73911d149eb90028ca3370" Mar 18 07:39:32 crc kubenswrapper[4917]: I0318 07:39:32.929369 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:39:32 crc kubenswrapper[4917]: I0318 07:39:32.929911 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.188203 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563660-cdgnk"] Mar 18 07:40:00 crc kubenswrapper[4917]: E0318 07:40:00.190692 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3791801-7221-420a-84a6-75728ad54b63" containerName="oc" Mar 18 07:40:00 crc 
kubenswrapper[4917]: I0318 07:40:00.190719 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3791801-7221-420a-84a6-75728ad54b63" containerName="oc" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.191027 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3791801-7221-420a-84a6-75728ad54b63" containerName="oc" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.192861 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.195553 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.195696 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.195723 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.199980 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563660-cdgnk"] Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.338353 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxxfk\" (UniqueName: \"kubernetes.io/projected/0697d2dc-dada-4b49-a201-0731534c8a59-kube-api-access-qxxfk\") pod \"auto-csr-approver-29563660-cdgnk\" (UID: \"0697d2dc-dada-4b49-a201-0731534c8a59\") " pod="openshift-infra/auto-csr-approver-29563660-cdgnk" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.440438 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxxfk\" (UniqueName: \"kubernetes.io/projected/0697d2dc-dada-4b49-a201-0731534c8a59-kube-api-access-qxxfk\") pod \"auto-csr-approver-29563660-cdgnk\" 
(UID: \"0697d2dc-dada-4b49-a201-0731534c8a59\") " pod="openshift-infra/auto-csr-approver-29563660-cdgnk" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.473271 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxxfk\" (UniqueName: \"kubernetes.io/projected/0697d2dc-dada-4b49-a201-0731534c8a59-kube-api-access-qxxfk\") pod \"auto-csr-approver-29563660-cdgnk\" (UID: \"0697d2dc-dada-4b49-a201-0731534c8a59\") " pod="openshift-infra/auto-csr-approver-29563660-cdgnk" Mar 18 07:40:00 crc kubenswrapper[4917]: I0318 07:40:00.523179 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" Mar 18 07:40:01 crc kubenswrapper[4917]: I0318 07:40:01.045753 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563660-cdgnk"] Mar 18 07:40:01 crc kubenswrapper[4917]: I0318 07:40:01.056826 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:40:01 crc kubenswrapper[4917]: I0318 07:40:01.244399 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" event={"ID":"0697d2dc-dada-4b49-a201-0731534c8a59","Type":"ContainerStarted","Data":"e1e3064fb79608287939f678841be081e246911013c20eecdd727bf98538f042"} Mar 18 07:40:02 crc kubenswrapper[4917]: I0318 07:40:02.929054 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:40:02 crc kubenswrapper[4917]: I0318 07:40:02.929464 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:40:03 crc kubenswrapper[4917]: I0318 07:40:03.262907 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" event={"ID":"0697d2dc-dada-4b49-a201-0731534c8a59","Type":"ContainerStarted","Data":"49bc18abfb613b126958f81f98bc7c1d60190aa28a6aa460189f31c80b087a86"} Mar 18 07:40:03 crc kubenswrapper[4917]: I0318 07:40:03.284542 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" podStartSLOduration=1.5906164170000001 podStartE2EDuration="3.284523942s" podCreationTimestamp="2026-03-18 07:40:00 +0000 UTC" firstStartedPulling="2026-03-18 07:40:01.056229349 +0000 UTC m=+3185.997384093" lastFinishedPulling="2026-03-18 07:40:02.750136894 +0000 UTC m=+3187.691291618" observedRunningTime="2026-03-18 07:40:03.27663858 +0000 UTC m=+3188.217793314" watchObservedRunningTime="2026-03-18 07:40:03.284523942 +0000 UTC m=+3188.225678656" Mar 18 07:40:04 crc kubenswrapper[4917]: I0318 07:40:04.271567 4917 generic.go:334] "Generic (PLEG): container finished" podID="0697d2dc-dada-4b49-a201-0731534c8a59" containerID="49bc18abfb613b126958f81f98bc7c1d60190aa28a6aa460189f31c80b087a86" exitCode=0 Mar 18 07:40:04 crc kubenswrapper[4917]: I0318 07:40:04.271629 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" event={"ID":"0697d2dc-dada-4b49-a201-0731534c8a59","Type":"ContainerDied","Data":"49bc18abfb613b126958f81f98bc7c1d60190aa28a6aa460189f31c80b087a86"} Mar 18 07:40:05 crc kubenswrapper[4917]: I0318 07:40:05.655633 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" Mar 18 07:40:05 crc kubenswrapper[4917]: I0318 07:40:05.729349 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxxfk\" (UniqueName: \"kubernetes.io/projected/0697d2dc-dada-4b49-a201-0731534c8a59-kube-api-access-qxxfk\") pod \"0697d2dc-dada-4b49-a201-0731534c8a59\" (UID: \"0697d2dc-dada-4b49-a201-0731534c8a59\") " Mar 18 07:40:05 crc kubenswrapper[4917]: I0318 07:40:05.748969 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0697d2dc-dada-4b49-a201-0731534c8a59-kube-api-access-qxxfk" (OuterVolumeSpecName: "kube-api-access-qxxfk") pod "0697d2dc-dada-4b49-a201-0731534c8a59" (UID: "0697d2dc-dada-4b49-a201-0731534c8a59"). InnerVolumeSpecName "kube-api-access-qxxfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:40:05 crc kubenswrapper[4917]: I0318 07:40:05.831400 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxxfk\" (UniqueName: \"kubernetes.io/projected/0697d2dc-dada-4b49-a201-0731534c8a59-kube-api-access-qxxfk\") on node \"crc\" DevicePath \"\"" Mar 18 07:40:06 crc kubenswrapper[4917]: I0318 07:40:06.294304 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" event={"ID":"0697d2dc-dada-4b49-a201-0731534c8a59","Type":"ContainerDied","Data":"e1e3064fb79608287939f678841be081e246911013c20eecdd727bf98538f042"} Mar 18 07:40:06 crc kubenswrapper[4917]: I0318 07:40:06.294364 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1e3064fb79608287939f678841be081e246911013c20eecdd727bf98538f042" Mar 18 07:40:06 crc kubenswrapper[4917]: I0318 07:40:06.294409 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563660-cdgnk" Mar 18 07:40:06 crc kubenswrapper[4917]: I0318 07:40:06.370367 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563654-77hph"] Mar 18 07:40:06 crc kubenswrapper[4917]: I0318 07:40:06.381932 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563654-77hph"] Mar 18 07:40:07 crc kubenswrapper[4917]: I0318 07:40:07.787775 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9beb779-3322-4afd-8130-3f9d3b8bcbc4" path="/var/lib/kubelet/pods/f9beb779-3322-4afd-8130-3f9d3b8bcbc4/volumes" Mar 18 07:40:32 crc kubenswrapper[4917]: I0318 07:40:32.929639 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:40:32 crc kubenswrapper[4917]: I0318 07:40:32.930278 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:40:32 crc kubenswrapper[4917]: I0318 07:40:32.930393 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:40:32 crc kubenswrapper[4917]: I0318 07:40:32.931313 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:40:32 crc kubenswrapper[4917]: I0318 07:40:32.931407 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" gracePeriod=600 Mar 18 07:40:33 crc kubenswrapper[4917]: E0318 07:40:33.063653 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:40:33 crc kubenswrapper[4917]: I0318 07:40:33.545104 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" exitCode=0 Mar 18 07:40:33 crc kubenswrapper[4917]: I0318 07:40:33.545179 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced"} Mar 18 07:40:33 crc kubenswrapper[4917]: I0318 07:40:33.545283 4917 scope.go:117] "RemoveContainer" containerID="4bbf9144e4b69a26dc4113283f082a54cd447636d065f4b3cdcdc308d081feb2" Mar 18 07:40:33 crc kubenswrapper[4917]: I0318 07:40:33.546008 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:40:33 crc kubenswrapper[4917]: E0318 07:40:33.546478 4917 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:40:48 crc kubenswrapper[4917]: I0318 07:40:48.772105 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:40:48 crc kubenswrapper[4917]: E0318 07:40:48.772754 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:41:00 crc kubenswrapper[4917]: I0318 07:41:00.772875 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:41:00 crc kubenswrapper[4917]: E0318 07:41:00.773865 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:41:04 crc kubenswrapper[4917]: I0318 07:41:04.686115 4917 scope.go:117] "RemoveContainer" containerID="c43a239aa27f2f9114e07a7cbc0ca1fd8c59731ae0c252e453c018eb0ff75e31" Mar 18 07:41:11 crc kubenswrapper[4917]: I0318 
07:41:11.774343 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:41:11 crc kubenswrapper[4917]: E0318 07:41:11.775711 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:41:23 crc kubenswrapper[4917]: I0318 07:41:23.781044 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:41:23 crc kubenswrapper[4917]: E0318 07:41:23.781825 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:41:37 crc kubenswrapper[4917]: I0318 07:41:37.772620 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:41:37 crc kubenswrapper[4917]: E0318 07:41:37.773750 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:41:52 crc 
kubenswrapper[4917]: I0318 07:41:52.772820 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:41:52 crc kubenswrapper[4917]: E0318 07:41:52.773577 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.152341 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563662-4vslz"] Mar 18 07:42:00 crc kubenswrapper[4917]: E0318 07:42:00.154606 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0697d2dc-dada-4b49-a201-0731534c8a59" containerName="oc" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.154734 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0697d2dc-dada-4b49-a201-0731534c8a59" containerName="oc" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.155074 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0697d2dc-dada-4b49-a201-0731534c8a59" containerName="oc" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.155707 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563662-4vslz" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.158428 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.159431 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.159504 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.168094 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563662-4vslz"] Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.276373 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6tn\" (UniqueName: \"kubernetes.io/projected/22a2702f-b95f-4786-8940-8445b400b635-kube-api-access-6n6tn\") pod \"auto-csr-approver-29563662-4vslz\" (UID: \"22a2702f-b95f-4786-8940-8445b400b635\") " pod="openshift-infra/auto-csr-approver-29563662-4vslz" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.377614 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n6tn\" (UniqueName: \"kubernetes.io/projected/22a2702f-b95f-4786-8940-8445b400b635-kube-api-access-6n6tn\") pod \"auto-csr-approver-29563662-4vslz\" (UID: \"22a2702f-b95f-4786-8940-8445b400b635\") " pod="openshift-infra/auto-csr-approver-29563662-4vslz" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.409031 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6tn\" (UniqueName: \"kubernetes.io/projected/22a2702f-b95f-4786-8940-8445b400b635-kube-api-access-6n6tn\") pod \"auto-csr-approver-29563662-4vslz\" (UID: \"22a2702f-b95f-4786-8940-8445b400b635\") " 
pod="openshift-infra/auto-csr-approver-29563662-4vslz" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.500053 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563662-4vslz" Mar 18 07:42:00 crc kubenswrapper[4917]: I0318 07:42:00.724187 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563662-4vslz"] Mar 18 07:42:01 crc kubenswrapper[4917]: I0318 07:42:01.319974 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563662-4vslz" event={"ID":"22a2702f-b95f-4786-8940-8445b400b635","Type":"ContainerStarted","Data":"efc635d236d605e02e63930aa805973c42c8f37e4b61398c1f2451e944d814bd"} Mar 18 07:42:02 crc kubenswrapper[4917]: I0318 07:42:02.340949 4917 generic.go:334] "Generic (PLEG): container finished" podID="22a2702f-b95f-4786-8940-8445b400b635" containerID="6a50868c6ae08ab0f7456088280a979e2f01932146f9ff61c5bd5d77b988a748" exitCode=0 Mar 18 07:42:02 crc kubenswrapper[4917]: I0318 07:42:02.341041 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563662-4vslz" event={"ID":"22a2702f-b95f-4786-8940-8445b400b635","Type":"ContainerDied","Data":"6a50868c6ae08ab0f7456088280a979e2f01932146f9ff61c5bd5d77b988a748"} Mar 18 07:42:03 crc kubenswrapper[4917]: I0318 07:42:03.728024 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563662-4vslz" Mar 18 07:42:03 crc kubenswrapper[4917]: I0318 07:42:03.773294 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:42:03 crc kubenswrapper[4917]: E0318 07:42:03.773507 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:42:03 crc kubenswrapper[4917]: I0318 07:42:03.828451 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6n6tn\" (UniqueName: \"kubernetes.io/projected/22a2702f-b95f-4786-8940-8445b400b635-kube-api-access-6n6tn\") pod \"22a2702f-b95f-4786-8940-8445b400b635\" (UID: \"22a2702f-b95f-4786-8940-8445b400b635\") " Mar 18 07:42:03 crc kubenswrapper[4917]: I0318 07:42:03.836870 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a2702f-b95f-4786-8940-8445b400b635-kube-api-access-6n6tn" (OuterVolumeSpecName: "kube-api-access-6n6tn") pod "22a2702f-b95f-4786-8940-8445b400b635" (UID: "22a2702f-b95f-4786-8940-8445b400b635"). InnerVolumeSpecName "kube-api-access-6n6tn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:42:03 crc kubenswrapper[4917]: I0318 07:42:03.930633 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6n6tn\" (UniqueName: \"kubernetes.io/projected/22a2702f-b95f-4786-8940-8445b400b635-kube-api-access-6n6tn\") on node \"crc\" DevicePath \"\"" Mar 18 07:42:04 crc kubenswrapper[4917]: I0318 07:42:04.360274 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563662-4vslz" event={"ID":"22a2702f-b95f-4786-8940-8445b400b635","Type":"ContainerDied","Data":"efc635d236d605e02e63930aa805973c42c8f37e4b61398c1f2451e944d814bd"} Mar 18 07:42:04 crc kubenswrapper[4917]: I0318 07:42:04.360317 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc635d236d605e02e63930aa805973c42c8f37e4b61398c1f2451e944d814bd" Mar 18 07:42:04 crc kubenswrapper[4917]: I0318 07:42:04.360388 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563662-4vslz" Mar 18 07:42:04 crc kubenswrapper[4917]: I0318 07:42:04.813949 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563656-sww89"] Mar 18 07:42:04 crc kubenswrapper[4917]: I0318 07:42:04.820741 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563656-sww89"] Mar 18 07:42:05 crc kubenswrapper[4917]: I0318 07:42:05.810738 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330910fc-368f-4182-b012-fe027fc8858e" path="/var/lib/kubelet/pods/330910fc-368f-4182-b012-fe027fc8858e/volumes" Mar 18 07:42:14 crc kubenswrapper[4917]: I0318 07:42:14.773238 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:42:14 crc kubenswrapper[4917]: E0318 07:42:14.774508 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:42:27 crc kubenswrapper[4917]: I0318 07:42:27.773369 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:42:27 crc kubenswrapper[4917]: E0318 07:42:27.774405 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:42:40 crc kubenswrapper[4917]: I0318 07:42:40.773287 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:42:40 crc kubenswrapper[4917]: E0318 07:42:40.774524 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:42:54 crc kubenswrapper[4917]: I0318 07:42:54.774489 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:42:54 crc kubenswrapper[4917]: E0318 07:42:54.775525 4917 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:43:04 crc kubenswrapper[4917]: I0318 07:43:04.807678 4917 scope.go:117] "RemoveContainer" containerID="994d7292ffd41f8cef54218f18e3937322e3db167af493cc27dc72c28fa43bf0" Mar 18 07:43:07 crc kubenswrapper[4917]: I0318 07:43:07.774382 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:43:07 crc kubenswrapper[4917]: E0318 07:43:07.775201 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:43:18 crc kubenswrapper[4917]: I0318 07:43:18.772759 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:43:18 crc kubenswrapper[4917]: E0318 07:43:18.773504 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:43:30 crc kubenswrapper[4917]: I0318 07:43:30.773097 4917 scope.go:117] 
"RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:43:30 crc kubenswrapper[4917]: E0318 07:43:30.775210 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:43:37 crc kubenswrapper[4917]: I0318 07:43:37.800846 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mn4hj"] Mar 18 07:43:37 crc kubenswrapper[4917]: E0318 07:43:37.802163 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22a2702f-b95f-4786-8940-8445b400b635" containerName="oc" Mar 18 07:43:37 crc kubenswrapper[4917]: I0318 07:43:37.802193 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="22a2702f-b95f-4786-8940-8445b400b635" containerName="oc" Mar 18 07:43:37 crc kubenswrapper[4917]: I0318 07:43:37.802562 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="22a2702f-b95f-4786-8940-8445b400b635" containerName="oc" Mar 18 07:43:37 crc kubenswrapper[4917]: I0318 07:43:37.804940 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn4hj"] Mar 18 07:43:37 crc kubenswrapper[4917]: I0318 07:43:37.805135 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:37 crc kubenswrapper[4917]: I0318 07:43:37.966033 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-catalog-content\") pod \"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:37 crc kubenswrapper[4917]: I0318 07:43:37.966090 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-utilities\") pod \"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:37 crc kubenswrapper[4917]: I0318 07:43:37.966119 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfkzw\" (UniqueName: \"kubernetes.io/projected/24dd41e4-8078-4e46-8c97-f8308b36f403-kube-api-access-mfkzw\") pod \"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:38 crc kubenswrapper[4917]: I0318 07:43:38.067915 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfkzw\" (UniqueName: \"kubernetes.io/projected/24dd41e4-8078-4e46-8c97-f8308b36f403-kube-api-access-mfkzw\") pod \"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:38 crc kubenswrapper[4917]: I0318 07:43:38.068095 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-catalog-content\") pod 
\"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:38 crc kubenswrapper[4917]: I0318 07:43:38.068154 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-utilities\") pod \"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:38 crc kubenswrapper[4917]: I0318 07:43:38.068548 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-catalog-content\") pod \"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:38 crc kubenswrapper[4917]: I0318 07:43:38.068717 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-utilities\") pod \"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:38 crc kubenswrapper[4917]: I0318 07:43:38.107859 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfkzw\" (UniqueName: \"kubernetes.io/projected/24dd41e4-8078-4e46-8c97-f8308b36f403-kube-api-access-mfkzw\") pod \"redhat-marketplace-mn4hj\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:38 crc kubenswrapper[4917]: I0318 07:43:38.163713 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:38 crc kubenswrapper[4917]: I0318 07:43:38.394047 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn4hj"] Mar 18 07:43:39 crc kubenswrapper[4917]: I0318 07:43:39.217286 4917 generic.go:334] "Generic (PLEG): container finished" podID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerID="984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4" exitCode=0 Mar 18 07:43:39 crc kubenswrapper[4917]: I0318 07:43:39.217374 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn4hj" event={"ID":"24dd41e4-8078-4e46-8c97-f8308b36f403","Type":"ContainerDied","Data":"984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4"} Mar 18 07:43:39 crc kubenswrapper[4917]: I0318 07:43:39.217762 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn4hj" event={"ID":"24dd41e4-8078-4e46-8c97-f8308b36f403","Type":"ContainerStarted","Data":"701963c13b8b59953fc7607a8ef2763319c43a7b5b987d0e9ee4e89f2b3fed48"} Mar 18 07:43:40 crc kubenswrapper[4917]: I0318 07:43:40.227374 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn4hj" event={"ID":"24dd41e4-8078-4e46-8c97-f8308b36f403","Type":"ContainerStarted","Data":"c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326"} Mar 18 07:43:41 crc kubenswrapper[4917]: I0318 07:43:41.238049 4917 generic.go:334] "Generic (PLEG): container finished" podID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerID="c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326" exitCode=0 Mar 18 07:43:41 crc kubenswrapper[4917]: I0318 07:43:41.238105 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn4hj" 
event={"ID":"24dd41e4-8078-4e46-8c97-f8308b36f403","Type":"ContainerDied","Data":"c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326"} Mar 18 07:43:42 crc kubenswrapper[4917]: I0318 07:43:42.257158 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn4hj" event={"ID":"24dd41e4-8078-4e46-8c97-f8308b36f403","Type":"ContainerStarted","Data":"46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b"} Mar 18 07:43:43 crc kubenswrapper[4917]: I0318 07:43:43.772668 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:43:43 crc kubenswrapper[4917]: E0318 07:43:43.773162 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:43:48 crc kubenswrapper[4917]: I0318 07:43:48.164002 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:48 crc kubenswrapper[4917]: I0318 07:43:48.164400 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:48 crc kubenswrapper[4917]: I0318 07:43:48.238863 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:48 crc kubenswrapper[4917]: I0318 07:43:48.269889 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mn4hj" podStartSLOduration=8.795869582 podStartE2EDuration="11.269858015s" 
podCreationTimestamp="2026-03-18 07:43:37 +0000 UTC" firstStartedPulling="2026-03-18 07:43:39.219325617 +0000 UTC m=+3404.160480341" lastFinishedPulling="2026-03-18 07:43:41.69331405 +0000 UTC m=+3406.634468774" observedRunningTime="2026-03-18 07:43:42.281139237 +0000 UTC m=+3407.222293951" watchObservedRunningTime="2026-03-18 07:43:48.269858015 +0000 UTC m=+3413.211012769" Mar 18 07:43:48 crc kubenswrapper[4917]: I0318 07:43:48.366354 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:48 crc kubenswrapper[4917]: I0318 07:43:48.488389 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn4hj"] Mar 18 07:43:50 crc kubenswrapper[4917]: I0318 07:43:50.325906 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mn4hj" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerName="registry-server" containerID="cri-o://46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b" gracePeriod=2 Mar 18 07:43:50 crc kubenswrapper[4917]: W0318 07:43:50.372854 4917 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24dd41e4_8078_4e46_8c97_f8308b36f403.slice/crio-conmon-46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b.scope/memory.min": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24dd41e4_8078_4e46_8c97_f8308b36f403.slice/crio-conmon-46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b.scope/memory.min: no such device Mar 18 07:43:50 crc kubenswrapper[4917]: I0318 07:43:50.839683 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:50 crc kubenswrapper[4917]: I0318 07:43:50.969284 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-catalog-content\") pod \"24dd41e4-8078-4e46-8c97-f8308b36f403\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " Mar 18 07:43:50 crc kubenswrapper[4917]: I0318 07:43:50.969412 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-utilities\") pod \"24dd41e4-8078-4e46-8c97-f8308b36f403\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " Mar 18 07:43:50 crc kubenswrapper[4917]: I0318 07:43:50.969570 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfkzw\" (UniqueName: \"kubernetes.io/projected/24dd41e4-8078-4e46-8c97-f8308b36f403-kube-api-access-mfkzw\") pod \"24dd41e4-8078-4e46-8c97-f8308b36f403\" (UID: \"24dd41e4-8078-4e46-8c97-f8308b36f403\") " Mar 18 07:43:50 crc kubenswrapper[4917]: I0318 07:43:50.970472 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-utilities" (OuterVolumeSpecName: "utilities") pod "24dd41e4-8078-4e46-8c97-f8308b36f403" (UID: "24dd41e4-8078-4e46-8c97-f8308b36f403"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:43:50 crc kubenswrapper[4917]: I0318 07:43:50.978821 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24dd41e4-8078-4e46-8c97-f8308b36f403-kube-api-access-mfkzw" (OuterVolumeSpecName: "kube-api-access-mfkzw") pod "24dd41e4-8078-4e46-8c97-f8308b36f403" (UID: "24dd41e4-8078-4e46-8c97-f8308b36f403"). InnerVolumeSpecName "kube-api-access-mfkzw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.005041 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "24dd41e4-8078-4e46-8c97-f8308b36f403" (UID: "24dd41e4-8078-4e46-8c97-f8308b36f403"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.071538 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfkzw\" (UniqueName: \"kubernetes.io/projected/24dd41e4-8078-4e46-8c97-f8308b36f403-kube-api-access-mfkzw\") on node \"crc\" DevicePath \"\"" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.071616 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.071633 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/24dd41e4-8078-4e46-8c97-f8308b36f403-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.335174 4917 generic.go:334] "Generic (PLEG): container finished" podID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerID="46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b" exitCode=0 Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.335228 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mn4hj" event={"ID":"24dd41e4-8078-4e46-8c97-f8308b36f403","Type":"ContainerDied","Data":"46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b"} Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.335267 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-mn4hj" event={"ID":"24dd41e4-8078-4e46-8c97-f8308b36f403","Type":"ContainerDied","Data":"701963c13b8b59953fc7607a8ef2763319c43a7b5b987d0e9ee4e89f2b3fed48"} Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.335275 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mn4hj" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.335288 4917 scope.go:117] "RemoveContainer" containerID="46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.354007 4917 scope.go:117] "RemoveContainer" containerID="c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.377691 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn4hj"] Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.382026 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mn4hj"] Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.395849 4917 scope.go:117] "RemoveContainer" containerID="984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.417824 4917 scope.go:117] "RemoveContainer" containerID="46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b" Mar 18 07:43:51 crc kubenswrapper[4917]: E0318 07:43:51.418312 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b\": container with ID starting with 46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b not found: ID does not exist" containerID="46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.418345 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b"} err="failed to get container status \"46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b\": rpc error: code = NotFound desc = could not find container \"46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b\": container with ID starting with 46c9e012508f847b0ffc433121ef82686c109bde72b7f077dad4aa29ee6dac3b not found: ID does not exist" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.418365 4917 scope.go:117] "RemoveContainer" containerID="c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326" Mar 18 07:43:51 crc kubenswrapper[4917]: E0318 07:43:51.418874 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326\": container with ID starting with c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326 not found: ID does not exist" containerID="c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.418898 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326"} err="failed to get container status \"c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326\": rpc error: code = NotFound desc = could not find container \"c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326\": container with ID starting with c809568a2d6be5f72e819aaaf144f9ce05bb2a4b4780f144bfcee43a7484e326 not found: ID does not exist" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.418911 4917 scope.go:117] "RemoveContainer" containerID="984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4" Mar 18 07:43:51 crc kubenswrapper[4917]: E0318 
07:43:51.419189 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4\": container with ID starting with 984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4 not found: ID does not exist" containerID="984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.419221 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4"} err="failed to get container status \"984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4\": rpc error: code = NotFound desc = could not find container \"984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4\": container with ID starting with 984095d8e70b153c794e388da61379a0cf36d16afe59d8a7121806b64f524fa4 not found: ID does not exist" Mar 18 07:43:51 crc kubenswrapper[4917]: I0318 07:43:51.790333 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" path="/var/lib/kubelet/pods/24dd41e4-8078-4e46-8c97-f8308b36f403/volumes" Mar 18 07:43:57 crc kubenswrapper[4917]: I0318 07:43:57.773970 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:43:57 crc kubenswrapper[4917]: E0318 07:43:57.774945 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.152308 
4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563664-98fdw"] Mar 18 07:44:00 crc kubenswrapper[4917]: E0318 07:44:00.152870 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerName="registry-server" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.152891 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerName="registry-server" Mar 18 07:44:00 crc kubenswrapper[4917]: E0318 07:44:00.152937 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerName="extract-utilities" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.152950 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerName="extract-utilities" Mar 18 07:44:00 crc kubenswrapper[4917]: E0318 07:44:00.152970 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerName="extract-content" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.152982 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerName="extract-content" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.153249 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="24dd41e4-8078-4e46-8c97-f8308b36f403" containerName="registry-server" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.154136 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563664-98fdw" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.156311 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.156477 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.156634 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.160571 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563664-98fdw"] Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.319511 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4t8l\" (UniqueName: \"kubernetes.io/projected/957d8712-e191-467e-8922-361ce4bc8c5a-kube-api-access-f4t8l\") pod \"auto-csr-approver-29563664-98fdw\" (UID: \"957d8712-e191-467e-8922-361ce4bc8c5a\") " pod="openshift-infra/auto-csr-approver-29563664-98fdw" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.421891 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4t8l\" (UniqueName: \"kubernetes.io/projected/957d8712-e191-467e-8922-361ce4bc8c5a-kube-api-access-f4t8l\") pod \"auto-csr-approver-29563664-98fdw\" (UID: \"957d8712-e191-467e-8922-361ce4bc8c5a\") " pod="openshift-infra/auto-csr-approver-29563664-98fdw" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.455865 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4t8l\" (UniqueName: \"kubernetes.io/projected/957d8712-e191-467e-8922-361ce4bc8c5a-kube-api-access-f4t8l\") pod \"auto-csr-approver-29563664-98fdw\" (UID: \"957d8712-e191-467e-8922-361ce4bc8c5a\") " 
pod="openshift-infra/auto-csr-approver-29563664-98fdw" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.472047 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563664-98fdw" Mar 18 07:44:00 crc kubenswrapper[4917]: I0318 07:44:00.950199 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563664-98fdw"] Mar 18 07:44:01 crc kubenswrapper[4917]: I0318 07:44:01.426929 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563664-98fdw" event={"ID":"957d8712-e191-467e-8922-361ce4bc8c5a","Type":"ContainerStarted","Data":"57b5f258f7ea36a44d1f99f7c9b140a0e748134bf4a32460b146d14a457c79f9"} Mar 18 07:44:02 crc kubenswrapper[4917]: I0318 07:44:02.436779 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563664-98fdw" event={"ID":"957d8712-e191-467e-8922-361ce4bc8c5a","Type":"ContainerStarted","Data":"2ac531baf53f527f8f2e700ee12d7dd3b068fd9fcdc9f5bb21c702ce853aedc0"} Mar 18 07:44:03 crc kubenswrapper[4917]: I0318 07:44:03.447549 4917 generic.go:334] "Generic (PLEG): container finished" podID="957d8712-e191-467e-8922-361ce4bc8c5a" containerID="2ac531baf53f527f8f2e700ee12d7dd3b068fd9fcdc9f5bb21c702ce853aedc0" exitCode=0 Mar 18 07:44:03 crc kubenswrapper[4917]: I0318 07:44:03.447665 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563664-98fdw" event={"ID":"957d8712-e191-467e-8922-361ce4bc8c5a","Type":"ContainerDied","Data":"2ac531baf53f527f8f2e700ee12d7dd3b068fd9fcdc9f5bb21c702ce853aedc0"} Mar 18 07:44:04 crc kubenswrapper[4917]: I0318 07:44:04.783189 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563664-98fdw" Mar 18 07:44:04 crc kubenswrapper[4917]: I0318 07:44:04.906876 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4t8l\" (UniqueName: \"kubernetes.io/projected/957d8712-e191-467e-8922-361ce4bc8c5a-kube-api-access-f4t8l\") pod \"957d8712-e191-467e-8922-361ce4bc8c5a\" (UID: \"957d8712-e191-467e-8922-361ce4bc8c5a\") " Mar 18 07:44:04 crc kubenswrapper[4917]: I0318 07:44:04.914964 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957d8712-e191-467e-8922-361ce4bc8c5a-kube-api-access-f4t8l" (OuterVolumeSpecName: "kube-api-access-f4t8l") pod "957d8712-e191-467e-8922-361ce4bc8c5a" (UID: "957d8712-e191-467e-8922-361ce4bc8c5a"). InnerVolumeSpecName "kube-api-access-f4t8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:44:05 crc kubenswrapper[4917]: I0318 07:44:05.012785 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4t8l\" (UniqueName: \"kubernetes.io/projected/957d8712-e191-467e-8922-361ce4bc8c5a-kube-api-access-f4t8l\") on node \"crc\" DevicePath \"\"" Mar 18 07:44:05 crc kubenswrapper[4917]: I0318 07:44:05.469804 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563664-98fdw" event={"ID":"957d8712-e191-467e-8922-361ce4bc8c5a","Type":"ContainerDied","Data":"57b5f258f7ea36a44d1f99f7c9b140a0e748134bf4a32460b146d14a457c79f9"} Mar 18 07:44:05 crc kubenswrapper[4917]: I0318 07:44:05.469862 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b5f258f7ea36a44d1f99f7c9b140a0e748134bf4a32460b146d14a457c79f9" Mar 18 07:44:05 crc kubenswrapper[4917]: I0318 07:44:05.469896 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563664-98fdw" Mar 18 07:44:05 crc kubenswrapper[4917]: I0318 07:44:05.545644 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563658-8fvsk"] Mar 18 07:44:05 crc kubenswrapper[4917]: I0318 07:44:05.555442 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563658-8fvsk"] Mar 18 07:44:05 crc kubenswrapper[4917]: I0318 07:44:05.784933 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3791801-7221-420a-84a6-75728ad54b63" path="/var/lib/kubelet/pods/f3791801-7221-420a-84a6-75728ad54b63/volumes" Mar 18 07:44:12 crc kubenswrapper[4917]: I0318 07:44:12.772486 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:44:12 crc kubenswrapper[4917]: E0318 07:44:12.773524 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:44:24 crc kubenswrapper[4917]: I0318 07:44:24.772795 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:44:24 crc kubenswrapper[4917]: E0318 07:44:24.773865 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:44:39 crc kubenswrapper[4917]: I0318 07:44:39.773425 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:44:39 crc kubenswrapper[4917]: E0318 07:44:39.774653 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.573222 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fktlg"] Mar 18 07:44:42 crc kubenswrapper[4917]: E0318 07:44:42.574269 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="957d8712-e191-467e-8922-361ce4bc8c5a" containerName="oc" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.574303 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="957d8712-e191-467e-8922-361ce4bc8c5a" containerName="oc" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.574718 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="957d8712-e191-467e-8922-361ce4bc8c5a" containerName="oc" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.576987 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.587075 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fktlg"] Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.693087 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-utilities\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.693298 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-764r2\" (UniqueName: \"kubernetes.io/projected/0878ce2f-dd9c-438f-a951-3a6aee229a69-kube-api-access-764r2\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.693783 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-catalog-content\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.794820 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-utilities\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.794914 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-764r2\" (UniqueName: \"kubernetes.io/projected/0878ce2f-dd9c-438f-a951-3a6aee229a69-kube-api-access-764r2\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.795024 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-catalog-content\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.795430 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-utilities\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.795559 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-catalog-content\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.817866 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-764r2\" (UniqueName: \"kubernetes.io/projected/0878ce2f-dd9c-438f-a951-3a6aee229a69-kube-api-access-764r2\") pod \"community-operators-fktlg\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:42 crc kubenswrapper[4917]: I0318 07:44:42.901914 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:43 crc kubenswrapper[4917]: I0318 07:44:43.444787 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fktlg"] Mar 18 07:44:43 crc kubenswrapper[4917]: I0318 07:44:43.827132 4917 generic.go:334] "Generic (PLEG): container finished" podID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerID="69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d" exitCode=0 Mar 18 07:44:43 crc kubenswrapper[4917]: I0318 07:44:43.827189 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fktlg" event={"ID":"0878ce2f-dd9c-438f-a951-3a6aee229a69","Type":"ContainerDied","Data":"69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d"} Mar 18 07:44:43 crc kubenswrapper[4917]: I0318 07:44:43.827220 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fktlg" event={"ID":"0878ce2f-dd9c-438f-a951-3a6aee229a69","Type":"ContainerStarted","Data":"a204236c5a62e65158a32b0ac63262b38feadb028b5fc79607d7c0d320524ee8"} Mar 18 07:44:44 crc kubenswrapper[4917]: I0318 07:44:44.846345 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fktlg" event={"ID":"0878ce2f-dd9c-438f-a951-3a6aee229a69","Type":"ContainerStarted","Data":"95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38"} Mar 18 07:44:45 crc kubenswrapper[4917]: I0318 07:44:45.856932 4917 generic.go:334] "Generic (PLEG): container finished" podID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerID="95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38" exitCode=0 Mar 18 07:44:45 crc kubenswrapper[4917]: I0318 07:44:45.857005 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fktlg" 
event={"ID":"0878ce2f-dd9c-438f-a951-3a6aee229a69","Type":"ContainerDied","Data":"95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38"} Mar 18 07:44:46 crc kubenswrapper[4917]: I0318 07:44:46.869202 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fktlg" event={"ID":"0878ce2f-dd9c-438f-a951-3a6aee229a69","Type":"ContainerStarted","Data":"d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3"} Mar 18 07:44:46 crc kubenswrapper[4917]: I0318 07:44:46.901429 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fktlg" podStartSLOduration=2.146452633 podStartE2EDuration="4.901407934s" podCreationTimestamp="2026-03-18 07:44:42 +0000 UTC" firstStartedPulling="2026-03-18 07:44:43.828652078 +0000 UTC m=+3468.769806792" lastFinishedPulling="2026-03-18 07:44:46.583607379 +0000 UTC m=+3471.524762093" observedRunningTime="2026-03-18 07:44:46.898154555 +0000 UTC m=+3471.839309289" watchObservedRunningTime="2026-03-18 07:44:46.901407934 +0000 UTC m=+3471.842562648" Mar 18 07:44:50 crc kubenswrapper[4917]: I0318 07:44:50.772826 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:44:50 crc kubenswrapper[4917]: E0318 07:44:50.773548 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:44:52 crc kubenswrapper[4917]: I0318 07:44:52.902545 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:52 crc 
kubenswrapper[4917]: I0318 07:44:52.902941 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:52 crc kubenswrapper[4917]: I0318 07:44:52.950992 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:53 crc kubenswrapper[4917]: I0318 07:44:53.009367 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:53 crc kubenswrapper[4917]: I0318 07:44:53.204716 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fktlg"] Mar 18 07:44:54 crc kubenswrapper[4917]: I0318 07:44:54.936687 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fktlg" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerName="registry-server" containerID="cri-o://d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3" gracePeriod=2 Mar 18 07:44:55 crc kubenswrapper[4917]: I0318 07:44:55.919435 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:55 crc kubenswrapper[4917]: I0318 07:44:55.963659 4917 generic.go:334] "Generic (PLEG): container finished" podID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerID="d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3" exitCode=0 Mar 18 07:44:55 crc kubenswrapper[4917]: I0318 07:44:55.963713 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fktlg" event={"ID":"0878ce2f-dd9c-438f-a951-3a6aee229a69","Type":"ContainerDied","Data":"d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3"} Mar 18 07:44:55 crc kubenswrapper[4917]: I0318 07:44:55.963751 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fktlg" event={"ID":"0878ce2f-dd9c-438f-a951-3a6aee229a69","Type":"ContainerDied","Data":"a204236c5a62e65158a32b0ac63262b38feadb028b5fc79607d7c0d320524ee8"} Mar 18 07:44:55 crc kubenswrapper[4917]: I0318 07:44:55.963778 4917 scope.go:117] "RemoveContainer" containerID="d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3" Mar 18 07:44:55 crc kubenswrapper[4917]: I0318 07:44:55.963872 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fktlg" Mar 18 07:44:55 crc kubenswrapper[4917]: I0318 07:44:55.987531 4917 scope.go:117] "RemoveContainer" containerID="95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.030462 4917 scope.go:117] "RemoveContainer" containerID="69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.047006 4917 scope.go:117] "RemoveContainer" containerID="d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3" Mar 18 07:44:56 crc kubenswrapper[4917]: E0318 07:44:56.047533 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3\": container with ID starting with d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3 not found: ID does not exist" containerID="d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.047574 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3"} err="failed to get container status \"d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3\": rpc error: code = NotFound desc = could not find container \"d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3\": container with ID starting with d6d2a8e56a1d9541bcc6c04cafefc0520ee66b2b66c9ca25611ac51402b4bda3 not found: ID does not exist" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.047636 4917 scope.go:117] "RemoveContainer" containerID="95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38" Mar 18 07:44:56 crc kubenswrapper[4917]: E0318 07:44:56.047941 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38\": container with ID starting with 95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38 not found: ID does not exist" containerID="95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.047974 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38"} err="failed to get container status \"95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38\": rpc error: code = NotFound desc = could not find container \"95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38\": container with ID starting with 95b37639b770d64400b9f2ea09c3a1c509ab8d45cf062861f70fa7b840af1c38 not found: ID does not exist" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.047992 4917 scope.go:117] "RemoveContainer" containerID="69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d" Mar 18 07:44:56 crc kubenswrapper[4917]: E0318 07:44:56.048264 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d\": container with ID starting with 69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d not found: ID does not exist" containerID="69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.048286 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d"} err="failed to get container status \"69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d\": rpc error: code = NotFound desc = could not find container 
\"69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d\": container with ID starting with 69d999ebb93b3d1a69194d0d3c0e4fb4280414b5618353ae472bc31d7c5fb29d not found: ID does not exist" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.108757 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-utilities\") pod \"0878ce2f-dd9c-438f-a951-3a6aee229a69\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.108922 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-764r2\" (UniqueName: \"kubernetes.io/projected/0878ce2f-dd9c-438f-a951-3a6aee229a69-kube-api-access-764r2\") pod \"0878ce2f-dd9c-438f-a951-3a6aee229a69\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.108949 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-catalog-content\") pod \"0878ce2f-dd9c-438f-a951-3a6aee229a69\" (UID: \"0878ce2f-dd9c-438f-a951-3a6aee229a69\") " Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.110842 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-utilities" (OuterVolumeSpecName: "utilities") pod "0878ce2f-dd9c-438f-a951-3a6aee229a69" (UID: "0878ce2f-dd9c-438f-a951-3a6aee229a69"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.113080 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0878ce2f-dd9c-438f-a951-3a6aee229a69-kube-api-access-764r2" (OuterVolumeSpecName: "kube-api-access-764r2") pod "0878ce2f-dd9c-438f-a951-3a6aee229a69" (UID: "0878ce2f-dd9c-438f-a951-3a6aee229a69"). InnerVolumeSpecName "kube-api-access-764r2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.180231 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0878ce2f-dd9c-438f-a951-3a6aee229a69" (UID: "0878ce2f-dd9c-438f-a951-3a6aee229a69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.211090 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.211148 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-764r2\" (UniqueName: \"kubernetes.io/projected/0878ce2f-dd9c-438f-a951-3a6aee229a69-kube-api-access-764r2\") on node \"crc\" DevicePath \"\"" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.211168 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0878ce2f-dd9c-438f-a951-3a6aee229a69-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 07:44:56.301854 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fktlg"] Mar 18 07:44:56 crc kubenswrapper[4917]: I0318 
07:44:56.308758 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fktlg"] Mar 18 07:44:57 crc kubenswrapper[4917]: I0318 07:44:57.779512 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" path="/var/lib/kubelet/pods/0878ce2f-dd9c-438f-a951-3a6aee229a69/volumes" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.620110 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-59ntb"] Mar 18 07:44:58 crc kubenswrapper[4917]: E0318 07:44:58.620544 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerName="extract-content" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.620615 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerName="extract-content" Mar 18 07:44:58 crc kubenswrapper[4917]: E0318 07:44:58.620648 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerName="extract-utilities" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.620662 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerName="extract-utilities" Mar 18 07:44:58 crc kubenswrapper[4917]: E0318 07:44:58.620702 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerName="registry-server" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.620713 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerName="registry-server" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.620963 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0878ce2f-dd9c-438f-a951-3a6aee229a69" containerName="registry-server" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 
07:44:58.622536 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.631232 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-59ntb"] Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.748315 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l84db\" (UniqueName: \"kubernetes.io/projected/8aefcbde-ba2b-4748-a35f-c755e8aecf21-kube-api-access-l84db\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.748527 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-utilities\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.748647 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-catalog-content\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.850354 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l84db\" (UniqueName: \"kubernetes.io/projected/8aefcbde-ba2b-4748-a35f-c755e8aecf21-kube-api-access-l84db\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 
07:44:58.850431 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-utilities\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.851106 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-catalog-content\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.851383 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-utilities\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.851492 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-catalog-content\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.871247 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l84db\" (UniqueName: \"kubernetes.io/projected/8aefcbde-ba2b-4748-a35f-c755e8aecf21-kube-api-access-l84db\") pod \"redhat-operators-59ntb\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:58 crc kubenswrapper[4917]: I0318 07:44:58.955988 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:44:59 crc kubenswrapper[4917]: I0318 07:44:59.442809 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-59ntb"] Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.005688 4917 generic.go:334] "Generic (PLEG): container finished" podID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerID="ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22" exitCode=0 Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.005745 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59ntb" event={"ID":"8aefcbde-ba2b-4748-a35f-c755e8aecf21","Type":"ContainerDied","Data":"ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22"} Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.005776 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59ntb" event={"ID":"8aefcbde-ba2b-4748-a35f-c755e8aecf21","Type":"ContainerStarted","Data":"990fdde3e4484154d6ff74c22e9eebb0a52d1376efffb563dd02e60e51063653"} Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.141258 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h"] Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.142051 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.145459 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.151881 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.154074 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h"] Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.167721 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfbst\" (UniqueName: \"kubernetes.io/projected/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-kube-api-access-vfbst\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.167896 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-config-volume\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.167931 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-secret-volume\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.269308 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-config-volume\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.269394 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-secret-volume\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.269511 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfbst\" (UniqueName: \"kubernetes.io/projected/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-kube-api-access-vfbst\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.270794 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-config-volume\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.285729 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-secret-volume\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.296712 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfbst\" (UniqueName: \"kubernetes.io/projected/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-kube-api-access-vfbst\") pod \"collect-profiles-29563665-dr46h\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.456075 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:00 crc kubenswrapper[4917]: I0318 07:45:00.929238 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h"] Mar 18 07:45:01 crc kubenswrapper[4917]: I0318 07:45:01.015043 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" event={"ID":"c9563789-4f20-4653-a02c-3b1aeb3b7cd7","Type":"ContainerStarted","Data":"d2fa96d3176956de392f07fb4e269d1e51a8d93ca9c184b421b4b5c1359aec08"} Mar 18 07:45:01 crc kubenswrapper[4917]: I0318 07:45:01.774933 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:45:01 crc kubenswrapper[4917]: E0318 07:45:01.775463 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:45:02 crc kubenswrapper[4917]: I0318 07:45:02.022523 4917 generic.go:334] "Generic (PLEG): container finished" podID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerID="46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7" exitCode=0 Mar 18 07:45:02 crc kubenswrapper[4917]: I0318 07:45:02.022597 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59ntb" event={"ID":"8aefcbde-ba2b-4748-a35f-c755e8aecf21","Type":"ContainerDied","Data":"46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7"} Mar 18 07:45:02 crc kubenswrapper[4917]: I0318 07:45:02.025211 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:45:02 crc kubenswrapper[4917]: I0318 07:45:02.026842 4917 generic.go:334] "Generic (PLEG): container finished" podID="c9563789-4f20-4653-a02c-3b1aeb3b7cd7" containerID="26d4875fd22bf23619e35eb98d60c380db5075bdbd1e5fcb71b11be4033bfdeb" exitCode=0 Mar 18 07:45:02 crc kubenswrapper[4917]: I0318 07:45:02.026902 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" event={"ID":"c9563789-4f20-4653-a02c-3b1aeb3b7cd7","Type":"ContainerDied","Data":"26d4875fd22bf23619e35eb98d60c380db5075bdbd1e5fcb71b11be4033bfdeb"} Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.040052 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59ntb" event={"ID":"8aefcbde-ba2b-4748-a35f-c755e8aecf21","Type":"ContainerStarted","Data":"bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64"} Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.077575 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-59ntb" podStartSLOduration=2.550117025 
podStartE2EDuration="5.077551269s" podCreationTimestamp="2026-03-18 07:44:58 +0000 UTC" firstStartedPulling="2026-03-18 07:45:00.008056301 +0000 UTC m=+3484.949211015" lastFinishedPulling="2026-03-18 07:45:02.535490515 +0000 UTC m=+3487.476645259" observedRunningTime="2026-03-18 07:45:03.06935989 +0000 UTC m=+3488.010514624" watchObservedRunningTime="2026-03-18 07:45:03.077551269 +0000 UTC m=+3488.018706023" Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.345149 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.521396 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfbst\" (UniqueName: \"kubernetes.io/projected/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-kube-api-access-vfbst\") pod \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.521440 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-config-volume\") pod \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.521482 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-secret-volume\") pod \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\" (UID: \"c9563789-4f20-4653-a02c-3b1aeb3b7cd7\") " Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.522450 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-config-volume" (OuterVolumeSpecName: "config-volume") pod "c9563789-4f20-4653-a02c-3b1aeb3b7cd7" 
(UID: "c9563789-4f20-4653-a02c-3b1aeb3b7cd7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.528836 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-kube-api-access-vfbst" (OuterVolumeSpecName: "kube-api-access-vfbst") pod "c9563789-4f20-4653-a02c-3b1aeb3b7cd7" (UID: "c9563789-4f20-4653-a02c-3b1aeb3b7cd7"). InnerVolumeSpecName "kube-api-access-vfbst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.531820 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c9563789-4f20-4653-a02c-3b1aeb3b7cd7" (UID: "c9563789-4f20-4653-a02c-3b1aeb3b7cd7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.623353 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfbst\" (UniqueName: \"kubernetes.io/projected/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-kube-api-access-vfbst\") on node \"crc\" DevicePath \"\"" Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.623392 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 07:45:03 crc kubenswrapper[4917]: I0318 07:45:03.623407 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c9563789-4f20-4653-a02c-3b1aeb3b7cd7-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 07:45:04 crc kubenswrapper[4917]: I0318 07:45:04.050903 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" Mar 18 07:45:04 crc kubenswrapper[4917]: I0318 07:45:04.050886 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h" event={"ID":"c9563789-4f20-4653-a02c-3b1aeb3b7cd7","Type":"ContainerDied","Data":"d2fa96d3176956de392f07fb4e269d1e51a8d93ca9c184b421b4b5c1359aec08"} Mar 18 07:45:04 crc kubenswrapper[4917]: I0318 07:45:04.050986 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2fa96d3176956de392f07fb4e269d1e51a8d93ca9c184b421b4b5c1359aec08" Mar 18 07:45:04 crc kubenswrapper[4917]: I0318 07:45:04.449250 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl"] Mar 18 07:45:04 crc kubenswrapper[4917]: I0318 07:45:04.458075 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563620-pnxkl"] Mar 18 07:45:04 crc kubenswrapper[4917]: I0318 07:45:04.948275 4917 scope.go:117] "RemoveContainer" containerID="74ed507b63616512b1c664653ab221f289729d28080000637aa148de13c895c4" Mar 18 07:45:05 crc kubenswrapper[4917]: I0318 07:45:05.791310 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11225532-26d4-47aa-80bc-077720b847bd" path="/var/lib/kubelet/pods/11225532-26d4-47aa-80bc-077720b847bd/volumes" Mar 18 07:45:08 crc kubenswrapper[4917]: I0318 07:45:08.956367 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:45:08 crc kubenswrapper[4917]: I0318 07:45:08.956933 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:45:10 crc kubenswrapper[4917]: I0318 07:45:10.008556 4917 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-59ntb" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerName="registry-server" probeResult="failure" output=< Mar 18 07:45:10 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 07:45:10 crc kubenswrapper[4917]: > Mar 18 07:45:14 crc kubenswrapper[4917]: I0318 07:45:14.772533 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:45:14 crc kubenswrapper[4917]: E0318 07:45:14.773541 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:45:19 crc kubenswrapper[4917]: I0318 07:45:19.021001 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:45:19 crc kubenswrapper[4917]: I0318 07:45:19.078368 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:45:19 crc kubenswrapper[4917]: I0318 07:45:19.264508 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-59ntb"] Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.193944 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-59ntb" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerName="registry-server" containerID="cri-o://bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64" gracePeriod=2 Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.630906 4917 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.712525 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-utilities\") pod \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.712619 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l84db\" (UniqueName: \"kubernetes.io/projected/8aefcbde-ba2b-4748-a35f-c755e8aecf21-kube-api-access-l84db\") pod \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.712647 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-catalog-content\") pod \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\" (UID: \"8aefcbde-ba2b-4748-a35f-c755e8aecf21\") " Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.713889 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-utilities" (OuterVolumeSpecName: "utilities") pod "8aefcbde-ba2b-4748-a35f-c755e8aecf21" (UID: "8aefcbde-ba2b-4748-a35f-c755e8aecf21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.718738 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aefcbde-ba2b-4748-a35f-c755e8aecf21-kube-api-access-l84db" (OuterVolumeSpecName: "kube-api-access-l84db") pod "8aefcbde-ba2b-4748-a35f-c755e8aecf21" (UID: "8aefcbde-ba2b-4748-a35f-c755e8aecf21"). InnerVolumeSpecName "kube-api-access-l84db". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.813725 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.813757 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l84db\" (UniqueName: \"kubernetes.io/projected/8aefcbde-ba2b-4748-a35f-c755e8aecf21-kube-api-access-l84db\") on node \"crc\" DevicePath \"\"" Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.840783 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8aefcbde-ba2b-4748-a35f-c755e8aecf21" (UID: "8aefcbde-ba2b-4748-a35f-c755e8aecf21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:45:20 crc kubenswrapper[4917]: I0318 07:45:20.916391 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8aefcbde-ba2b-4748-a35f-c755e8aecf21-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.213972 4917 generic.go:334] "Generic (PLEG): container finished" podID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerID="bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64" exitCode=0 Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.214028 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59ntb" event={"ID":"8aefcbde-ba2b-4748-a35f-c755e8aecf21","Type":"ContainerDied","Data":"bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64"} Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.214064 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-59ntb" event={"ID":"8aefcbde-ba2b-4748-a35f-c755e8aecf21","Type":"ContainerDied","Data":"990fdde3e4484154d6ff74c22e9eebb0a52d1376efffb563dd02e60e51063653"} Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.214085 4917 scope.go:117] "RemoveContainer" containerID="bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.214129 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59ntb" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.240657 4917 scope.go:117] "RemoveContainer" containerID="46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.264189 4917 scope.go:117] "RemoveContainer" containerID="ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.264406 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-59ntb"] Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.269752 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-59ntb"] Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.295741 4917 scope.go:117] "RemoveContainer" containerID="bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64" Mar 18 07:45:21 crc kubenswrapper[4917]: E0318 07:45:21.296185 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64\": container with ID starting with bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64 not found: ID does not exist" containerID="bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.296226 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64"} err="failed to get container status \"bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64\": rpc error: code = NotFound desc = could not find container \"bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64\": container with ID starting with bf62463a8ff8145968e6a575cbfff013b493b3bfddcb8e9dd414e43af3ec0a64 not found: ID does not exist" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.296253 4917 scope.go:117] "RemoveContainer" containerID="46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7" Mar 18 07:45:21 crc kubenswrapper[4917]: E0318 07:45:21.296678 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7\": container with ID starting with 46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7 not found: ID does not exist" containerID="46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.296712 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7"} err="failed to get container status \"46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7\": rpc error: code = NotFound desc = could not find container \"46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7\": container with ID starting with 46ce509df20963efbc000eae97f4f44f56bd2d6e523e3cd5272fbf4316f7b0b7 not found: ID does not exist" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.296732 4917 scope.go:117] "RemoveContainer" containerID="ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22" Mar 18 07:45:21 crc kubenswrapper[4917]: E0318 
07:45:21.298720 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22\": container with ID starting with ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22 not found: ID does not exist" containerID="ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.298752 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22"} err="failed to get container status \"ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22\": rpc error: code = NotFound desc = could not find container \"ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22\": container with ID starting with ad8fa32b040a9ed5163556ba2263085e0651c978223c3adf94cd7d26eba9ea22 not found: ID does not exist" Mar 18 07:45:21 crc kubenswrapper[4917]: I0318 07:45:21.783179 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" path="/var/lib/kubelet/pods/8aefcbde-ba2b-4748-a35f-c755e8aecf21/volumes" Mar 18 07:45:27 crc kubenswrapper[4917]: I0318 07:45:27.773382 4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:45:27 crc kubenswrapper[4917]: E0318 07:45:27.774109 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:45:39 crc kubenswrapper[4917]: I0318 07:45:39.772924 
4917 scope.go:117] "RemoveContainer" containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:45:40 crc kubenswrapper[4917]: I0318 07:45:40.392545 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"8faa40aa240f88d8f2e1df0c17f7ed1f5c6cafdc5093f3fd5b8b41e2a9662cc6"} Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.153798 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563666-tcbjv"] Mar 18 07:46:00 crc kubenswrapper[4917]: E0318 07:46:00.155461 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9563789-4f20-4653-a02c-3b1aeb3b7cd7" containerName="collect-profiles" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.155533 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9563789-4f20-4653-a02c-3b1aeb3b7cd7" containerName="collect-profiles" Mar 18 07:46:00 crc kubenswrapper[4917]: E0318 07:46:00.155595 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerName="extract-content" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.155683 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerName="extract-content" Mar 18 07:46:00 crc kubenswrapper[4917]: E0318 07:46:00.155746 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerName="extract-utilities" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.155797 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerName="extract-utilities" Mar 18 07:46:00 crc kubenswrapper[4917]: E0318 07:46:00.155863 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" 
containerName="registry-server" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.155916 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerName="registry-server" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.156097 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8aefcbde-ba2b-4748-a35f-c755e8aecf21" containerName="registry-server" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.156172 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9563789-4f20-4653-a02c-3b1aeb3b7cd7" containerName="collect-profiles" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.156687 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563666-tcbjv" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.156864 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563666-tcbjv"] Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.158867 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.159121 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.160673 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.277588 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wwcp\" (UniqueName: \"kubernetes.io/projected/0016906f-6e64-4356-99cb-a86118113691-kube-api-access-8wwcp\") pod \"auto-csr-approver-29563666-tcbjv\" (UID: \"0016906f-6e64-4356-99cb-a86118113691\") " pod="openshift-infra/auto-csr-approver-29563666-tcbjv" Mar 18 07:46:00 crc 
kubenswrapper[4917]: I0318 07:46:00.379408 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wwcp\" (UniqueName: \"kubernetes.io/projected/0016906f-6e64-4356-99cb-a86118113691-kube-api-access-8wwcp\") pod \"auto-csr-approver-29563666-tcbjv\" (UID: \"0016906f-6e64-4356-99cb-a86118113691\") " pod="openshift-infra/auto-csr-approver-29563666-tcbjv" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.397221 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wwcp\" (UniqueName: \"kubernetes.io/projected/0016906f-6e64-4356-99cb-a86118113691-kube-api-access-8wwcp\") pod \"auto-csr-approver-29563666-tcbjv\" (UID: \"0016906f-6e64-4356-99cb-a86118113691\") " pod="openshift-infra/auto-csr-approver-29563666-tcbjv" Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.473439 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563666-tcbjv" Mar 18 07:46:00 crc kubenswrapper[4917]: W0318 07:46:00.937553 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0016906f_6e64_4356_99cb_a86118113691.slice/crio-e5f6e1675ff3163387de58f7ede90e43771b72fe3be2bded111a7150fe042c7b WatchSource:0}: Error finding container e5f6e1675ff3163387de58f7ede90e43771b72fe3be2bded111a7150fe042c7b: Status 404 returned error can't find the container with id e5f6e1675ff3163387de58f7ede90e43771b72fe3be2bded111a7150fe042c7b Mar 18 07:46:00 crc kubenswrapper[4917]: I0318 07:46:00.937835 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563666-tcbjv"] Mar 18 07:46:01 crc kubenswrapper[4917]: I0318 07:46:01.572921 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563666-tcbjv" 
event={"ID":"0016906f-6e64-4356-99cb-a86118113691","Type":"ContainerStarted","Data":"e5f6e1675ff3163387de58f7ede90e43771b72fe3be2bded111a7150fe042c7b"} Mar 18 07:46:02 crc kubenswrapper[4917]: I0318 07:46:02.579334 4917 generic.go:334] "Generic (PLEG): container finished" podID="0016906f-6e64-4356-99cb-a86118113691" containerID="f4536d57aa3db97910e14acf15dc543807957d6e5abf0cce9483bce33aebadde" exitCode=0 Mar 18 07:46:02 crc kubenswrapper[4917]: I0318 07:46:02.579402 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563666-tcbjv" event={"ID":"0016906f-6e64-4356-99cb-a86118113691","Type":"ContainerDied","Data":"f4536d57aa3db97910e14acf15dc543807957d6e5abf0cce9483bce33aebadde"} Mar 18 07:46:03 crc kubenswrapper[4917]: I0318 07:46:03.887456 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563666-tcbjv" Mar 18 07:46:04 crc kubenswrapper[4917]: I0318 07:46:04.035190 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wwcp\" (UniqueName: \"kubernetes.io/projected/0016906f-6e64-4356-99cb-a86118113691-kube-api-access-8wwcp\") pod \"0016906f-6e64-4356-99cb-a86118113691\" (UID: \"0016906f-6e64-4356-99cb-a86118113691\") " Mar 18 07:46:04 crc kubenswrapper[4917]: I0318 07:46:04.043091 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0016906f-6e64-4356-99cb-a86118113691-kube-api-access-8wwcp" (OuterVolumeSpecName: "kube-api-access-8wwcp") pod "0016906f-6e64-4356-99cb-a86118113691" (UID: "0016906f-6e64-4356-99cb-a86118113691"). InnerVolumeSpecName "kube-api-access-8wwcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:46:04 crc kubenswrapper[4917]: I0318 07:46:04.137090 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wwcp\" (UniqueName: \"kubernetes.io/projected/0016906f-6e64-4356-99cb-a86118113691-kube-api-access-8wwcp\") on node \"crc\" DevicePath \"\"" Mar 18 07:46:04 crc kubenswrapper[4917]: I0318 07:46:04.603988 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563666-tcbjv" event={"ID":"0016906f-6e64-4356-99cb-a86118113691","Type":"ContainerDied","Data":"e5f6e1675ff3163387de58f7ede90e43771b72fe3be2bded111a7150fe042c7b"} Mar 18 07:46:04 crc kubenswrapper[4917]: I0318 07:46:04.604402 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5f6e1675ff3163387de58f7ede90e43771b72fe3be2bded111a7150fe042c7b" Mar 18 07:46:04 crc kubenswrapper[4917]: I0318 07:46:04.604089 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563666-tcbjv" Mar 18 07:46:04 crc kubenswrapper[4917]: I0318 07:46:04.969476 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563660-cdgnk"] Mar 18 07:46:04 crc kubenswrapper[4917]: I0318 07:46:04.979834 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563660-cdgnk"] Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.033867 4917 scope.go:117] "RemoveContainer" containerID="431fdcead3cff13ff87c87a4261522916c371ae85df4343ab3dfc8b6632e3303" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.060826 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qzh2n"] Mar 18 07:46:05 crc kubenswrapper[4917]: E0318 07:46:05.061683 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0016906f-6e64-4356-99cb-a86118113691" containerName="oc" Mar 18 07:46:05 crc 
kubenswrapper[4917]: I0318 07:46:05.061886 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0016906f-6e64-4356-99cb-a86118113691" containerName="oc" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.062395 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0016906f-6e64-4356-99cb-a86118113691" containerName="oc" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.064801 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.107530 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzh2n"] Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.154149 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-catalog-content\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.154209 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-utilities\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.154239 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7xjb\" (UniqueName: \"kubernetes.io/projected/34374b8b-2c3d-48e2-b795-65bc898e1963-kube-api-access-z7xjb\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc 
kubenswrapper[4917]: I0318 07:46:05.255486 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-catalog-content\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.255545 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-utilities\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.255573 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7xjb\" (UniqueName: \"kubernetes.io/projected/34374b8b-2c3d-48e2-b795-65bc898e1963-kube-api-access-z7xjb\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.256084 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-catalog-content\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.256143 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-utilities\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.275008 
4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7xjb\" (UniqueName: \"kubernetes.io/projected/34374b8b-2c3d-48e2-b795-65bc898e1963-kube-api-access-z7xjb\") pod \"certified-operators-qzh2n\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.402468 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.670389 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qzh2n"] Mar 18 07:46:05 crc kubenswrapper[4917]: I0318 07:46:05.790726 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0697d2dc-dada-4b49-a201-0731534c8a59" path="/var/lib/kubelet/pods/0697d2dc-dada-4b49-a201-0731534c8a59/volumes" Mar 18 07:46:06 crc kubenswrapper[4917]: I0318 07:46:06.621611 4917 generic.go:334] "Generic (PLEG): container finished" podID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerID="6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf" exitCode=0 Mar 18 07:46:06 crc kubenswrapper[4917]: I0318 07:46:06.621666 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzh2n" event={"ID":"34374b8b-2c3d-48e2-b795-65bc898e1963","Type":"ContainerDied","Data":"6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf"} Mar 18 07:46:06 crc kubenswrapper[4917]: I0318 07:46:06.621874 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzh2n" event={"ID":"34374b8b-2c3d-48e2-b795-65bc898e1963","Type":"ContainerStarted","Data":"d5664055c6f7607293af2cae1acd5b8fc48d2164683ad9da9490c451cb178642"} Mar 18 07:46:07 crc kubenswrapper[4917]: I0318 07:46:07.635177 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-qzh2n" event={"ID":"34374b8b-2c3d-48e2-b795-65bc898e1963","Type":"ContainerStarted","Data":"6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b"} Mar 18 07:46:08 crc kubenswrapper[4917]: I0318 07:46:08.649907 4917 generic.go:334] "Generic (PLEG): container finished" podID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerID="6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b" exitCode=0 Mar 18 07:46:08 crc kubenswrapper[4917]: I0318 07:46:08.649996 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzh2n" event={"ID":"34374b8b-2c3d-48e2-b795-65bc898e1963","Type":"ContainerDied","Data":"6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b"} Mar 18 07:46:09 crc kubenswrapper[4917]: I0318 07:46:09.664390 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzh2n" event={"ID":"34374b8b-2c3d-48e2-b795-65bc898e1963","Type":"ContainerStarted","Data":"e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8"} Mar 18 07:46:09 crc kubenswrapper[4917]: I0318 07:46:09.692186 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qzh2n" podStartSLOduration=1.961041891 podStartE2EDuration="4.692169394s" podCreationTimestamp="2026-03-18 07:46:05 +0000 UTC" firstStartedPulling="2026-03-18 07:46:06.623927976 +0000 UTC m=+3551.565082720" lastFinishedPulling="2026-03-18 07:46:09.355055509 +0000 UTC m=+3554.296210223" observedRunningTime="2026-03-18 07:46:09.689645042 +0000 UTC m=+3554.630799796" watchObservedRunningTime="2026-03-18 07:46:09.692169394 +0000 UTC m=+3554.633324118" Mar 18 07:46:15 crc kubenswrapper[4917]: I0318 07:46:15.403778 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:15 crc kubenswrapper[4917]: I0318 07:46:15.404514 4917 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:15 crc kubenswrapper[4917]: I0318 07:46:15.483685 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:15 crc kubenswrapper[4917]: I0318 07:46:15.820905 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:15 crc kubenswrapper[4917]: I0318 07:46:15.880855 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzh2n"] Mar 18 07:46:17 crc kubenswrapper[4917]: I0318 07:46:17.752731 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qzh2n" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerName="registry-server" containerID="cri-o://e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8" gracePeriod=2 Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.185574 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.276278 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-utilities\") pod \"34374b8b-2c3d-48e2-b795-65bc898e1963\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.276381 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-catalog-content\") pod \"34374b8b-2c3d-48e2-b795-65bc898e1963\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.276449 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7xjb\" (UniqueName: \"kubernetes.io/projected/34374b8b-2c3d-48e2-b795-65bc898e1963-kube-api-access-z7xjb\") pod \"34374b8b-2c3d-48e2-b795-65bc898e1963\" (UID: \"34374b8b-2c3d-48e2-b795-65bc898e1963\") " Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.277575 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-utilities" (OuterVolumeSpecName: "utilities") pod "34374b8b-2c3d-48e2-b795-65bc898e1963" (UID: "34374b8b-2c3d-48e2-b795-65bc898e1963"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.284359 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34374b8b-2c3d-48e2-b795-65bc898e1963-kube-api-access-z7xjb" (OuterVolumeSpecName: "kube-api-access-z7xjb") pod "34374b8b-2c3d-48e2-b795-65bc898e1963" (UID: "34374b8b-2c3d-48e2-b795-65bc898e1963"). InnerVolumeSpecName "kube-api-access-z7xjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.340848 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34374b8b-2c3d-48e2-b795-65bc898e1963" (UID: "34374b8b-2c3d-48e2-b795-65bc898e1963"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.379080 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.379227 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34374b8b-2c3d-48e2-b795-65bc898e1963-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.379250 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7xjb\" (UniqueName: \"kubernetes.io/projected/34374b8b-2c3d-48e2-b795-65bc898e1963-kube-api-access-z7xjb\") on node \"crc\" DevicePath \"\"" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.763726 4917 generic.go:334] "Generic (PLEG): container finished" podID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerID="e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8" exitCode=0 Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.763821 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qzh2n" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.763882 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzh2n" event={"ID":"34374b8b-2c3d-48e2-b795-65bc898e1963","Type":"ContainerDied","Data":"e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8"} Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.764134 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qzh2n" event={"ID":"34374b8b-2c3d-48e2-b795-65bc898e1963","Type":"ContainerDied","Data":"d5664055c6f7607293af2cae1acd5b8fc48d2164683ad9da9490c451cb178642"} Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.764164 4917 scope.go:117] "RemoveContainer" containerID="e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.793220 4917 scope.go:117] "RemoveContainer" containerID="6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.798253 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qzh2n"] Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.809439 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qzh2n"] Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.812804 4917 scope.go:117] "RemoveContainer" containerID="6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.837412 4917 scope.go:117] "RemoveContainer" containerID="e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8" Mar 18 07:46:18 crc kubenswrapper[4917]: E0318 07:46:18.838016 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8\": container with ID starting with e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8 not found: ID does not exist" containerID="e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.838048 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8"} err="failed to get container status \"e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8\": rpc error: code = NotFound desc = could not find container \"e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8\": container with ID starting with e559fa815260d79df5794d06859fdbe2c0a9f1eaaf89d9ab7b850d0a99dbbcc8 not found: ID does not exist" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.838072 4917 scope.go:117] "RemoveContainer" containerID="6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b" Mar 18 07:46:18 crc kubenswrapper[4917]: E0318 07:46:18.838551 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b\": container with ID starting with 6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b not found: ID does not exist" containerID="6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.838607 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b"} err="failed to get container status \"6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b\": rpc error: code = NotFound desc = could not find container \"6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b\": container with ID 
starting with 6a0469c3d230b9e6cf2f9feab45516ac4b5979ff4c483ed820149f8cd16f893b not found: ID does not exist" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.838636 4917 scope.go:117] "RemoveContainer" containerID="6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf" Mar 18 07:46:18 crc kubenswrapper[4917]: E0318 07:46:18.838887 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf\": container with ID starting with 6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf not found: ID does not exist" containerID="6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf" Mar 18 07:46:18 crc kubenswrapper[4917]: I0318 07:46:18.838912 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf"} err="failed to get container status \"6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf\": rpc error: code = NotFound desc = could not find container \"6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf\": container with ID starting with 6a637dc0d58c4cf3cc7c3c4acfc32125a46bcd8e701dbd19d6d0fa823ecafbaf not found: ID does not exist" Mar 18 07:46:19 crc kubenswrapper[4917]: I0318 07:46:19.787923 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" path="/var/lib/kubelet/pods/34374b8b-2c3d-48e2-b795-65bc898e1963/volumes" Mar 18 07:47:05 crc kubenswrapper[4917]: I0318 07:47:05.132756 4917 scope.go:117] "RemoveContainer" containerID="49bc18abfb613b126958f81f98bc7c1d60190aa28a6aa460189f31c80b087a86" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.160403 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563668-jprcn"] Mar 18 07:48:00 crc kubenswrapper[4917]: E0318 
07:48:00.161826 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerName="extract-content" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.161856 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerName="extract-content" Mar 18 07:48:00 crc kubenswrapper[4917]: E0318 07:48:00.161895 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerName="extract-utilities" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.161909 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerName="extract-utilities" Mar 18 07:48:00 crc kubenswrapper[4917]: E0318 07:48:00.161932 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerName="registry-server" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.161947 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerName="registry-server" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.162181 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="34374b8b-2c3d-48e2-b795-65bc898e1963" containerName="registry-server" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.163250 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563668-jprcn" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.167464 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.167739 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.168569 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.179162 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563668-jprcn"] Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.228198 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drgfk\" (UniqueName: \"kubernetes.io/projected/8f5d381c-41ed-4e23-b8c8-aad758b54303-kube-api-access-drgfk\") pod \"auto-csr-approver-29563668-jprcn\" (UID: \"8f5d381c-41ed-4e23-b8c8-aad758b54303\") " pod="openshift-infra/auto-csr-approver-29563668-jprcn" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.329952 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drgfk\" (UniqueName: \"kubernetes.io/projected/8f5d381c-41ed-4e23-b8c8-aad758b54303-kube-api-access-drgfk\") pod \"auto-csr-approver-29563668-jprcn\" (UID: \"8f5d381c-41ed-4e23-b8c8-aad758b54303\") " pod="openshift-infra/auto-csr-approver-29563668-jprcn" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.360244 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drgfk\" (UniqueName: \"kubernetes.io/projected/8f5d381c-41ed-4e23-b8c8-aad758b54303-kube-api-access-drgfk\") pod \"auto-csr-approver-29563668-jprcn\" (UID: \"8f5d381c-41ed-4e23-b8c8-aad758b54303\") " 
pod="openshift-infra/auto-csr-approver-29563668-jprcn" Mar 18 07:48:00 crc kubenswrapper[4917]: I0318 07:48:00.493237 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563668-jprcn" Mar 18 07:48:01 crc kubenswrapper[4917]: I0318 07:48:01.038529 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563668-jprcn"] Mar 18 07:48:01 crc kubenswrapper[4917]: I0318 07:48:01.723085 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563668-jprcn" event={"ID":"8f5d381c-41ed-4e23-b8c8-aad758b54303","Type":"ContainerStarted","Data":"e63e863693436a4f5115247ee3a4c3ec6eabffbd9aea3a018018fef1f29299d4"} Mar 18 07:48:02 crc kubenswrapper[4917]: I0318 07:48:02.735911 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563668-jprcn" event={"ID":"8f5d381c-41ed-4e23-b8c8-aad758b54303","Type":"ContainerStarted","Data":"164543bda9e93ce0c521e028930ad0db141e4b0131f157fe9fdc4ab91ff716a1"} Mar 18 07:48:02 crc kubenswrapper[4917]: I0318 07:48:02.760871 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563668-jprcn" podStartSLOduration=1.486827643 podStartE2EDuration="2.760844911s" podCreationTimestamp="2026-03-18 07:48:00 +0000 UTC" firstStartedPulling="2026-03-18 07:48:01.051338914 +0000 UTC m=+3665.992493638" lastFinishedPulling="2026-03-18 07:48:02.325356152 +0000 UTC m=+3667.266510906" observedRunningTime="2026-03-18 07:48:02.751667567 +0000 UTC m=+3667.692822321" watchObservedRunningTime="2026-03-18 07:48:02.760844911 +0000 UTC m=+3667.701999655" Mar 18 07:48:02 crc kubenswrapper[4917]: I0318 07:48:02.929037 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:48:02 crc kubenswrapper[4917]: I0318 07:48:02.929159 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:48:03 crc kubenswrapper[4917]: I0318 07:48:03.748522 4917 generic.go:334] "Generic (PLEG): container finished" podID="8f5d381c-41ed-4e23-b8c8-aad758b54303" containerID="164543bda9e93ce0c521e028930ad0db141e4b0131f157fe9fdc4ab91ff716a1" exitCode=0 Mar 18 07:48:03 crc kubenswrapper[4917]: I0318 07:48:03.748573 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563668-jprcn" event={"ID":"8f5d381c-41ed-4e23-b8c8-aad758b54303","Type":"ContainerDied","Data":"164543bda9e93ce0c521e028930ad0db141e4b0131f157fe9fdc4ab91ff716a1"} Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.093107 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563668-jprcn" Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.227569 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drgfk\" (UniqueName: \"kubernetes.io/projected/8f5d381c-41ed-4e23-b8c8-aad758b54303-kube-api-access-drgfk\") pod \"8f5d381c-41ed-4e23-b8c8-aad758b54303\" (UID: \"8f5d381c-41ed-4e23-b8c8-aad758b54303\") " Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.236355 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5d381c-41ed-4e23-b8c8-aad758b54303-kube-api-access-drgfk" (OuterVolumeSpecName: "kube-api-access-drgfk") pod "8f5d381c-41ed-4e23-b8c8-aad758b54303" (UID: "8f5d381c-41ed-4e23-b8c8-aad758b54303"). 
InnerVolumeSpecName "kube-api-access-drgfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.329504 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drgfk\" (UniqueName: \"kubernetes.io/projected/8f5d381c-41ed-4e23-b8c8-aad758b54303-kube-api-access-drgfk\") on node \"crc\" DevicePath \"\"" Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.768041 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563668-jprcn" event={"ID":"8f5d381c-41ed-4e23-b8c8-aad758b54303","Type":"ContainerDied","Data":"e63e863693436a4f5115247ee3a4c3ec6eabffbd9aea3a018018fef1f29299d4"} Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.768099 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e63e863693436a4f5115247ee3a4c3ec6eabffbd9aea3a018018fef1f29299d4" Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.768134 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563668-jprcn" Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.845275 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563662-4vslz"] Mar 18 07:48:05 crc kubenswrapper[4917]: I0318 07:48:05.855123 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563662-4vslz"] Mar 18 07:48:07 crc kubenswrapper[4917]: I0318 07:48:07.802899 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a2702f-b95f-4786-8940-8445b400b635" path="/var/lib/kubelet/pods/22a2702f-b95f-4786-8940-8445b400b635/volumes" Mar 18 07:48:32 crc kubenswrapper[4917]: I0318 07:48:32.929559 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:48:32 crc kubenswrapper[4917]: I0318 07:48:32.931024 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:49:02 crc kubenswrapper[4917]: I0318 07:49:02.929647 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:49:02 crc kubenswrapper[4917]: I0318 07:49:02.930287 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:49:02 crc kubenswrapper[4917]: I0318 07:49:02.930349 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:49:02 crc kubenswrapper[4917]: I0318 07:49:02.931167 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8faa40aa240f88d8f2e1df0c17f7ed1f5c6cafdc5093f3fd5b8b41e2a9662cc6"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:49:02 crc kubenswrapper[4917]: I0318 07:49:02.931264 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://8faa40aa240f88d8f2e1df0c17f7ed1f5c6cafdc5093f3fd5b8b41e2a9662cc6" gracePeriod=600 Mar 18 07:49:03 crc kubenswrapper[4917]: I0318 07:49:03.305480 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="8faa40aa240f88d8f2e1df0c17f7ed1f5c6cafdc5093f3fd5b8b41e2a9662cc6" exitCode=0 Mar 18 07:49:03 crc kubenswrapper[4917]: I0318 07:49:03.305544 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"8faa40aa240f88d8f2e1df0c17f7ed1f5c6cafdc5093f3fd5b8b41e2a9662cc6"} Mar 18 07:49:03 crc kubenswrapper[4917]: I0318 07:49:03.306112 4917 scope.go:117] "RemoveContainer" 
containerID="6d1beaee9ec1485fb804eebd911eacd8261434a308c9f6e2c70bcbe82641cced" Mar 18 07:49:04 crc kubenswrapper[4917]: I0318 07:49:04.323874 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21"} Mar 18 07:49:05 crc kubenswrapper[4917]: I0318 07:49:05.271731 4917 scope.go:117] "RemoveContainer" containerID="6a50868c6ae08ab0f7456088280a979e2f01932146f9ff61c5bd5d77b988a748" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.169970 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563670-hjzkx"] Mar 18 07:50:00 crc kubenswrapper[4917]: E0318 07:50:00.171001 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5d381c-41ed-4e23-b8c8-aad758b54303" containerName="oc" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.171023 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5d381c-41ed-4e23-b8c8-aad758b54303" containerName="oc" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.171271 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5d381c-41ed-4e23-b8c8-aad758b54303" containerName="oc" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.172101 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563670-hjzkx" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.181086 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.181281 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.183394 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.190851 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563670-hjzkx"] Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.331496 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wvg\" (UniqueName: \"kubernetes.io/projected/bc85683a-74ce-4fed-a4c6-ce995a4d73d2-kube-api-access-26wvg\") pod \"auto-csr-approver-29563670-hjzkx\" (UID: \"bc85683a-74ce-4fed-a4c6-ce995a4d73d2\") " pod="openshift-infra/auto-csr-approver-29563670-hjzkx" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.433497 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26wvg\" (UniqueName: \"kubernetes.io/projected/bc85683a-74ce-4fed-a4c6-ce995a4d73d2-kube-api-access-26wvg\") pod \"auto-csr-approver-29563670-hjzkx\" (UID: \"bc85683a-74ce-4fed-a4c6-ce995a4d73d2\") " pod="openshift-infra/auto-csr-approver-29563670-hjzkx" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.475917 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26wvg\" (UniqueName: \"kubernetes.io/projected/bc85683a-74ce-4fed-a4c6-ce995a4d73d2-kube-api-access-26wvg\") pod \"auto-csr-approver-29563670-hjzkx\" (UID: \"bc85683a-74ce-4fed-a4c6-ce995a4d73d2\") " 
pod="openshift-infra/auto-csr-approver-29563670-hjzkx" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.505163 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563670-hjzkx" Mar 18 07:50:00 crc kubenswrapper[4917]: I0318 07:50:00.994027 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563670-hjzkx"] Mar 18 07:50:01 crc kubenswrapper[4917]: I0318 07:50:01.870864 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563670-hjzkx" event={"ID":"bc85683a-74ce-4fed-a4c6-ce995a4d73d2","Type":"ContainerStarted","Data":"8f540704abe63e540da938f799ce4c0fe9f71becf196ecafbede913b061c40c5"} Mar 18 07:50:02 crc kubenswrapper[4917]: I0318 07:50:02.882768 4917 generic.go:334] "Generic (PLEG): container finished" podID="bc85683a-74ce-4fed-a4c6-ce995a4d73d2" containerID="61a64523c186094a4506210090aac2691b4a22dba4eb6c1044c243be3a5b6a39" exitCode=0 Mar 18 07:50:02 crc kubenswrapper[4917]: I0318 07:50:02.882846 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563670-hjzkx" event={"ID":"bc85683a-74ce-4fed-a4c6-ce995a4d73d2","Type":"ContainerDied","Data":"61a64523c186094a4506210090aac2691b4a22dba4eb6c1044c243be3a5b6a39"} Mar 18 07:50:04 crc kubenswrapper[4917]: I0318 07:50:04.314972 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563670-hjzkx" Mar 18 07:50:04 crc kubenswrapper[4917]: I0318 07:50:04.391294 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26wvg\" (UniqueName: \"kubernetes.io/projected/bc85683a-74ce-4fed-a4c6-ce995a4d73d2-kube-api-access-26wvg\") pod \"bc85683a-74ce-4fed-a4c6-ce995a4d73d2\" (UID: \"bc85683a-74ce-4fed-a4c6-ce995a4d73d2\") " Mar 18 07:50:04 crc kubenswrapper[4917]: I0318 07:50:04.398703 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc85683a-74ce-4fed-a4c6-ce995a4d73d2-kube-api-access-26wvg" (OuterVolumeSpecName: "kube-api-access-26wvg") pod "bc85683a-74ce-4fed-a4c6-ce995a4d73d2" (UID: "bc85683a-74ce-4fed-a4c6-ce995a4d73d2"). InnerVolumeSpecName "kube-api-access-26wvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:50:04 crc kubenswrapper[4917]: I0318 07:50:04.493377 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26wvg\" (UniqueName: \"kubernetes.io/projected/bc85683a-74ce-4fed-a4c6-ce995a4d73d2-kube-api-access-26wvg\") on node \"crc\" DevicePath \"\"" Mar 18 07:50:04 crc kubenswrapper[4917]: I0318 07:50:04.901660 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563670-hjzkx" event={"ID":"bc85683a-74ce-4fed-a4c6-ce995a4d73d2","Type":"ContainerDied","Data":"8f540704abe63e540da938f799ce4c0fe9f71becf196ecafbede913b061c40c5"} Mar 18 07:50:04 crc kubenswrapper[4917]: I0318 07:50:04.901706 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f540704abe63e540da938f799ce4c0fe9f71becf196ecafbede913b061c40c5" Mar 18 07:50:04 crc kubenswrapper[4917]: I0318 07:50:04.901769 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563670-hjzkx" Mar 18 07:50:05 crc kubenswrapper[4917]: I0318 07:50:05.394313 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563664-98fdw"] Mar 18 07:50:05 crc kubenswrapper[4917]: I0318 07:50:05.402525 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563664-98fdw"] Mar 18 07:50:05 crc kubenswrapper[4917]: I0318 07:50:05.783946 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957d8712-e191-467e-8922-361ce4bc8c5a" path="/var/lib/kubelet/pods/957d8712-e191-467e-8922-361ce4bc8c5a/volumes" Mar 18 07:51:05 crc kubenswrapper[4917]: I0318 07:51:05.368944 4917 scope.go:117] "RemoveContainer" containerID="2ac531baf53f527f8f2e700ee12d7dd3b068fd9fcdc9f5bb21c702ce853aedc0" Mar 18 07:51:32 crc kubenswrapper[4917]: I0318 07:51:32.928636 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:51:32 crc kubenswrapper[4917]: I0318 07:51:32.929289 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.159059 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563672-w7vx8"] Mar 18 07:52:00 crc kubenswrapper[4917]: E0318 07:52:00.160572 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc85683a-74ce-4fed-a4c6-ce995a4d73d2" containerName="oc" Mar 18 07:52:00 crc 
kubenswrapper[4917]: I0318 07:52:00.160660 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc85683a-74ce-4fed-a4c6-ce995a4d73d2" containerName="oc" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.161122 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc85683a-74ce-4fed-a4c6-ce995a4d73d2" containerName="oc" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.162199 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563672-w7vx8" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.164925 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.167670 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563672-w7vx8"] Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.167793 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.169690 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.344085 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxgx\" (UniqueName: \"kubernetes.io/projected/1e67c910-f202-4145-87fd-8c1666597a0e-kube-api-access-wrxgx\") pod \"auto-csr-approver-29563672-w7vx8\" (UID: \"1e67c910-f202-4145-87fd-8c1666597a0e\") " pod="openshift-infra/auto-csr-approver-29563672-w7vx8" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.446167 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxgx\" (UniqueName: \"kubernetes.io/projected/1e67c910-f202-4145-87fd-8c1666597a0e-kube-api-access-wrxgx\") pod \"auto-csr-approver-29563672-w7vx8\" 
(UID: \"1e67c910-f202-4145-87fd-8c1666597a0e\") " pod="openshift-infra/auto-csr-approver-29563672-w7vx8" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.484369 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxgx\" (UniqueName: \"kubernetes.io/projected/1e67c910-f202-4145-87fd-8c1666597a0e-kube-api-access-wrxgx\") pod \"auto-csr-approver-29563672-w7vx8\" (UID: \"1e67c910-f202-4145-87fd-8c1666597a0e\") " pod="openshift-infra/auto-csr-approver-29563672-w7vx8" Mar 18 07:52:00 crc kubenswrapper[4917]: I0318 07:52:00.492035 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563672-w7vx8" Mar 18 07:52:01 crc kubenswrapper[4917]: I0318 07:52:01.013345 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563672-w7vx8"] Mar 18 07:52:01 crc kubenswrapper[4917]: I0318 07:52:01.025404 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:52:01 crc kubenswrapper[4917]: I0318 07:52:01.991977 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563672-w7vx8" event={"ID":"1e67c910-f202-4145-87fd-8c1666597a0e","Type":"ContainerStarted","Data":"28e0fc322fe7bbc15e43b885922cfd2e971e3e0af8771ca667dd521ce4a76fd6"} Mar 18 07:52:02 crc kubenswrapper[4917]: I0318 07:52:02.929082 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 07:52:02 crc kubenswrapper[4917]: I0318 07:52:02.929454 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:52:03 crc kubenswrapper[4917]: I0318 07:52:03.013462 4917 generic.go:334] "Generic (PLEG): container finished" podID="1e67c910-f202-4145-87fd-8c1666597a0e" containerID="b1898428e330b84109d5f38a9452448690cd799abd58fcd3e2924cdbeb5ab363" exitCode=0 Mar 18 07:52:03 crc kubenswrapper[4917]: I0318 07:52:03.013528 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563672-w7vx8" event={"ID":"1e67c910-f202-4145-87fd-8c1666597a0e","Type":"ContainerDied","Data":"b1898428e330b84109d5f38a9452448690cd799abd58fcd3e2924cdbeb5ab363"} Mar 18 07:52:04 crc kubenswrapper[4917]: I0318 07:52:04.426761 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563672-w7vx8" Mar 18 07:52:04 crc kubenswrapper[4917]: I0318 07:52:04.624025 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrxgx\" (UniqueName: \"kubernetes.io/projected/1e67c910-f202-4145-87fd-8c1666597a0e-kube-api-access-wrxgx\") pod \"1e67c910-f202-4145-87fd-8c1666597a0e\" (UID: \"1e67c910-f202-4145-87fd-8c1666597a0e\") " Mar 18 07:52:04 crc kubenswrapper[4917]: I0318 07:52:04.630269 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e67c910-f202-4145-87fd-8c1666597a0e-kube-api-access-wrxgx" (OuterVolumeSpecName: "kube-api-access-wrxgx") pod "1e67c910-f202-4145-87fd-8c1666597a0e" (UID: "1e67c910-f202-4145-87fd-8c1666597a0e"). InnerVolumeSpecName "kube-api-access-wrxgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:52:04 crc kubenswrapper[4917]: I0318 07:52:04.725824 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrxgx\" (UniqueName: \"kubernetes.io/projected/1e67c910-f202-4145-87fd-8c1666597a0e-kube-api-access-wrxgx\") on node \"crc\" DevicePath \"\"" Mar 18 07:52:05 crc kubenswrapper[4917]: I0318 07:52:05.042462 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563672-w7vx8" event={"ID":"1e67c910-f202-4145-87fd-8c1666597a0e","Type":"ContainerDied","Data":"28e0fc322fe7bbc15e43b885922cfd2e971e3e0af8771ca667dd521ce4a76fd6"} Mar 18 07:52:05 crc kubenswrapper[4917]: I0318 07:52:05.042846 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28e0fc322fe7bbc15e43b885922cfd2e971e3e0af8771ca667dd521ce4a76fd6" Mar 18 07:52:05 crc kubenswrapper[4917]: I0318 07:52:05.042551 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563672-w7vx8" Mar 18 07:52:05 crc kubenswrapper[4917]: I0318 07:52:05.527965 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563666-tcbjv"] Mar 18 07:52:05 crc kubenswrapper[4917]: I0318 07:52:05.539055 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563666-tcbjv"] Mar 18 07:52:05 crc kubenswrapper[4917]: I0318 07:52:05.782451 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0016906f-6e64-4356-99cb-a86118113691" path="/var/lib/kubelet/pods/0016906f-6e64-4356-99cb-a86118113691/volumes" Mar 18 07:52:32 crc kubenswrapper[4917]: I0318 07:52:32.928721 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 07:52:32 crc kubenswrapper[4917]: I0318 07:52:32.929658 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 07:52:32 crc kubenswrapper[4917]: I0318 07:52:32.929744 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 07:52:32 crc kubenswrapper[4917]: I0318 07:52:32.930698 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 07:52:32 crc kubenswrapper[4917]: I0318 07:52:32.930810 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" gracePeriod=600 Mar 18 07:52:33 crc kubenswrapper[4917]: E0318 07:52:33.050275 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:52:33 crc kubenswrapper[4917]: 
I0318 07:52:33.283501 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" exitCode=0 Mar 18 07:52:33 crc kubenswrapper[4917]: I0318 07:52:33.283568 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21"} Mar 18 07:52:33 crc kubenswrapper[4917]: I0318 07:52:33.283687 4917 scope.go:117] "RemoveContainer" containerID="8faa40aa240f88d8f2e1df0c17f7ed1f5c6cafdc5093f3fd5b8b41e2a9662cc6" Mar 18 07:52:33 crc kubenswrapper[4917]: I0318 07:52:33.284499 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:52:33 crc kubenswrapper[4917]: E0318 07:52:33.284924 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:52:45 crc kubenswrapper[4917]: I0318 07:52:45.780755 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:52:45 crc kubenswrapper[4917]: E0318 07:52:45.781685 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:52:59 crc kubenswrapper[4917]: I0318 07:52:59.772393 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:52:59 crc kubenswrapper[4917]: E0318 07:52:59.773734 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:53:05 crc kubenswrapper[4917]: I0318 07:53:05.496232 4917 scope.go:117] "RemoveContainer" containerID="f4536d57aa3db97910e14acf15dc543807957d6e5abf0cce9483bce33aebadde" Mar 18 07:53:12 crc kubenswrapper[4917]: I0318 07:53:12.773194 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:53:12 crc kubenswrapper[4917]: E0318 07:53:12.774544 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:53:26 crc kubenswrapper[4917]: I0318 07:53:26.773123 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:53:26 crc kubenswrapper[4917]: E0318 07:53:26.774517 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:53:37 crc kubenswrapper[4917]: I0318 07:53:37.773652 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:53:37 crc kubenswrapper[4917]: E0318 07:53:37.774959 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:53:49 crc kubenswrapper[4917]: I0318 07:53:49.772665 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:53:49 crc kubenswrapper[4917]: E0318 07:53:49.773745 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.169722 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563674-zgzjb"] Mar 18 07:54:00 crc kubenswrapper[4917]: E0318 07:54:00.171120 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e67c910-f202-4145-87fd-8c1666597a0e" 
containerName="oc" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.171145 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e67c910-f202-4145-87fd-8c1666597a0e" containerName="oc" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.171437 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e67c910-f202-4145-87fd-8c1666597a0e" containerName="oc" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.172309 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563674-zgzjb" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.176217 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.176991 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.177390 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.178524 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563674-zgzjb"] Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.300815 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwssm\" (UniqueName: \"kubernetes.io/projected/a64d73ee-a6dd-42af-b1f5-ccbf24ee8292-kube-api-access-gwssm\") pod \"auto-csr-approver-29563674-zgzjb\" (UID: \"a64d73ee-a6dd-42af-b1f5-ccbf24ee8292\") " pod="openshift-infra/auto-csr-approver-29563674-zgzjb" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.402482 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwssm\" (UniqueName: \"kubernetes.io/projected/a64d73ee-a6dd-42af-b1f5-ccbf24ee8292-kube-api-access-gwssm\") pod 
\"auto-csr-approver-29563674-zgzjb\" (UID: \"a64d73ee-a6dd-42af-b1f5-ccbf24ee8292\") " pod="openshift-infra/auto-csr-approver-29563674-zgzjb" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.438957 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwssm\" (UniqueName: \"kubernetes.io/projected/a64d73ee-a6dd-42af-b1f5-ccbf24ee8292-kube-api-access-gwssm\") pod \"auto-csr-approver-29563674-zgzjb\" (UID: \"a64d73ee-a6dd-42af-b1f5-ccbf24ee8292\") " pod="openshift-infra/auto-csr-approver-29563674-zgzjb" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.506745 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563674-zgzjb" Mar 18 07:54:00 crc kubenswrapper[4917]: I0318 07:54:00.781044 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563674-zgzjb"] Mar 18 07:54:01 crc kubenswrapper[4917]: I0318 07:54:01.073092 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563674-zgzjb" event={"ID":"a64d73ee-a6dd-42af-b1f5-ccbf24ee8292","Type":"ContainerStarted","Data":"4cb9ce4cadd53c1d2c3bc168e6630cbc50b9a74ba8337b5d5aa8a1cbde540a4b"} Mar 18 07:54:02 crc kubenswrapper[4917]: I0318 07:54:02.772991 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:54:02 crc kubenswrapper[4917]: E0318 07:54:02.773990 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:54:03 crc kubenswrapper[4917]: I0318 07:54:03.096171 4917 generic.go:334] 
"Generic (PLEG): container finished" podID="a64d73ee-a6dd-42af-b1f5-ccbf24ee8292" containerID="0b394a0b47f6272235de311e9fc7b141d01bba0d58f651618789a9944e7590e5" exitCode=0 Mar 18 07:54:03 crc kubenswrapper[4917]: I0318 07:54:03.096235 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563674-zgzjb" event={"ID":"a64d73ee-a6dd-42af-b1f5-ccbf24ee8292","Type":"ContainerDied","Data":"0b394a0b47f6272235de311e9fc7b141d01bba0d58f651618789a9944e7590e5"} Mar 18 07:54:04 crc kubenswrapper[4917]: I0318 07:54:04.426988 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563674-zgzjb" Mar 18 07:54:04 crc kubenswrapper[4917]: I0318 07:54:04.567273 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwssm\" (UniqueName: \"kubernetes.io/projected/a64d73ee-a6dd-42af-b1f5-ccbf24ee8292-kube-api-access-gwssm\") pod \"a64d73ee-a6dd-42af-b1f5-ccbf24ee8292\" (UID: \"a64d73ee-a6dd-42af-b1f5-ccbf24ee8292\") " Mar 18 07:54:04 crc kubenswrapper[4917]: I0318 07:54:04.578494 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64d73ee-a6dd-42af-b1f5-ccbf24ee8292-kube-api-access-gwssm" (OuterVolumeSpecName: "kube-api-access-gwssm") pod "a64d73ee-a6dd-42af-b1f5-ccbf24ee8292" (UID: "a64d73ee-a6dd-42af-b1f5-ccbf24ee8292"). InnerVolumeSpecName "kube-api-access-gwssm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:54:04 crc kubenswrapper[4917]: I0318 07:54:04.670298 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwssm\" (UniqueName: \"kubernetes.io/projected/a64d73ee-a6dd-42af-b1f5-ccbf24ee8292-kube-api-access-gwssm\") on node \"crc\" DevicePath \"\"" Mar 18 07:54:05 crc kubenswrapper[4917]: I0318 07:54:05.115416 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563674-zgzjb" event={"ID":"a64d73ee-a6dd-42af-b1f5-ccbf24ee8292","Type":"ContainerDied","Data":"4cb9ce4cadd53c1d2c3bc168e6630cbc50b9a74ba8337b5d5aa8a1cbde540a4b"} Mar 18 07:54:05 crc kubenswrapper[4917]: I0318 07:54:05.115472 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb9ce4cadd53c1d2c3bc168e6630cbc50b9a74ba8337b5d5aa8a1cbde540a4b" Mar 18 07:54:05 crc kubenswrapper[4917]: I0318 07:54:05.115548 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563674-zgzjb" Mar 18 07:54:05 crc kubenswrapper[4917]: I0318 07:54:05.517214 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563668-jprcn"] Mar 18 07:54:05 crc kubenswrapper[4917]: I0318 07:54:05.527246 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563668-jprcn"] Mar 18 07:54:05 crc kubenswrapper[4917]: I0318 07:54:05.787682 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5d381c-41ed-4e23-b8c8-aad758b54303" path="/var/lib/kubelet/pods/8f5d381c-41ed-4e23-b8c8-aad758b54303/volumes" Mar 18 07:54:13 crc kubenswrapper[4917]: I0318 07:54:13.773565 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:54:13 crc kubenswrapper[4917]: E0318 07:54:13.774867 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:54:25 crc kubenswrapper[4917]: I0318 07:54:25.780347 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:54:25 crc kubenswrapper[4917]: E0318 07:54:25.781385 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:54:37 crc kubenswrapper[4917]: I0318 07:54:37.835393 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rmxd2"] Mar 18 07:54:37 crc kubenswrapper[4917]: E0318 07:54:37.837878 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64d73ee-a6dd-42af-b1f5-ccbf24ee8292" containerName="oc" Mar 18 07:54:37 crc kubenswrapper[4917]: I0318 07:54:37.838031 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64d73ee-a6dd-42af-b1f5-ccbf24ee8292" containerName="oc" Mar 18 07:54:37 crc kubenswrapper[4917]: I0318 07:54:37.838364 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64d73ee-a6dd-42af-b1f5-ccbf24ee8292" containerName="oc" Mar 18 07:54:37 crc kubenswrapper[4917]: I0318 07:54:37.840213 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:37 crc kubenswrapper[4917]: I0318 07:54:37.864983 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmxd2"] Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.013384 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-utilities\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.013462 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-catalog-content\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.013511 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqpbq\" (UniqueName: \"kubernetes.io/projected/5d64ada0-babb-4480-8dfb-b764e933c592-kube-api-access-nqpbq\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.115022 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-utilities\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.115077 4917 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-catalog-content\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.115111 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqpbq\" (UniqueName: \"kubernetes.io/projected/5d64ada0-babb-4480-8dfb-b764e933c592-kube-api-access-nqpbq\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.115647 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-utilities\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.115670 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-catalog-content\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.137523 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqpbq\" (UniqueName: \"kubernetes.io/projected/5d64ada0-babb-4480-8dfb-b764e933c592-kube-api-access-nqpbq\") pod \"redhat-marketplace-rmxd2\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.187910 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:38 crc kubenswrapper[4917]: I0318 07:54:38.631652 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmxd2"] Mar 18 07:54:39 crc kubenswrapper[4917]: I0318 07:54:39.435117 4917 generic.go:334] "Generic (PLEG): container finished" podID="5d64ada0-babb-4480-8dfb-b764e933c592" containerID="2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32" exitCode=0 Mar 18 07:54:39 crc kubenswrapper[4917]: I0318 07:54:39.435193 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmxd2" event={"ID":"5d64ada0-babb-4480-8dfb-b764e933c592","Type":"ContainerDied","Data":"2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32"} Mar 18 07:54:39 crc kubenswrapper[4917]: I0318 07:54:39.435558 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmxd2" event={"ID":"5d64ada0-babb-4480-8dfb-b764e933c592","Type":"ContainerStarted","Data":"726063d2c8a755b1480a035bdec306f20317ab4ba200dd3e8f7bf30cdee97670"} Mar 18 07:54:40 crc kubenswrapper[4917]: I0318 07:54:40.449263 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmxd2" event={"ID":"5d64ada0-babb-4480-8dfb-b764e933c592","Type":"ContainerStarted","Data":"568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3"} Mar 18 07:54:40 crc kubenswrapper[4917]: I0318 07:54:40.772737 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:54:40 crc kubenswrapper[4917]: E0318 07:54:40.773176 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:54:41 crc kubenswrapper[4917]: I0318 07:54:41.464705 4917 generic.go:334] "Generic (PLEG): container finished" podID="5d64ada0-babb-4480-8dfb-b764e933c592" containerID="568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3" exitCode=0 Mar 18 07:54:41 crc kubenswrapper[4917]: I0318 07:54:41.464811 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmxd2" event={"ID":"5d64ada0-babb-4480-8dfb-b764e933c592","Type":"ContainerDied","Data":"568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3"} Mar 18 07:54:42 crc kubenswrapper[4917]: I0318 07:54:42.479119 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmxd2" event={"ID":"5d64ada0-babb-4480-8dfb-b764e933c592","Type":"ContainerStarted","Data":"82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d"} Mar 18 07:54:48 crc kubenswrapper[4917]: I0318 07:54:48.189170 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:48 crc kubenswrapper[4917]: I0318 07:54:48.189820 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:48 crc kubenswrapper[4917]: I0318 07:54:48.269300 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:48 crc kubenswrapper[4917]: I0318 07:54:48.310033 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rmxd2" podStartSLOduration=8.828604321 podStartE2EDuration="11.310004832s" podCreationTimestamp="2026-03-18 07:54:37 +0000 UTC" 
firstStartedPulling="2026-03-18 07:54:39.437876835 +0000 UTC m=+4064.379031589" lastFinishedPulling="2026-03-18 07:54:41.919277386 +0000 UTC m=+4066.860432100" observedRunningTime="2026-03-18 07:54:42.501348755 +0000 UTC m=+4067.442503509" watchObservedRunningTime="2026-03-18 07:54:48.310004832 +0000 UTC m=+4073.251159586" Mar 18 07:54:48 crc kubenswrapper[4917]: I0318 07:54:48.647842 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:48 crc kubenswrapper[4917]: I0318 07:54:48.729809 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmxd2"] Mar 18 07:54:50 crc kubenswrapper[4917]: I0318 07:54:50.591768 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rmxd2" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" containerName="registry-server" containerID="cri-o://82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d" gracePeriod=2 Mar 18 07:54:50 crc kubenswrapper[4917]: I0318 07:54:50.990026 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.134105 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqpbq\" (UniqueName: \"kubernetes.io/projected/5d64ada0-babb-4480-8dfb-b764e933c592-kube-api-access-nqpbq\") pod \"5d64ada0-babb-4480-8dfb-b764e933c592\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.134491 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-utilities\") pod \"5d64ada0-babb-4480-8dfb-b764e933c592\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.134596 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-catalog-content\") pod \"5d64ada0-babb-4480-8dfb-b764e933c592\" (UID: \"5d64ada0-babb-4480-8dfb-b764e933c592\") " Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.135537 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-utilities" (OuterVolumeSpecName: "utilities") pod "5d64ada0-babb-4480-8dfb-b764e933c592" (UID: "5d64ada0-babb-4480-8dfb-b764e933c592"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.142061 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d64ada0-babb-4480-8dfb-b764e933c592-kube-api-access-nqpbq" (OuterVolumeSpecName: "kube-api-access-nqpbq") pod "5d64ada0-babb-4480-8dfb-b764e933c592" (UID: "5d64ada0-babb-4480-8dfb-b764e933c592"). InnerVolumeSpecName "kube-api-access-nqpbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.159872 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5d64ada0-babb-4480-8dfb-b764e933c592" (UID: "5d64ada0-babb-4480-8dfb-b764e933c592"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.236939 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.236987 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d64ada0-babb-4480-8dfb-b764e933c592-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.237010 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqpbq\" (UniqueName: \"kubernetes.io/projected/5d64ada0-babb-4480-8dfb-b764e933c592-kube-api-access-nqpbq\") on node \"crc\" DevicePath \"\"" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.616966 4917 generic.go:334] "Generic (PLEG): container finished" podID="5d64ada0-babb-4480-8dfb-b764e933c592" containerID="82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d" exitCode=0 Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.617064 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rmxd2" event={"ID":"5d64ada0-babb-4480-8dfb-b764e933c592","Type":"ContainerDied","Data":"82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d"} Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.617143 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-rmxd2" event={"ID":"5d64ada0-babb-4480-8dfb-b764e933c592","Type":"ContainerDied","Data":"726063d2c8a755b1480a035bdec306f20317ab4ba200dd3e8f7bf30cdee97670"} Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.617188 4917 scope.go:117] "RemoveContainer" containerID="82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.617753 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rmxd2" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.651080 4917 scope.go:117] "RemoveContainer" containerID="568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.676350 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmxd2"] Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.685306 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rmxd2"] Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.706830 4917 scope.go:117] "RemoveContainer" containerID="2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.728676 4917 scope.go:117] "RemoveContainer" containerID="82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d" Mar 18 07:54:51 crc kubenswrapper[4917]: E0318 07:54:51.729529 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d\": container with ID starting with 82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d not found: ID does not exist" containerID="82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.729631 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d"} err="failed to get container status \"82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d\": rpc error: code = NotFound desc = could not find container \"82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d\": container with ID starting with 82b96b0573dac2b77a4eebe7fa871e31220f35a59aaf8195591bef3311cdfa6d not found: ID does not exist" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.729757 4917 scope.go:117] "RemoveContainer" containerID="568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3" Mar 18 07:54:51 crc kubenswrapper[4917]: E0318 07:54:51.730232 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3\": container with ID starting with 568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3 not found: ID does not exist" containerID="568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.730288 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3"} err="failed to get container status \"568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3\": rpc error: code = NotFound desc = could not find container \"568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3\": container with ID starting with 568e5ff2de8728a1d96295e90d5891923d851c64ef55aee7e01bdcefe81687c3 not found: ID does not exist" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.730329 4917 scope.go:117] "RemoveContainer" containerID="2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32" Mar 18 07:54:51 crc kubenswrapper[4917]: E0318 
07:54:51.730728 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32\": container with ID starting with 2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32 not found: ID does not exist" containerID="2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.730771 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32"} err="failed to get container status \"2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32\": rpc error: code = NotFound desc = could not find container \"2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32\": container with ID starting with 2b4ca220eee02f65af62c33db0c66741b1dcc9147d070a059ad8d9c09921ad32 not found: ID does not exist" Mar 18 07:54:51 crc kubenswrapper[4917]: I0318 07:54:51.790813 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" path="/var/lib/kubelet/pods/5d64ada0-babb-4480-8dfb-b764e933c592/volumes" Mar 18 07:54:55 crc kubenswrapper[4917]: I0318 07:54:55.782690 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:54:55 crc kubenswrapper[4917]: E0318 07:54:55.784241 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:55:05 crc kubenswrapper[4917]: I0318 07:55:05.618671 
4917 scope.go:117] "RemoveContainer" containerID="164543bda9e93ce0c521e028930ad0db141e4b0131f157fe9fdc4ab91ff716a1" Mar 18 07:55:07 crc kubenswrapper[4917]: I0318 07:55:07.774210 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:55:07 crc kubenswrapper[4917]: E0318 07:55:07.775472 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:55:07 crc kubenswrapper[4917]: I0318 07:55:07.938487 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xdsdz"] Mar 18 07:55:07 crc kubenswrapper[4917]: E0318 07:55:07.939561 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" containerName="registry-server" Mar 18 07:55:07 crc kubenswrapper[4917]: I0318 07:55:07.939627 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" containerName="registry-server" Mar 18 07:55:07 crc kubenswrapper[4917]: E0318 07:55:07.939701 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" containerName="extract-content" Mar 18 07:55:07 crc kubenswrapper[4917]: I0318 07:55:07.939744 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" containerName="extract-content" Mar 18 07:55:07 crc kubenswrapper[4917]: E0318 07:55:07.939770 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" containerName="extract-utilities" Mar 18 07:55:07 crc kubenswrapper[4917]: 
I0318 07:55:07.939787 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" containerName="extract-utilities" Mar 18 07:55:07 crc kubenswrapper[4917]: I0318 07:55:07.940137 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d64ada0-babb-4480-8dfb-b764e933c592" containerName="registry-server" Mar 18 07:55:07 crc kubenswrapper[4917]: I0318 07:55:07.942557 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:07 crc kubenswrapper[4917]: I0318 07:55:07.950157 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdsdz"] Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.014503 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-utilities\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.014653 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-catalog-content\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.014692 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5zg8\" (UniqueName: \"kubernetes.io/projected/d8def970-f869-44c1-bb45-baeb1c921f09-kube-api-access-w5zg8\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 
07:55:08.116553 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-catalog-content\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.116674 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5zg8\" (UniqueName: \"kubernetes.io/projected/d8def970-f869-44c1-bb45-baeb1c921f09-kube-api-access-w5zg8\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.116758 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-utilities\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.117294 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-utilities\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.117576 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-catalog-content\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.152396 4917 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-w5zg8\" (UniqueName: \"kubernetes.io/projected/d8def970-f869-44c1-bb45-baeb1c921f09-kube-api-access-w5zg8\") pod \"redhat-operators-xdsdz\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.267782 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:08 crc kubenswrapper[4917]: I0318 07:55:08.763540 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdsdz"] Mar 18 07:55:09 crc kubenswrapper[4917]: I0318 07:55:09.774880 4917 generic.go:334] "Generic (PLEG): container finished" podID="d8def970-f869-44c1-bb45-baeb1c921f09" containerID="5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee" exitCode=0 Mar 18 07:55:09 crc kubenswrapper[4917]: I0318 07:55:09.792293 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdsdz" event={"ID":"d8def970-f869-44c1-bb45-baeb1c921f09","Type":"ContainerDied","Data":"5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee"} Mar 18 07:55:09 crc kubenswrapper[4917]: I0318 07:55:09.792393 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdsdz" event={"ID":"d8def970-f869-44c1-bb45-baeb1c921f09","Type":"ContainerStarted","Data":"a1664bded889fb8626b5dc010760fe960513e8b62747add053fec4804c0e2382"} Mar 18 07:55:11 crc kubenswrapper[4917]: I0318 07:55:11.796356 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdsdz" event={"ID":"d8def970-f869-44c1-bb45-baeb1c921f09","Type":"ContainerStarted","Data":"69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483"} Mar 18 07:55:12 crc kubenswrapper[4917]: I0318 07:55:12.815526 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="d8def970-f869-44c1-bb45-baeb1c921f09" containerID="69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483" exitCode=0 Mar 18 07:55:12 crc kubenswrapper[4917]: I0318 07:55:12.815647 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdsdz" event={"ID":"d8def970-f869-44c1-bb45-baeb1c921f09","Type":"ContainerDied","Data":"69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483"} Mar 18 07:55:14 crc kubenswrapper[4917]: I0318 07:55:14.844353 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdsdz" event={"ID":"d8def970-f869-44c1-bb45-baeb1c921f09","Type":"ContainerStarted","Data":"c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296"} Mar 18 07:55:14 crc kubenswrapper[4917]: I0318 07:55:14.882059 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xdsdz" podStartSLOduration=4.420789578 podStartE2EDuration="7.882024918s" podCreationTimestamp="2026-03-18 07:55:07 +0000 UTC" firstStartedPulling="2026-03-18 07:55:09.77779868 +0000 UTC m=+4094.718953424" lastFinishedPulling="2026-03-18 07:55:13.23903402 +0000 UTC m=+4098.180188764" observedRunningTime="2026-03-18 07:55:14.870440965 +0000 UTC m=+4099.811595709" watchObservedRunningTime="2026-03-18 07:55:14.882024918 +0000 UTC m=+4099.823179682" Mar 18 07:55:18 crc kubenswrapper[4917]: I0318 07:55:18.268030 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:18 crc kubenswrapper[4917]: I0318 07:55:18.268644 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:18 crc kubenswrapper[4917]: I0318 07:55:18.773385 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:55:18 crc kubenswrapper[4917]: 
E0318 07:55:18.773696 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:55:19 crc kubenswrapper[4917]: I0318 07:55:19.308531 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xdsdz" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="registry-server" probeResult="failure" output=< Mar 18 07:55:19 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 07:55:19 crc kubenswrapper[4917]: > Mar 18 07:55:28 crc kubenswrapper[4917]: I0318 07:55:28.349310 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:28 crc kubenswrapper[4917]: I0318 07:55:28.407066 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:28 crc kubenswrapper[4917]: I0318 07:55:28.597757 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdsdz"] Mar 18 07:55:29 crc kubenswrapper[4917]: I0318 07:55:29.975205 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xdsdz" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="registry-server" containerID="cri-o://c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296" gracePeriod=2 Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.403451 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.482451 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-catalog-content\") pod \"d8def970-f869-44c1-bb45-baeb1c921f09\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.482566 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-utilities\") pod \"d8def970-f869-44c1-bb45-baeb1c921f09\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.482760 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5zg8\" (UniqueName: \"kubernetes.io/projected/d8def970-f869-44c1-bb45-baeb1c921f09-kube-api-access-w5zg8\") pod \"d8def970-f869-44c1-bb45-baeb1c921f09\" (UID: \"d8def970-f869-44c1-bb45-baeb1c921f09\") " Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.483610 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-utilities" (OuterVolumeSpecName: "utilities") pod "d8def970-f869-44c1-bb45-baeb1c921f09" (UID: "d8def970-f869-44c1-bb45-baeb1c921f09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.490607 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8def970-f869-44c1-bb45-baeb1c921f09-kube-api-access-w5zg8" (OuterVolumeSpecName: "kube-api-access-w5zg8") pod "d8def970-f869-44c1-bb45-baeb1c921f09" (UID: "d8def970-f869-44c1-bb45-baeb1c921f09"). InnerVolumeSpecName "kube-api-access-w5zg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.584765 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5zg8\" (UniqueName: \"kubernetes.io/projected/d8def970-f869-44c1-bb45-baeb1c921f09-kube-api-access-w5zg8\") on node \"crc\" DevicePath \"\"" Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.584795 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.635387 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8def970-f869-44c1-bb45-baeb1c921f09" (UID: "d8def970-f869-44c1-bb45-baeb1c921f09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.686468 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8def970-f869-44c1-bb45-baeb1c921f09-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.773911 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:55:30 crc kubenswrapper[4917]: E0318 07:55:30.774423 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:55:30 
crc kubenswrapper[4917]: I0318 07:55:30.988630 4917 generic.go:334] "Generic (PLEG): container finished" podID="d8def970-f869-44c1-bb45-baeb1c921f09" containerID="c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296" exitCode=0 Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.988712 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdsdz" Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.988710 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdsdz" event={"ID":"d8def970-f869-44c1-bb45-baeb1c921f09","Type":"ContainerDied","Data":"c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296"} Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.988880 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdsdz" event={"ID":"d8def970-f869-44c1-bb45-baeb1c921f09","Type":"ContainerDied","Data":"a1664bded889fb8626b5dc010760fe960513e8b62747add053fec4804c0e2382"} Mar 18 07:55:30 crc kubenswrapper[4917]: I0318 07:55:30.988910 4917 scope.go:117] "RemoveContainer" containerID="c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.025037 4917 scope.go:117] "RemoveContainer" containerID="69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.048861 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdsdz"] Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.057113 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xdsdz"] Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.074473 4917 scope.go:117] "RemoveContainer" containerID="5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 
07:55:31.103459 4917 scope.go:117] "RemoveContainer" containerID="c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296" Mar 18 07:55:31 crc kubenswrapper[4917]: E0318 07:55:31.104057 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296\": container with ID starting with c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296 not found: ID does not exist" containerID="c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.104095 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296"} err="failed to get container status \"c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296\": rpc error: code = NotFound desc = could not find container \"c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296\": container with ID starting with c73b8bc28ee01bda738b206abc9b18cc3b102de55614b6d30dace406140c7296 not found: ID does not exist" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.104118 4917 scope.go:117] "RemoveContainer" containerID="69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483" Mar 18 07:55:31 crc kubenswrapper[4917]: E0318 07:55:31.104513 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483\": container with ID starting with 69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483 not found: ID does not exist" containerID="69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.104540 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483"} err="failed to get container status \"69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483\": rpc error: code = NotFound desc = could not find container \"69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483\": container with ID starting with 69377cdc8a4006c855ef4daa1a366f2122cbe0993f446b3a19c446eb9616f483 not found: ID does not exist" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.104556 4917 scope.go:117] "RemoveContainer" containerID="5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee" Mar 18 07:55:31 crc kubenswrapper[4917]: E0318 07:55:31.105104 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee\": container with ID starting with 5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee not found: ID does not exist" containerID="5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.105151 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee"} err="failed to get container status \"5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee\": rpc error: code = NotFound desc = could not find container \"5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee\": container with ID starting with 5aec9e1a984f61fb3119a901686b15e352471fe3eb2ddc9e4d9da45e758c4fee not found: ID does not exist" Mar 18 07:55:31 crc kubenswrapper[4917]: I0318 07:55:31.789832 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" path="/var/lib/kubelet/pods/d8def970-f869-44c1-bb45-baeb1c921f09/volumes" Mar 18 07:55:44 crc kubenswrapper[4917]: I0318 
07:55:44.774219 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:55:44 crc kubenswrapper[4917]: E0318 07:55:44.775434 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:55:57 crc kubenswrapper[4917]: I0318 07:55:57.773216 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:55:57 crc kubenswrapper[4917]: E0318 07:55:57.774022 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.166643 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563676-pb54r"] Mar 18 07:56:00 crc kubenswrapper[4917]: E0318 07:56:00.167657 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="registry-server" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.167691 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="registry-server" Mar 18 07:56:00 crc kubenswrapper[4917]: E0318 07:56:00.167727 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="extract-content" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.167744 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="extract-content" Mar 18 07:56:00 crc kubenswrapper[4917]: E0318 07:56:00.167789 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="extract-utilities" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.167806 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="extract-utilities" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.168137 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8def970-f869-44c1-bb45-baeb1c921f09" containerName="registry-server" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.169122 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563676-pb54r" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.172136 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.173325 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.173683 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.174401 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563676-pb54r"] Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.308502 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6xrz\" (UniqueName: 
\"kubernetes.io/projected/c57f2cb0-a891-427e-a4e3-1ef419741e5c-kube-api-access-g6xrz\") pod \"auto-csr-approver-29563676-pb54r\" (UID: \"c57f2cb0-a891-427e-a4e3-1ef419741e5c\") " pod="openshift-infra/auto-csr-approver-29563676-pb54r" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.409577 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6xrz\" (UniqueName: \"kubernetes.io/projected/c57f2cb0-a891-427e-a4e3-1ef419741e5c-kube-api-access-g6xrz\") pod \"auto-csr-approver-29563676-pb54r\" (UID: \"c57f2cb0-a891-427e-a4e3-1ef419741e5c\") " pod="openshift-infra/auto-csr-approver-29563676-pb54r" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.441418 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6xrz\" (UniqueName: \"kubernetes.io/projected/c57f2cb0-a891-427e-a4e3-1ef419741e5c-kube-api-access-g6xrz\") pod \"auto-csr-approver-29563676-pb54r\" (UID: \"c57f2cb0-a891-427e-a4e3-1ef419741e5c\") " pod="openshift-infra/auto-csr-approver-29563676-pb54r" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.496656 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563676-pb54r" Mar 18 07:56:00 crc kubenswrapper[4917]: I0318 07:56:00.756830 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563676-pb54r"] Mar 18 07:56:01 crc kubenswrapper[4917]: I0318 07:56:01.270205 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563676-pb54r" event={"ID":"c57f2cb0-a891-427e-a4e3-1ef419741e5c","Type":"ContainerStarted","Data":"9c322182070a070690748231836c9d16a079e14a9eb7522ecdcbc617506e7ca6"} Mar 18 07:56:02 crc kubenswrapper[4917]: I0318 07:56:02.279862 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563676-pb54r" event={"ID":"c57f2cb0-a891-427e-a4e3-1ef419741e5c","Type":"ContainerStarted","Data":"f74cc1947fd6115f75ced40312df2aa9e0d67ff7a2456bf48277b0abcf98e9f0"} Mar 18 07:56:02 crc kubenswrapper[4917]: I0318 07:56:02.300664 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563676-pb54r" podStartSLOduration=1.279339336 podStartE2EDuration="2.300644046s" podCreationTimestamp="2026-03-18 07:56:00 +0000 UTC" firstStartedPulling="2026-03-18 07:56:00.767040356 +0000 UTC m=+4145.708195090" lastFinishedPulling="2026-03-18 07:56:01.788345096 +0000 UTC m=+4146.729499800" observedRunningTime="2026-03-18 07:56:02.294442074 +0000 UTC m=+4147.235596818" watchObservedRunningTime="2026-03-18 07:56:02.300644046 +0000 UTC m=+4147.241798780" Mar 18 07:56:03 crc kubenswrapper[4917]: I0318 07:56:03.291759 4917 generic.go:334] "Generic (PLEG): container finished" podID="c57f2cb0-a891-427e-a4e3-1ef419741e5c" containerID="f74cc1947fd6115f75ced40312df2aa9e0d67ff7a2456bf48277b0abcf98e9f0" exitCode=0 Mar 18 07:56:03 crc kubenswrapper[4917]: I0318 07:56:03.291824 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563676-pb54r" 
event={"ID":"c57f2cb0-a891-427e-a4e3-1ef419741e5c","Type":"ContainerDied","Data":"f74cc1947fd6115f75ced40312df2aa9e0d67ff7a2456bf48277b0abcf98e9f0"} Mar 18 07:56:04 crc kubenswrapper[4917]: I0318 07:56:04.741712 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563676-pb54r" Mar 18 07:56:04 crc kubenswrapper[4917]: I0318 07:56:04.777346 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6xrz\" (UniqueName: \"kubernetes.io/projected/c57f2cb0-a891-427e-a4e3-1ef419741e5c-kube-api-access-g6xrz\") pod \"c57f2cb0-a891-427e-a4e3-1ef419741e5c\" (UID: \"c57f2cb0-a891-427e-a4e3-1ef419741e5c\") " Mar 18 07:56:04 crc kubenswrapper[4917]: I0318 07:56:04.783921 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57f2cb0-a891-427e-a4e3-1ef419741e5c-kube-api-access-g6xrz" (OuterVolumeSpecName: "kube-api-access-g6xrz") pod "c57f2cb0-a891-427e-a4e3-1ef419741e5c" (UID: "c57f2cb0-a891-427e-a4e3-1ef419741e5c"). InnerVolumeSpecName "kube-api-access-g6xrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:56:04 crc kubenswrapper[4917]: I0318 07:56:04.879656 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6xrz\" (UniqueName: \"kubernetes.io/projected/c57f2cb0-a891-427e-a4e3-1ef419741e5c-kube-api-access-g6xrz\") on node \"crc\" DevicePath \"\"" Mar 18 07:56:05 crc kubenswrapper[4917]: I0318 07:56:05.315268 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563676-pb54r" event={"ID":"c57f2cb0-a891-427e-a4e3-1ef419741e5c","Type":"ContainerDied","Data":"9c322182070a070690748231836c9d16a079e14a9eb7522ecdcbc617506e7ca6"} Mar 18 07:56:05 crc kubenswrapper[4917]: I0318 07:56:05.315341 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c322182070a070690748231836c9d16a079e14a9eb7522ecdcbc617506e7ca6" Mar 18 07:56:05 crc kubenswrapper[4917]: I0318 07:56:05.315398 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563676-pb54r" Mar 18 07:56:05 crc kubenswrapper[4917]: I0318 07:56:05.431693 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563670-hjzkx"] Mar 18 07:56:05 crc kubenswrapper[4917]: I0318 07:56:05.438979 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563670-hjzkx"] Mar 18 07:56:05 crc kubenswrapper[4917]: I0318 07:56:05.786891 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc85683a-74ce-4fed-a4c6-ce995a4d73d2" path="/var/lib/kubelet/pods/bc85683a-74ce-4fed-a4c6-ce995a4d73d2/volumes" Mar 18 07:56:10 crc kubenswrapper[4917]: I0318 07:56:10.772556 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:56:10 crc kubenswrapper[4917]: E0318 07:56:10.773680 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:56:22 crc kubenswrapper[4917]: I0318 07:56:22.773776 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:56:22 crc kubenswrapper[4917]: E0318 07:56:22.774722 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:56:36 crc kubenswrapper[4917]: I0318 07:56:36.772612 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:56:36 crc kubenswrapper[4917]: E0318 07:56:36.773895 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:56:50 crc kubenswrapper[4917]: I0318 07:56:50.773724 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:56:50 crc kubenswrapper[4917]: E0318 07:56:50.775416 4917 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.324421 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7r5xm"] Mar 18 07:56:55 crc kubenswrapper[4917]: E0318 07:56:55.325322 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57f2cb0-a891-427e-a4e3-1ef419741e5c" containerName="oc" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.325343 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57f2cb0-a891-427e-a4e3-1ef419741e5c" containerName="oc" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.325694 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57f2cb0-a891-427e-a4e3-1ef419741e5c" containerName="oc" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.327785 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.343209 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7r5xm"] Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.464454 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-utilities\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.464620 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckzgc\" (UniqueName: \"kubernetes.io/projected/c98d851e-ea02-4790-b4e4-9ad1e900204b-kube-api-access-ckzgc\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.464773 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-catalog-content\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.566439 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-catalog-content\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.566568 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-utilities\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.566622 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckzgc\" (UniqueName: \"kubernetes.io/projected/c98d851e-ea02-4790-b4e4-9ad1e900204b-kube-api-access-ckzgc\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.567226 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-catalog-content\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.567296 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-utilities\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.932819 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckzgc\" (UniqueName: \"kubernetes.io/projected/c98d851e-ea02-4790-b4e4-9ad1e900204b-kube-api-access-ckzgc\") pod \"certified-operators-7r5xm\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:55 crc kubenswrapper[4917]: I0318 07:56:55.979028 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:56:56 crc kubenswrapper[4917]: I0318 07:56:56.487312 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7r5xm"] Mar 18 07:56:56 crc kubenswrapper[4917]: W0318 07:56:56.494827 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc98d851e_ea02_4790_b4e4_9ad1e900204b.slice/crio-1581fbf1d1d49da48c3c8f2fc1c9dea9c289f3adfdc846e18aeaa5443de69e52 WatchSource:0}: Error finding container 1581fbf1d1d49da48c3c8f2fc1c9dea9c289f3adfdc846e18aeaa5443de69e52: Status 404 returned error can't find the container with id 1581fbf1d1d49da48c3c8f2fc1c9dea9c289f3adfdc846e18aeaa5443de69e52 Mar 18 07:56:56 crc kubenswrapper[4917]: I0318 07:56:56.764949 4917 generic.go:334] "Generic (PLEG): container finished" podID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerID="f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8" exitCode=0 Mar 18 07:56:56 crc kubenswrapper[4917]: I0318 07:56:56.765008 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7r5xm" event={"ID":"c98d851e-ea02-4790-b4e4-9ad1e900204b","Type":"ContainerDied","Data":"f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8"} Mar 18 07:56:56 crc kubenswrapper[4917]: I0318 07:56:56.765040 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7r5xm" event={"ID":"c98d851e-ea02-4790-b4e4-9ad1e900204b","Type":"ContainerStarted","Data":"1581fbf1d1d49da48c3c8f2fc1c9dea9c289f3adfdc846e18aeaa5443de69e52"} Mar 18 07:56:58 crc kubenswrapper[4917]: I0318 07:56:58.793507 4917 generic.go:334] "Generic (PLEG): container finished" podID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerID="fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79" exitCode=0 Mar 18 07:56:58 crc kubenswrapper[4917]: I0318 
07:56:58.793573 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7r5xm" event={"ID":"c98d851e-ea02-4790-b4e4-9ad1e900204b","Type":"ContainerDied","Data":"fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79"} Mar 18 07:56:59 crc kubenswrapper[4917]: I0318 07:56:59.803891 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7r5xm" event={"ID":"c98d851e-ea02-4790-b4e4-9ad1e900204b","Type":"ContainerStarted","Data":"e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7"} Mar 18 07:57:03 crc kubenswrapper[4917]: I0318 07:57:03.772698 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:57:03 crc kubenswrapper[4917]: E0318 07:57:03.773471 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:57:05 crc kubenswrapper[4917]: I0318 07:57:05.737540 4917 scope.go:117] "RemoveContainer" containerID="61a64523c186094a4506210090aac2691b4a22dba4eb6c1044c243be3a5b6a39" Mar 18 07:57:05 crc kubenswrapper[4917]: I0318 07:57:05.980189 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:57:05 crc kubenswrapper[4917]: I0318 07:57:05.980632 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:57:06 crc kubenswrapper[4917]: I0318 07:57:06.062259 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:57:06 crc kubenswrapper[4917]: I0318 07:57:06.092814 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7r5xm" podStartSLOduration=8.637139065 podStartE2EDuration="11.092781685s" podCreationTimestamp="2026-03-18 07:56:55 +0000 UTC" firstStartedPulling="2026-03-18 07:56:56.767012874 +0000 UTC m=+4201.708167588" lastFinishedPulling="2026-03-18 07:56:59.222655454 +0000 UTC m=+4204.163810208" observedRunningTime="2026-03-18 07:56:59.824858635 +0000 UTC m=+4204.766013349" watchObservedRunningTime="2026-03-18 07:57:06.092781685 +0000 UTC m=+4211.033936449" Mar 18 07:57:06 crc kubenswrapper[4917]: I0318 07:57:06.972783 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:57:07 crc kubenswrapper[4917]: I0318 07:57:07.050907 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7r5xm"] Mar 18 07:57:08 crc kubenswrapper[4917]: I0318 07:57:08.912661 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7r5xm" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerName="registry-server" containerID="cri-o://e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7" gracePeriod=2 Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.361424 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.495135 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckzgc\" (UniqueName: \"kubernetes.io/projected/c98d851e-ea02-4790-b4e4-9ad1e900204b-kube-api-access-ckzgc\") pod \"c98d851e-ea02-4790-b4e4-9ad1e900204b\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.495375 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-utilities\") pod \"c98d851e-ea02-4790-b4e4-9ad1e900204b\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.496169 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-catalog-content\") pod \"c98d851e-ea02-4790-b4e4-9ad1e900204b\" (UID: \"c98d851e-ea02-4790-b4e4-9ad1e900204b\") " Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.497061 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-utilities" (OuterVolumeSpecName: "utilities") pod "c98d851e-ea02-4790-b4e4-9ad1e900204b" (UID: "c98d851e-ea02-4790-b4e4-9ad1e900204b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.501709 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98d851e-ea02-4790-b4e4-9ad1e900204b-kube-api-access-ckzgc" (OuterVolumeSpecName: "kube-api-access-ckzgc") pod "c98d851e-ea02-4790-b4e4-9ad1e900204b" (UID: "c98d851e-ea02-4790-b4e4-9ad1e900204b"). InnerVolumeSpecName "kube-api-access-ckzgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.579950 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c98d851e-ea02-4790-b4e4-9ad1e900204b" (UID: "c98d851e-ea02-4790-b4e4-9ad1e900204b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.598177 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.598234 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c98d851e-ea02-4790-b4e4-9ad1e900204b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.598256 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckzgc\" (UniqueName: \"kubernetes.io/projected/c98d851e-ea02-4790-b4e4-9ad1e900204b-kube-api-access-ckzgc\") on node \"crc\" DevicePath \"\"" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.924153 4917 generic.go:334] "Generic (PLEG): container finished" podID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerID="e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7" exitCode=0 Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.924206 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7r5xm" event={"ID":"c98d851e-ea02-4790-b4e4-9ad1e900204b","Type":"ContainerDied","Data":"e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7"} Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.924244 4917 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7r5xm" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.924277 4917 scope.go:117] "RemoveContainer" containerID="e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.924257 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7r5xm" event={"ID":"c98d851e-ea02-4790-b4e4-9ad1e900204b","Type":"ContainerDied","Data":"1581fbf1d1d49da48c3c8f2fc1c9dea9c289f3adfdc846e18aeaa5443de69e52"} Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.954982 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7r5xm"] Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.955952 4917 scope.go:117] "RemoveContainer" containerID="fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79" Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.970967 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7r5xm"] Mar 18 07:57:09 crc kubenswrapper[4917]: I0318 07:57:09.985541 4917 scope.go:117] "RemoveContainer" containerID="f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8" Mar 18 07:57:10 crc kubenswrapper[4917]: I0318 07:57:10.034755 4917 scope.go:117] "RemoveContainer" containerID="e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7" Mar 18 07:57:10 crc kubenswrapper[4917]: E0318 07:57:10.035901 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7\": container with ID starting with e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7 not found: ID does not exist" containerID="e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7" Mar 18 07:57:10 crc kubenswrapper[4917]: I0318 07:57:10.035970 
4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7"} err="failed to get container status \"e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7\": rpc error: code = NotFound desc = could not find container \"e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7\": container with ID starting with e4db3209fcbbdc08dec0d0690bc28a467487da3106a65ad7d0d323ce10c492e7 not found: ID does not exist" Mar 18 07:57:10 crc kubenswrapper[4917]: I0318 07:57:10.036010 4917 scope.go:117] "RemoveContainer" containerID="fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79" Mar 18 07:57:10 crc kubenswrapper[4917]: E0318 07:57:10.037204 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79\": container with ID starting with fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79 not found: ID does not exist" containerID="fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79" Mar 18 07:57:10 crc kubenswrapper[4917]: I0318 07:57:10.037366 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79"} err="failed to get container status \"fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79\": rpc error: code = NotFound desc = could not find container \"fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79\": container with ID starting with fc820dffc8ec471b913b3037a364fe8f73bd92b4aa565c21e9db87b37f03ce79 not found: ID does not exist" Mar 18 07:57:10 crc kubenswrapper[4917]: I0318 07:57:10.037510 4917 scope.go:117] "RemoveContainer" containerID="f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8" Mar 18 07:57:10 crc kubenswrapper[4917]: E0318 
07:57:10.038146 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8\": container with ID starting with f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8 not found: ID does not exist" containerID="f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8" Mar 18 07:57:10 crc kubenswrapper[4917]: I0318 07:57:10.038199 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8"} err="failed to get container status \"f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8\": rpc error: code = NotFound desc = could not find container \"f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8\": container with ID starting with f7bd71ded36db9b506abfa360840df22ea40279e0c0f17cb19d712b7874555f8 not found: ID does not exist" Mar 18 07:57:11 crc kubenswrapper[4917]: I0318 07:57:11.789114 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" path="/var/lib/kubelet/pods/c98d851e-ea02-4790-b4e4-9ad1e900204b/volumes" Mar 18 07:57:14 crc kubenswrapper[4917]: I0318 07:57:14.773547 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:57:14 crc kubenswrapper[4917]: E0318 07:57:14.774579 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:57:28 crc kubenswrapper[4917]: I0318 07:57:28.773008 
4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:57:28 crc kubenswrapper[4917]: E0318 07:57:28.773928 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 07:57:39 crc kubenswrapper[4917]: I0318 07:57:39.773238 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 07:57:40 crc kubenswrapper[4917]: I0318 07:57:40.207193 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"c5ab49504cd4a3cbf79368c9f693de03d31b8e9c6679f12fadae736ae2434a51"} Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.153643 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563678-n8lgk"] Mar 18 07:58:00 crc kubenswrapper[4917]: E0318 07:58:00.154647 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerName="registry-server" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.154669 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerName="registry-server" Mar 18 07:58:00 crc kubenswrapper[4917]: E0318 07:58:00.154692 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerName="extract-content" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.154702 4917 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerName="extract-content" Mar 18 07:58:00 crc kubenswrapper[4917]: E0318 07:58:00.154737 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerName="extract-utilities" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.154748 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerName="extract-utilities" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.154932 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98d851e-ea02-4790-b4e4-9ad1e900204b" containerName="registry-server" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.155659 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.159152 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.160948 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.165424 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563678-n8lgk"] Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.168186 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.264201 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcz4q\" (UniqueName: \"kubernetes.io/projected/277b2e07-a273-498b-b9c6-05e0a1a2c297-kube-api-access-lcz4q\") pod \"auto-csr-approver-29563678-n8lgk\" (UID: \"277b2e07-a273-498b-b9c6-05e0a1a2c297\") " 
pod="openshift-infra/auto-csr-approver-29563678-n8lgk" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.366105 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcz4q\" (UniqueName: \"kubernetes.io/projected/277b2e07-a273-498b-b9c6-05e0a1a2c297-kube-api-access-lcz4q\") pod \"auto-csr-approver-29563678-n8lgk\" (UID: \"277b2e07-a273-498b-b9c6-05e0a1a2c297\") " pod="openshift-infra/auto-csr-approver-29563678-n8lgk" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.389247 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcz4q\" (UniqueName: \"kubernetes.io/projected/277b2e07-a273-498b-b9c6-05e0a1a2c297-kube-api-access-lcz4q\") pod \"auto-csr-approver-29563678-n8lgk\" (UID: \"277b2e07-a273-498b-b9c6-05e0a1a2c297\") " pod="openshift-infra/auto-csr-approver-29563678-n8lgk" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.487575 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.969495 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563678-n8lgk"] Mar 18 07:58:00 crc kubenswrapper[4917]: I0318 07:58:00.978401 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 07:58:01 crc kubenswrapper[4917]: I0318 07:58:01.395220 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" event={"ID":"277b2e07-a273-498b-b9c6-05e0a1a2c297","Type":"ContainerStarted","Data":"cb49bf70c3a4fd58c849799e76f765c6efbf45e28e6e5cf87a085837a8fbd54d"} Mar 18 07:58:02 crc kubenswrapper[4917]: I0318 07:58:02.411746 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" 
event={"ID":"277b2e07-a273-498b-b9c6-05e0a1a2c297","Type":"ContainerStarted","Data":"665f9315bf2bf2d4a05233390fd6e683f3ee5307d41ec139b23c07efda7d2697"} Mar 18 07:58:02 crc kubenswrapper[4917]: I0318 07:58:02.436343 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" podStartSLOduration=1.467958841 podStartE2EDuration="2.436324365s" podCreationTimestamp="2026-03-18 07:58:00 +0000 UTC" firstStartedPulling="2026-03-18 07:58:00.977813613 +0000 UTC m=+4265.918968367" lastFinishedPulling="2026-03-18 07:58:01.946179127 +0000 UTC m=+4266.887333891" observedRunningTime="2026-03-18 07:58:02.430127403 +0000 UTC m=+4267.371282137" watchObservedRunningTime="2026-03-18 07:58:02.436324365 +0000 UTC m=+4267.377479109" Mar 18 07:58:03 crc kubenswrapper[4917]: I0318 07:58:03.421979 4917 generic.go:334] "Generic (PLEG): container finished" podID="277b2e07-a273-498b-b9c6-05e0a1a2c297" containerID="665f9315bf2bf2d4a05233390fd6e683f3ee5307d41ec139b23c07efda7d2697" exitCode=0 Mar 18 07:58:03 crc kubenswrapper[4917]: I0318 07:58:03.422043 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" event={"ID":"277b2e07-a273-498b-b9c6-05e0a1a2c297","Type":"ContainerDied","Data":"665f9315bf2bf2d4a05233390fd6e683f3ee5307d41ec139b23c07efda7d2697"} Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.009680 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.128255 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcz4q\" (UniqueName: \"kubernetes.io/projected/277b2e07-a273-498b-b9c6-05e0a1a2c297-kube-api-access-lcz4q\") pod \"277b2e07-a273-498b-b9c6-05e0a1a2c297\" (UID: \"277b2e07-a273-498b-b9c6-05e0a1a2c297\") " Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.230365 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277b2e07-a273-498b-b9c6-05e0a1a2c297-kube-api-access-lcz4q" (OuterVolumeSpecName: "kube-api-access-lcz4q") pod "277b2e07-a273-498b-b9c6-05e0a1a2c297" (UID: "277b2e07-a273-498b-b9c6-05e0a1a2c297"). InnerVolumeSpecName "kube-api-access-lcz4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.230679 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcz4q\" (UniqueName: \"kubernetes.io/projected/277b2e07-a273-498b-b9c6-05e0a1a2c297-kube-api-access-lcz4q\") on node \"crc\" DevicePath \"\"" Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.439061 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" event={"ID":"277b2e07-a273-498b-b9c6-05e0a1a2c297","Type":"ContainerDied","Data":"cb49bf70c3a4fd58c849799e76f765c6efbf45e28e6e5cf87a085837a8fbd54d"} Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.439118 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb49bf70c3a4fd58c849799e76f765c6efbf45e28e6e5cf87a085837a8fbd54d" Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.439191 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563678-n8lgk" Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.514318 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563672-w7vx8"] Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.524426 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563672-w7vx8"] Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.784298 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e67c910-f202-4145-87fd-8c1666597a0e" path="/var/lib/kubelet/pods/1e67c910-f202-4145-87fd-8c1666597a0e/volumes" Mar 18 07:58:05 crc kubenswrapper[4917]: I0318 07:58:05.815971 4917 scope.go:117] "RemoveContainer" containerID="b1898428e330b84109d5f38a9452448690cd799abd58fcd3e2924cdbeb5ab363" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.201253 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563680-blp4p"] Mar 18 08:00:00 crc kubenswrapper[4917]: E0318 08:00:00.202396 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277b2e07-a273-498b-b9c6-05e0a1a2c297" containerName="oc" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.202420 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="277b2e07-a273-498b-b9c6-05e0a1a2c297" containerName="oc" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.202682 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="277b2e07-a273-498b-b9c6-05e0a1a2c297" containerName="oc" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.203404 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563680-blp4p" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.209538 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx"] Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.210388 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.217928 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563680-blp4p"] Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.218389 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.218563 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.218702 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.218843 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.219076 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.229623 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx"] Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.305245 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/713196bd-eb4e-42a5-b17e-265566a93719-config-volume\") pod \"collect-profiles-29563680-jk8mx\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.305575 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dhh2\" (UniqueName: \"kubernetes.io/projected/1a5bb6b9-4df3-4211-8e19-7077e71eb072-kube-api-access-2dhh2\") pod \"auto-csr-approver-29563680-blp4p\" (UID: \"1a5bb6b9-4df3-4211-8e19-7077e71eb072\") " pod="openshift-infra/auto-csr-approver-29563680-blp4p" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.305627 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssccs\" (UniqueName: \"kubernetes.io/projected/713196bd-eb4e-42a5-b17e-265566a93719-kube-api-access-ssccs\") pod \"collect-profiles-29563680-jk8mx\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.305670 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/713196bd-eb4e-42a5-b17e-265566a93719-secret-volume\") pod \"collect-profiles-29563680-jk8mx\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.407238 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/713196bd-eb4e-42a5-b17e-265566a93719-secret-volume\") pod \"collect-profiles-29563680-jk8mx\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 
08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.407400 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/713196bd-eb4e-42a5-b17e-265566a93719-config-volume\") pod \"collect-profiles-29563680-jk8mx\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.407451 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dhh2\" (UniqueName: \"kubernetes.io/projected/1a5bb6b9-4df3-4211-8e19-7077e71eb072-kube-api-access-2dhh2\") pod \"auto-csr-approver-29563680-blp4p\" (UID: \"1a5bb6b9-4df3-4211-8e19-7077e71eb072\") " pod="openshift-infra/auto-csr-approver-29563680-blp4p" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.407502 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssccs\" (UniqueName: \"kubernetes.io/projected/713196bd-eb4e-42a5-b17e-265566a93719-kube-api-access-ssccs\") pod \"collect-profiles-29563680-jk8mx\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.408648 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/713196bd-eb4e-42a5-b17e-265566a93719-config-volume\") pod \"collect-profiles-29563680-jk8mx\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.415775 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/713196bd-eb4e-42a5-b17e-265566a93719-secret-volume\") pod \"collect-profiles-29563680-jk8mx\" (UID: 
\"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.424885 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssccs\" (UniqueName: \"kubernetes.io/projected/713196bd-eb4e-42a5-b17e-265566a93719-kube-api-access-ssccs\") pod \"collect-profiles-29563680-jk8mx\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.432352 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dhh2\" (UniqueName: \"kubernetes.io/projected/1a5bb6b9-4df3-4211-8e19-7077e71eb072-kube-api-access-2dhh2\") pod \"auto-csr-approver-29563680-blp4p\" (UID: \"1a5bb6b9-4df3-4211-8e19-7077e71eb072\") " pod="openshift-infra/auto-csr-approver-29563680-blp4p" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.540641 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563680-blp4p" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.549745 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.820373 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx"] Mar 18 08:00:00 crc kubenswrapper[4917]: I0318 08:00:00.993552 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563680-blp4p"] Mar 18 08:00:01 crc kubenswrapper[4917]: I0318 08:00:01.592843 4917 generic.go:334] "Generic (PLEG): container finished" podID="713196bd-eb4e-42a5-b17e-265566a93719" containerID="a57c4f67a3b1fce162b3fe2bcd969d8a103a309c852e882cfab522ce663ced76" exitCode=0 Mar 18 08:00:01 crc kubenswrapper[4917]: I0318 08:00:01.593083 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" event={"ID":"713196bd-eb4e-42a5-b17e-265566a93719","Type":"ContainerDied","Data":"a57c4f67a3b1fce162b3fe2bcd969d8a103a309c852e882cfab522ce663ced76"} Mar 18 08:00:01 crc kubenswrapper[4917]: I0318 08:00:01.593349 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" event={"ID":"713196bd-eb4e-42a5-b17e-265566a93719","Type":"ContainerStarted","Data":"b267872c49ffcbec9baa5ed3b2ae6b2ae776fe2ddbe89acf3937161eaddc36da"} Mar 18 08:00:01 crc kubenswrapper[4917]: I0318 08:00:01.598183 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563680-blp4p" event={"ID":"1a5bb6b9-4df3-4211-8e19-7077e71eb072","Type":"ContainerStarted","Data":"fd27e9e2f460e328817a4fa53c9223d0ac7a2e238bc74dc38685c3f0ddb17882"} Mar 18 08:00:02 crc kubenswrapper[4917]: I0318 08:00:02.929283 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:00:02 crc kubenswrapper[4917]: I0318 08:00:02.929632 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.016464 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.046066 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/713196bd-eb4e-42a5-b17e-265566a93719-config-volume\") pod \"713196bd-eb4e-42a5-b17e-265566a93719\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.046132 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/713196bd-eb4e-42a5-b17e-265566a93719-secret-volume\") pod \"713196bd-eb4e-42a5-b17e-265566a93719\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.046241 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssccs\" (UniqueName: \"kubernetes.io/projected/713196bd-eb4e-42a5-b17e-265566a93719-kube-api-access-ssccs\") pod \"713196bd-eb4e-42a5-b17e-265566a93719\" (UID: \"713196bd-eb4e-42a5-b17e-265566a93719\") " Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.048685 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/713196bd-eb4e-42a5-b17e-265566a93719-config-volume" (OuterVolumeSpecName: "config-volume") pod "713196bd-eb4e-42a5-b17e-265566a93719" (UID: "713196bd-eb4e-42a5-b17e-265566a93719"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.053123 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713196bd-eb4e-42a5-b17e-265566a93719-kube-api-access-ssccs" (OuterVolumeSpecName: "kube-api-access-ssccs") pod "713196bd-eb4e-42a5-b17e-265566a93719" (UID: "713196bd-eb4e-42a5-b17e-265566a93719"). InnerVolumeSpecName "kube-api-access-ssccs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.053317 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713196bd-eb4e-42a5-b17e-265566a93719-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "713196bd-eb4e-42a5-b17e-265566a93719" (UID: "713196bd-eb4e-42a5-b17e-265566a93719"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.148195 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/713196bd-eb4e-42a5-b17e-265566a93719-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.148255 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/713196bd-eb4e-42a5-b17e-265566a93719-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.148275 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssccs\" (UniqueName: \"kubernetes.io/projected/713196bd-eb4e-42a5-b17e-265566a93719-kube-api-access-ssccs\") on node \"crc\" DevicePath \"\"" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.619423 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" event={"ID":"713196bd-eb4e-42a5-b17e-265566a93719","Type":"ContainerDied","Data":"b267872c49ffcbec9baa5ed3b2ae6b2ae776fe2ddbe89acf3937161eaddc36da"} Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.619479 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b267872c49ffcbec9baa5ed3b2ae6b2ae776fe2ddbe89acf3937161eaddc36da" Mar 18 08:00:03 crc kubenswrapper[4917]: I0318 08:00:03.620070 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx" Mar 18 08:00:04 crc kubenswrapper[4917]: I0318 08:00:04.120108 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv"] Mar 18 08:00:04 crc kubenswrapper[4917]: I0318 08:00:04.130665 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563635-gspnv"] Mar 18 08:00:04 crc kubenswrapper[4917]: I0318 08:00:04.631121 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563680-blp4p" event={"ID":"1a5bb6b9-4df3-4211-8e19-7077e71eb072","Type":"ContainerStarted","Data":"294971aa5e589627de978c972b9332a4c63c8c1bc1d3ae87458055cdf6a87cd7"} Mar 18 08:00:04 crc kubenswrapper[4917]: I0318 08:00:04.656152 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563680-blp4p" podStartSLOduration=1.4888716020000001 podStartE2EDuration="4.656123705s" podCreationTimestamp="2026-03-18 08:00:00 +0000 UTC" firstStartedPulling="2026-03-18 08:00:00.99606384 +0000 UTC m=+4385.937218594" lastFinishedPulling="2026-03-18 08:00:04.163315963 +0000 UTC m=+4389.104470697" observedRunningTime="2026-03-18 08:00:04.644174657 +0000 UTC m=+4389.585329411" watchObservedRunningTime="2026-03-18 08:00:04.656123705 +0000 UTC m=+4389.597278429" Mar 18 08:00:05 crc kubenswrapper[4917]: I0318 08:00:05.642572 4917 generic.go:334] "Generic (PLEG): container finished" podID="1a5bb6b9-4df3-4211-8e19-7077e71eb072" containerID="294971aa5e589627de978c972b9332a4c63c8c1bc1d3ae87458055cdf6a87cd7" exitCode=0 Mar 18 08:00:05 crc kubenswrapper[4917]: I0318 08:00:05.642676 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563680-blp4p" 
event={"ID":"1a5bb6b9-4df3-4211-8e19-7077e71eb072","Type":"ContainerDied","Data":"294971aa5e589627de978c972b9332a4c63c8c1bc1d3ae87458055cdf6a87cd7"} Mar 18 08:00:05 crc kubenswrapper[4917]: I0318 08:00:05.790578 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e92583-8234-44fb-9d55-eedaaaf5bc93" path="/var/lib/kubelet/pods/45e92583-8234-44fb-9d55-eedaaaf5bc93/volumes" Mar 18 08:00:05 crc kubenswrapper[4917]: I0318 08:00:05.947435 4917 scope.go:117] "RemoveContainer" containerID="22270db9537b0b78ef3ed19d802b4a00b2fda30aee694f1a251e366cf3bab844" Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.096391 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563680-blp4p" Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.220483 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dhh2\" (UniqueName: \"kubernetes.io/projected/1a5bb6b9-4df3-4211-8e19-7077e71eb072-kube-api-access-2dhh2\") pod \"1a5bb6b9-4df3-4211-8e19-7077e71eb072\" (UID: \"1a5bb6b9-4df3-4211-8e19-7077e71eb072\") " Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.229945 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a5bb6b9-4df3-4211-8e19-7077e71eb072-kube-api-access-2dhh2" (OuterVolumeSpecName: "kube-api-access-2dhh2") pod "1a5bb6b9-4df3-4211-8e19-7077e71eb072" (UID: "1a5bb6b9-4df3-4211-8e19-7077e71eb072"). InnerVolumeSpecName "kube-api-access-2dhh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.322442 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dhh2\" (UniqueName: \"kubernetes.io/projected/1a5bb6b9-4df3-4211-8e19-7077e71eb072-kube-api-access-2dhh2\") on node \"crc\" DevicePath \"\"" Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.663640 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563680-blp4p" Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.663629 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563680-blp4p" event={"ID":"1a5bb6b9-4df3-4211-8e19-7077e71eb072","Type":"ContainerDied","Data":"fd27e9e2f460e328817a4fa53c9223d0ac7a2e238bc74dc38685c3f0ddb17882"} Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.663817 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd27e9e2f460e328817a4fa53c9223d0ac7a2e238bc74dc38685c3f0ddb17882" Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.730940 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563674-zgzjb"] Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.741885 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563674-zgzjb"] Mar 18 08:00:07 crc kubenswrapper[4917]: I0318 08:00:07.789422 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64d73ee-a6dd-42af-b1f5-ccbf24ee8292" path="/var/lib/kubelet/pods/a64d73ee-a6dd-42af-b1f5-ccbf24ee8292/volumes" Mar 18 08:00:32 crc kubenswrapper[4917]: I0318 08:00:32.929498 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 08:00:32 crc kubenswrapper[4917]: I0318 08:00:32.930221 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:01:02 crc kubenswrapper[4917]: I0318 08:01:02.929740 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:01:02 crc kubenswrapper[4917]: I0318 08:01:02.930475 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:01:02 crc kubenswrapper[4917]: I0318 08:01:02.930545 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:01:02 crc kubenswrapper[4917]: I0318 08:01:02.931360 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5ab49504cd4a3cbf79368c9f693de03d31b8e9c6679f12fadae736ae2434a51"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:01:02 crc kubenswrapper[4917]: I0318 08:01:02.931469 4917 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://c5ab49504cd4a3cbf79368c9f693de03d31b8e9c6679f12fadae736ae2434a51" gracePeriod=600 Mar 18 08:01:03 crc kubenswrapper[4917]: I0318 08:01:03.198487 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="c5ab49504cd4a3cbf79368c9f693de03d31b8e9c6679f12fadae736ae2434a51" exitCode=0 Mar 18 08:01:03 crc kubenswrapper[4917]: I0318 08:01:03.198684 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"c5ab49504cd4a3cbf79368c9f693de03d31b8e9c6679f12fadae736ae2434a51"} Mar 18 08:01:03 crc kubenswrapper[4917]: I0318 08:01:03.198920 4917 scope.go:117] "RemoveContainer" containerID="cd5ddaceffb3e07bd0429d8b9a8da1927b1def26f2078192c9d69560d9484f21" Mar 18 08:01:04 crc kubenswrapper[4917]: I0318 08:01:04.211548 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b"} Mar 18 08:01:06 crc kubenswrapper[4917]: I0318 08:01:06.024446 4917 scope.go:117] "RemoveContainer" containerID="0b394a0b47f6272235de311e9fc7b141d01bba0d58f651618789a9944e7590e5" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.374169 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-bd8lr"] Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.404267 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-bd8lr"] Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.481385 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["crc-storage/crc-storage-crc-tn4xd"] Mar 18 08:01:59 crc kubenswrapper[4917]: E0318 08:01:59.481851 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5bb6b9-4df3-4211-8e19-7077e71eb072" containerName="oc" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.481916 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5bb6b9-4df3-4211-8e19-7077e71eb072" containerName="oc" Mar 18 08:01:59 crc kubenswrapper[4917]: E0318 08:01:59.481955 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713196bd-eb4e-42a5-b17e-265566a93719" containerName="collect-profiles" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.481970 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="713196bd-eb4e-42a5-b17e-265566a93719" containerName="collect-profiles" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.482234 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="713196bd-eb4e-42a5-b17e-265566a93719" containerName="collect-profiles" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.482264 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5bb6b9-4df3-4211-8e19-7077e71eb072" containerName="oc" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.483034 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.485241 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.485766 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.486020 4917 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-p9ph2" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.493267 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.503857 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tn4xd"] Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.505862 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/65ceffa7-f3d8-438f-a68f-f374329a27e2-node-mnt\") pod \"crc-storage-crc-tn4xd\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.506096 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/65ceffa7-f3d8-438f-a68f-f374329a27e2-crc-storage\") pod \"crc-storage-crc-tn4xd\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.506381 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76pdc\" (UniqueName: \"kubernetes.io/projected/65ceffa7-f3d8-438f-a68f-f374329a27e2-kube-api-access-76pdc\") pod \"crc-storage-crc-tn4xd\" (UID: 
\"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.608331 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/65ceffa7-f3d8-438f-a68f-f374329a27e2-node-mnt\") pod \"crc-storage-crc-tn4xd\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.608379 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/65ceffa7-f3d8-438f-a68f-f374329a27e2-crc-storage\") pod \"crc-storage-crc-tn4xd\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.608455 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76pdc\" (UniqueName: \"kubernetes.io/projected/65ceffa7-f3d8-438f-a68f-f374329a27e2-kube-api-access-76pdc\") pod \"crc-storage-crc-tn4xd\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.608840 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/65ceffa7-f3d8-438f-a68f-f374329a27e2-node-mnt\") pod \"crc-storage-crc-tn4xd\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.609427 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/65ceffa7-f3d8-438f-a68f-f374329a27e2-crc-storage\") pod \"crc-storage-crc-tn4xd\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.631819 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76pdc\" (UniqueName: \"kubernetes.io/projected/65ceffa7-f3d8-438f-a68f-f374329a27e2-kube-api-access-76pdc\") pod \"crc-storage-crc-tn4xd\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.794079 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="984ea079-192f-436b-bde0-0b8bc6df9fe8" path="/var/lib/kubelet/pods/984ea079-192f-436b-bde0-0b8bc6df9fe8/volumes" Mar 18 08:01:59 crc kubenswrapper[4917]: I0318 08:01:59.818908 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.137384 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563682-mgm58"] Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.138288 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563682-mgm58" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.140631 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.140989 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.141228 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.151234 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563682-mgm58"] Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.217851 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tshs2\" (UniqueName: \"kubernetes.io/projected/7b289f9a-33e9-449f-9ddb-5fdd1d92b79f-kube-api-access-tshs2\") pod \"auto-csr-approver-29563682-mgm58\" (UID: \"7b289f9a-33e9-449f-9ddb-5fdd1d92b79f\") " pod="openshift-infra/auto-csr-approver-29563682-mgm58" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.265168 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-tn4xd"] Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.319673 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tshs2\" (UniqueName: \"kubernetes.io/projected/7b289f9a-33e9-449f-9ddb-5fdd1d92b79f-kube-api-access-tshs2\") pod \"auto-csr-approver-29563682-mgm58\" (UID: \"7b289f9a-33e9-449f-9ddb-5fdd1d92b79f\") " pod="openshift-infra/auto-csr-approver-29563682-mgm58" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.340814 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tshs2\" (UniqueName: 
\"kubernetes.io/projected/7b289f9a-33e9-449f-9ddb-5fdd1d92b79f-kube-api-access-tshs2\") pod \"auto-csr-approver-29563682-mgm58\" (UID: \"7b289f9a-33e9-449f-9ddb-5fdd1d92b79f\") " pod="openshift-infra/auto-csr-approver-29563682-mgm58" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.460070 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563682-mgm58" Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.722431 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tn4xd" event={"ID":"65ceffa7-f3d8-438f-a68f-f374329a27e2","Type":"ContainerStarted","Data":"e6d825a2343ee7a7a3a9a18fd00aa7ef4c85834ff5eef4f1b34460e8126089a3"} Mar 18 08:02:00 crc kubenswrapper[4917]: I0318 08:02:00.986186 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563682-mgm58"] Mar 18 08:02:00 crc kubenswrapper[4917]: W0318 08:02:00.997337 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b289f9a_33e9_449f_9ddb_5fdd1d92b79f.slice/crio-364762eb7c869ea4cbbb194889dcacca8fbae8ac1ec6f52d96919c5b44f5f9f9 WatchSource:0}: Error finding container 364762eb7c869ea4cbbb194889dcacca8fbae8ac1ec6f52d96919c5b44f5f9f9: Status 404 returned error can't find the container with id 364762eb7c869ea4cbbb194889dcacca8fbae8ac1ec6f52d96919c5b44f5f9f9 Mar 18 08:02:01 crc kubenswrapper[4917]: I0318 08:02:01.735450 4917 generic.go:334] "Generic (PLEG): container finished" podID="65ceffa7-f3d8-438f-a68f-f374329a27e2" containerID="44ccfa5f00f482cb279c279053272c502518cb70f2a28df13579dcd6ec0c22ac" exitCode=0 Mar 18 08:02:01 crc kubenswrapper[4917]: I0318 08:02:01.735550 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tn4xd" 
event={"ID":"65ceffa7-f3d8-438f-a68f-f374329a27e2","Type":"ContainerDied","Data":"44ccfa5f00f482cb279c279053272c502518cb70f2a28df13579dcd6ec0c22ac"} Mar 18 08:02:01 crc kubenswrapper[4917]: I0318 08:02:01.738640 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563682-mgm58" event={"ID":"7b289f9a-33e9-449f-9ddb-5fdd1d92b79f","Type":"ContainerStarted","Data":"364762eb7c869ea4cbbb194889dcacca8fbae8ac1ec6f52d96919c5b44f5f9f9"} Mar 18 08:02:02 crc kubenswrapper[4917]: I0318 08:02:02.750724 4917 generic.go:334] "Generic (PLEG): container finished" podID="7b289f9a-33e9-449f-9ddb-5fdd1d92b79f" containerID="b760d7516f5496e065370afe109fa37cdb48790ec80da757afa71f0d60e4e792" exitCode=0 Mar 18 08:02:02 crc kubenswrapper[4917]: I0318 08:02:02.750871 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563682-mgm58" event={"ID":"7b289f9a-33e9-449f-9ddb-5fdd1d92b79f","Type":"ContainerDied","Data":"b760d7516f5496e065370afe109fa37cdb48790ec80da757afa71f0d60e4e792"} Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.175763 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.266198 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/65ceffa7-f3d8-438f-a68f-f374329a27e2-crc-storage\") pod \"65ceffa7-f3d8-438f-a68f-f374329a27e2\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.266318 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76pdc\" (UniqueName: \"kubernetes.io/projected/65ceffa7-f3d8-438f-a68f-f374329a27e2-kube-api-access-76pdc\") pod \"65ceffa7-f3d8-438f-a68f-f374329a27e2\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.266341 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/65ceffa7-f3d8-438f-a68f-f374329a27e2-node-mnt\") pod \"65ceffa7-f3d8-438f-a68f-f374329a27e2\" (UID: \"65ceffa7-f3d8-438f-a68f-f374329a27e2\") " Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.266518 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65ceffa7-f3d8-438f-a68f-f374329a27e2-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "65ceffa7-f3d8-438f-a68f-f374329a27e2" (UID: "65ceffa7-f3d8-438f-a68f-f374329a27e2"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.273855 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65ceffa7-f3d8-438f-a68f-f374329a27e2-kube-api-access-76pdc" (OuterVolumeSpecName: "kube-api-access-76pdc") pod "65ceffa7-f3d8-438f-a68f-f374329a27e2" (UID: "65ceffa7-f3d8-438f-a68f-f374329a27e2"). InnerVolumeSpecName "kube-api-access-76pdc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.293481 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65ceffa7-f3d8-438f-a68f-f374329a27e2-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "65ceffa7-f3d8-438f-a68f-f374329a27e2" (UID: "65ceffa7-f3d8-438f-a68f-f374329a27e2"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.367711 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76pdc\" (UniqueName: \"kubernetes.io/projected/65ceffa7-f3d8-438f-a68f-f374329a27e2-kube-api-access-76pdc\") on node \"crc\" DevicePath \"\"" Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.367745 4917 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/65ceffa7-f3d8-438f-a68f-f374329a27e2-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.367756 4917 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/65ceffa7-f3d8-438f-a68f-f374329a27e2-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.763972 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-tn4xd" Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.763962 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-tn4xd" event={"ID":"65ceffa7-f3d8-438f-a68f-f374329a27e2","Type":"ContainerDied","Data":"e6d825a2343ee7a7a3a9a18fd00aa7ef4c85834ff5eef4f1b34460e8126089a3"} Mar 18 08:02:03 crc kubenswrapper[4917]: I0318 08:02:03.764059 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6d825a2343ee7a7a3a9a18fd00aa7ef4c85834ff5eef4f1b34460e8126089a3" Mar 18 08:02:04 crc kubenswrapper[4917]: I0318 08:02:04.285795 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563682-mgm58" Mar 18 08:02:04 crc kubenswrapper[4917]: I0318 08:02:04.483052 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tshs2\" (UniqueName: \"kubernetes.io/projected/7b289f9a-33e9-449f-9ddb-5fdd1d92b79f-kube-api-access-tshs2\") pod \"7b289f9a-33e9-449f-9ddb-5fdd1d92b79f\" (UID: \"7b289f9a-33e9-449f-9ddb-5fdd1d92b79f\") " Mar 18 08:02:04 crc kubenswrapper[4917]: I0318 08:02:04.487058 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b289f9a-33e9-449f-9ddb-5fdd1d92b79f-kube-api-access-tshs2" (OuterVolumeSpecName: "kube-api-access-tshs2") pod "7b289f9a-33e9-449f-9ddb-5fdd1d92b79f" (UID: "7b289f9a-33e9-449f-9ddb-5fdd1d92b79f"). InnerVolumeSpecName "kube-api-access-tshs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:02:04 crc kubenswrapper[4917]: I0318 08:02:04.584715 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tshs2\" (UniqueName: \"kubernetes.io/projected/7b289f9a-33e9-449f-9ddb-5fdd1d92b79f-kube-api-access-tshs2\") on node \"crc\" DevicePath \"\"" Mar 18 08:02:04 crc kubenswrapper[4917]: I0318 08:02:04.774795 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563682-mgm58" event={"ID":"7b289f9a-33e9-449f-9ddb-5fdd1d92b79f","Type":"ContainerDied","Data":"364762eb7c869ea4cbbb194889dcacca8fbae8ac1ec6f52d96919c5b44f5f9f9"} Mar 18 08:02:04 crc kubenswrapper[4917]: I0318 08:02:04.774889 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="364762eb7c869ea4cbbb194889dcacca8fbae8ac1ec6f52d96919c5b44f5f9f9" Mar 18 08:02:04 crc kubenswrapper[4917]: I0318 08:02:04.774888 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563682-mgm58" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.249370 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-tn4xd"] Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.255968 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-tn4xd"] Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.351013 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563676-pb54r"] Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.360273 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563676-pb54r"] Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.368042 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-7mvs4"] Mar 18 08:02:05 crc kubenswrapper[4917]: E0318 08:02:05.368343 4917 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="65ceffa7-f3d8-438f-a68f-f374329a27e2" containerName="storage" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.368362 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="65ceffa7-f3d8-438f-a68f-f374329a27e2" containerName="storage" Mar 18 08:02:05 crc kubenswrapper[4917]: E0318 08:02:05.368393 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b289f9a-33e9-449f-9ddb-5fdd1d92b79f" containerName="oc" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.368403 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b289f9a-33e9-449f-9ddb-5fdd1d92b79f" containerName="oc" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.368602 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="65ceffa7-f3d8-438f-a68f-f374329a27e2" containerName="storage" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.368620 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b289f9a-33e9-449f-9ddb-5fdd1d92b79f" containerName="oc" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.369095 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.371035 4917 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-p9ph2" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.371386 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.371855 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.372574 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.379958 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7mvs4"] Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.497766 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/48f5adc3-0b57-42d8-a18d-340a703725b2-crc-storage\") pod \"crc-storage-crc-7mvs4\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.498061 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/48f5adc3-0b57-42d8-a18d-340a703725b2-node-mnt\") pod \"crc-storage-crc-7mvs4\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.498085 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m5cq\" (UniqueName: \"kubernetes.io/projected/48f5adc3-0b57-42d8-a18d-340a703725b2-kube-api-access-7m5cq\") pod \"crc-storage-crc-7mvs4\" (UID: 
\"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.599117 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/48f5adc3-0b57-42d8-a18d-340a703725b2-crc-storage\") pod \"crc-storage-crc-7mvs4\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.599204 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/48f5adc3-0b57-42d8-a18d-340a703725b2-node-mnt\") pod \"crc-storage-crc-7mvs4\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.599238 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m5cq\" (UniqueName: \"kubernetes.io/projected/48f5adc3-0b57-42d8-a18d-340a703725b2-kube-api-access-7m5cq\") pod \"crc-storage-crc-7mvs4\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.599629 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/48f5adc3-0b57-42d8-a18d-340a703725b2-node-mnt\") pod \"crc-storage-crc-7mvs4\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.600050 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/48f5adc3-0b57-42d8-a18d-340a703725b2-crc-storage\") pod \"crc-storage-crc-7mvs4\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.621002 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m5cq\" (UniqueName: \"kubernetes.io/projected/48f5adc3-0b57-42d8-a18d-340a703725b2-kube-api-access-7m5cq\") pod \"crc-storage-crc-7mvs4\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.692202 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.792623 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65ceffa7-f3d8-438f-a68f-f374329a27e2" path="/var/lib/kubelet/pods/65ceffa7-f3d8-438f-a68f-f374329a27e2/volumes" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.794199 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57f2cb0-a891-427e-a4e3-1ef419741e5c" path="/var/lib/kubelet/pods/c57f2cb0-a891-427e-a4e3-1ef419741e5c/volumes" Mar 18 08:02:05 crc kubenswrapper[4917]: I0318 08:02:05.923339 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-7mvs4"] Mar 18 08:02:06 crc kubenswrapper[4917]: I0318 08:02:06.102979 4917 scope.go:117] "RemoveContainer" containerID="f670e74d63a6066383ea159d4a2a66749784be47b4bc831fa5b46e8221d75c25" Mar 18 08:02:06 crc kubenswrapper[4917]: W0318 08:02:06.336278 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48f5adc3_0b57_42d8_a18d_340a703725b2.slice/crio-b129a101b4bc895b8ec650aa16b9ad479cd3f3698402df0e280616d788f6baa8 WatchSource:0}: Error finding container b129a101b4bc895b8ec650aa16b9ad479cd3f3698402df0e280616d788f6baa8: Status 404 returned error can't find the container with id b129a101b4bc895b8ec650aa16b9ad479cd3f3698402df0e280616d788f6baa8 Mar 18 08:02:06 crc kubenswrapper[4917]: I0318 08:02:06.448578 4917 scope.go:117] "RemoveContainer" 
containerID="f74cc1947fd6115f75ced40312df2aa9e0d67ff7a2456bf48277b0abcf98e9f0" Mar 18 08:02:06 crc kubenswrapper[4917]: I0318 08:02:06.801369 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7mvs4" event={"ID":"48f5adc3-0b57-42d8-a18d-340a703725b2","Type":"ContainerStarted","Data":"b129a101b4bc895b8ec650aa16b9ad479cd3f3698402df0e280616d788f6baa8"} Mar 18 08:02:07 crc kubenswrapper[4917]: I0318 08:02:07.812380 4917 generic.go:334] "Generic (PLEG): container finished" podID="48f5adc3-0b57-42d8-a18d-340a703725b2" containerID="2cdb91d337f34f102060a70ed23a8f9f04151b63f591085e49492071be3927f9" exitCode=0 Mar 18 08:02:07 crc kubenswrapper[4917]: I0318 08:02:07.812442 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7mvs4" event={"ID":"48f5adc3-0b57-42d8-a18d-340a703725b2","Type":"ContainerDied","Data":"2cdb91d337f34f102060a70ed23a8f9f04151b63f591085e49492071be3927f9"} Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.156648 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.271975 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/48f5adc3-0b57-42d8-a18d-340a703725b2-node-mnt\") pod \"48f5adc3-0b57-42d8-a18d-340a703725b2\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.272055 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/48f5adc3-0b57-42d8-a18d-340a703725b2-crc-storage\") pod \"48f5adc3-0b57-42d8-a18d-340a703725b2\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.272090 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48f5adc3-0b57-42d8-a18d-340a703725b2-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "48f5adc3-0b57-42d8-a18d-340a703725b2" (UID: "48f5adc3-0b57-42d8-a18d-340a703725b2"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.272247 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m5cq\" (UniqueName: \"kubernetes.io/projected/48f5adc3-0b57-42d8-a18d-340a703725b2-kube-api-access-7m5cq\") pod \"48f5adc3-0b57-42d8-a18d-340a703725b2\" (UID: \"48f5adc3-0b57-42d8-a18d-340a703725b2\") " Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.272831 4917 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/48f5adc3-0b57-42d8-a18d-340a703725b2-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.277715 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f5adc3-0b57-42d8-a18d-340a703725b2-kube-api-access-7m5cq" (OuterVolumeSpecName: "kube-api-access-7m5cq") pod "48f5adc3-0b57-42d8-a18d-340a703725b2" (UID: "48f5adc3-0b57-42d8-a18d-340a703725b2"). InnerVolumeSpecName "kube-api-access-7m5cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.298432 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48f5adc3-0b57-42d8-a18d-340a703725b2-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "48f5adc3-0b57-42d8-a18d-340a703725b2" (UID: "48f5adc3-0b57-42d8-a18d-340a703725b2"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.374076 4917 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/48f5adc3-0b57-42d8-a18d-340a703725b2-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.374113 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m5cq\" (UniqueName: \"kubernetes.io/projected/48f5adc3-0b57-42d8-a18d-340a703725b2-kube-api-access-7m5cq\") on node \"crc\" DevicePath \"\"" Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.829394 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-7mvs4" event={"ID":"48f5adc3-0b57-42d8-a18d-340a703725b2","Type":"ContainerDied","Data":"b129a101b4bc895b8ec650aa16b9ad479cd3f3698402df0e280616d788f6baa8"} Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.829431 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b129a101b4bc895b8ec650aa16b9ad479cd3f3698402df0e280616d788f6baa8" Mar 18 08:02:09 crc kubenswrapper[4917]: I0318 08:02:09.829482 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-7mvs4" Mar 18 08:03:32 crc kubenswrapper[4917]: I0318 08:03:32.928848 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:03:32 crc kubenswrapper[4917]: I0318 08:03:32.931100 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.471686 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lrnfk"] Mar 18 08:03:47 crc kubenswrapper[4917]: E0318 08:03:47.472884 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f5adc3-0b57-42d8-a18d-340a703725b2" containerName="storage" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.472910 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f5adc3-0b57-42d8-a18d-340a703725b2" containerName="storage" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.473238 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f5adc3-0b57-42d8-a18d-340a703725b2" containerName="storage" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.476395 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.486028 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrnfk"] Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.583045 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11b8a666-98f1-4d91-aae4-259d185a772a-utilities\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.583148 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wl7w\" (UniqueName: \"kubernetes.io/projected/11b8a666-98f1-4d91-aae4-259d185a772a-kube-api-access-6wl7w\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.583174 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11b8a666-98f1-4d91-aae4-259d185a772a-catalog-content\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.684790 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wl7w\" (UniqueName: \"kubernetes.io/projected/11b8a666-98f1-4d91-aae4-259d185a772a-kube-api-access-6wl7w\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.684844 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11b8a666-98f1-4d91-aae4-259d185a772a-catalog-content\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.684925 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11b8a666-98f1-4d91-aae4-259d185a772a-utilities\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.685347 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11b8a666-98f1-4d91-aae4-259d185a772a-utilities\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.685468 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11b8a666-98f1-4d91-aae4-259d185a772a-catalog-content\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.723509 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wl7w\" (UniqueName: \"kubernetes.io/projected/11b8a666-98f1-4d91-aae4-259d185a772a-kube-api-access-6wl7w\") pod \"community-operators-lrnfk\" (UID: \"11b8a666-98f1-4d91-aae4-259d185a772a\") " pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:47 crc kubenswrapper[4917]: I0318 08:03:47.810620 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:48 crc kubenswrapper[4917]: I0318 08:03:48.382645 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrnfk"] Mar 18 08:03:48 crc kubenswrapper[4917]: I0318 08:03:48.668548 4917 generic.go:334] "Generic (PLEG): container finished" podID="11b8a666-98f1-4d91-aae4-259d185a772a" containerID="29900e45f919c7ce25ba115915d0fa50262d3e2e885aedfebcd9922d8e1bbdf3" exitCode=0 Mar 18 08:03:48 crc kubenswrapper[4917]: I0318 08:03:48.668650 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrnfk" event={"ID":"11b8a666-98f1-4d91-aae4-259d185a772a","Type":"ContainerDied","Data":"29900e45f919c7ce25ba115915d0fa50262d3e2e885aedfebcd9922d8e1bbdf3"} Mar 18 08:03:48 crc kubenswrapper[4917]: I0318 08:03:48.668681 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrnfk" event={"ID":"11b8a666-98f1-4d91-aae4-259d185a772a","Type":"ContainerStarted","Data":"86b373dac0f139632c28efda07050bfe7500a38c8def84d4485f1441da326f82"} Mar 18 08:03:48 crc kubenswrapper[4917]: I0318 08:03:48.670721 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:03:52 crc kubenswrapper[4917]: I0318 08:03:52.697143 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrnfk" event={"ID":"11b8a666-98f1-4d91-aae4-259d185a772a","Type":"ContainerStarted","Data":"e34276deb7052a6a9aefd8d3761e48b90470e82ce8f47d58cf050bc4bc9de95a"} Mar 18 08:03:53 crc kubenswrapper[4917]: I0318 08:03:53.706937 4917 generic.go:334] "Generic (PLEG): container finished" podID="11b8a666-98f1-4d91-aae4-259d185a772a" containerID="e34276deb7052a6a9aefd8d3761e48b90470e82ce8f47d58cf050bc4bc9de95a" exitCode=0 Mar 18 08:03:53 crc kubenswrapper[4917]: I0318 08:03:53.707015 4917 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-lrnfk" event={"ID":"11b8a666-98f1-4d91-aae4-259d185a772a","Type":"ContainerDied","Data":"e34276deb7052a6a9aefd8d3761e48b90470e82ce8f47d58cf050bc4bc9de95a"} Mar 18 08:03:54 crc kubenswrapper[4917]: I0318 08:03:54.716552 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrnfk" event={"ID":"11b8a666-98f1-4d91-aae4-259d185a772a","Type":"ContainerStarted","Data":"6bb0b792ecbf5a08ddb7f9467d24d5ce4236d3bc6b04b6aab4546fd4d28d4401"} Mar 18 08:03:57 crc kubenswrapper[4917]: I0318 08:03:57.811672 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:57 crc kubenswrapper[4917]: I0318 08:03:57.812342 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:57 crc kubenswrapper[4917]: I0318 08:03:57.895785 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:03:57 crc kubenswrapper[4917]: I0318 08:03:57.923804 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lrnfk" podStartSLOduration=5.369409027 podStartE2EDuration="10.923785224s" podCreationTimestamp="2026-03-18 08:03:47 +0000 UTC" firstStartedPulling="2026-03-18 08:03:48.67042254 +0000 UTC m=+4613.611577254" lastFinishedPulling="2026-03-18 08:03:54.224798727 +0000 UTC m=+4619.165953451" observedRunningTime="2026-03-18 08:03:54.741492239 +0000 UTC m=+4619.682646943" watchObservedRunningTime="2026-03-18 08:03:57.923785224 +0000 UTC m=+4622.864939958" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.139671 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563684-gvwjl"] Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.140797 4917 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563684-gvwjl" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.142812 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.145161 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.146114 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.159640 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563684-gvwjl"] Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.214889 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr976\" (UniqueName: \"kubernetes.io/projected/4892baad-c970-4cbe-b214-aa0b925a071c-kube-api-access-rr976\") pod \"auto-csr-approver-29563684-gvwjl\" (UID: \"4892baad-c970-4cbe-b214-aa0b925a071c\") " pod="openshift-infra/auto-csr-approver-29563684-gvwjl" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.315992 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr976\" (UniqueName: \"kubernetes.io/projected/4892baad-c970-4cbe-b214-aa0b925a071c-kube-api-access-rr976\") pod \"auto-csr-approver-29563684-gvwjl\" (UID: \"4892baad-c970-4cbe-b214-aa0b925a071c\") " pod="openshift-infra/auto-csr-approver-29563684-gvwjl" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.357404 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr976\" (UniqueName: \"kubernetes.io/projected/4892baad-c970-4cbe-b214-aa0b925a071c-kube-api-access-rr976\") pod \"auto-csr-approver-29563684-gvwjl\" (UID: 
\"4892baad-c970-4cbe-b214-aa0b925a071c\") " pod="openshift-infra/auto-csr-approver-29563684-gvwjl" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.469192 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563684-gvwjl" Mar 18 08:04:00 crc kubenswrapper[4917]: I0318 08:04:00.990880 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563684-gvwjl"] Mar 18 08:04:01 crc kubenswrapper[4917]: I0318 08:04:01.770040 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563684-gvwjl" event={"ID":"4892baad-c970-4cbe-b214-aa0b925a071c","Type":"ContainerStarted","Data":"ed6008cc4613e8ae9143fb019e5a3eb28567200c3a10e1006f9d8a0b6e4859ca"} Mar 18 08:04:02 crc kubenswrapper[4917]: I0318 08:04:02.777179 4917 generic.go:334] "Generic (PLEG): container finished" podID="4892baad-c970-4cbe-b214-aa0b925a071c" containerID="92158f7a67ada285ce933d134bf88e16927900ad8018321e5223ab5e83cd0547" exitCode=0 Mar 18 08:04:02 crc kubenswrapper[4917]: I0318 08:04:02.777247 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563684-gvwjl" event={"ID":"4892baad-c970-4cbe-b214-aa0b925a071c","Type":"ContainerDied","Data":"92158f7a67ada285ce933d134bf88e16927900ad8018321e5223ab5e83cd0547"} Mar 18 08:04:02 crc kubenswrapper[4917]: I0318 08:04:02.929967 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:04:02 crc kubenswrapper[4917]: I0318 08:04:02.930063 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:04:04 crc kubenswrapper[4917]: I0318 08:04:04.061440 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563684-gvwjl" Mar 18 08:04:04 crc kubenswrapper[4917]: I0318 08:04:04.183874 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr976\" (UniqueName: \"kubernetes.io/projected/4892baad-c970-4cbe-b214-aa0b925a071c-kube-api-access-rr976\") pod \"4892baad-c970-4cbe-b214-aa0b925a071c\" (UID: \"4892baad-c970-4cbe-b214-aa0b925a071c\") " Mar 18 08:04:04 crc kubenswrapper[4917]: I0318 08:04:04.193149 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4892baad-c970-4cbe-b214-aa0b925a071c-kube-api-access-rr976" (OuterVolumeSpecName: "kube-api-access-rr976") pod "4892baad-c970-4cbe-b214-aa0b925a071c" (UID: "4892baad-c970-4cbe-b214-aa0b925a071c"). InnerVolumeSpecName "kube-api-access-rr976". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:04:04 crc kubenswrapper[4917]: I0318 08:04:04.285734 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr976\" (UniqueName: \"kubernetes.io/projected/4892baad-c970-4cbe-b214-aa0b925a071c-kube-api-access-rr976\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:04 crc kubenswrapper[4917]: I0318 08:04:04.797894 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563684-gvwjl" event={"ID":"4892baad-c970-4cbe-b214-aa0b925a071c","Type":"ContainerDied","Data":"ed6008cc4613e8ae9143fb019e5a3eb28567200c3a10e1006f9d8a0b6e4859ca"} Mar 18 08:04:04 crc kubenswrapper[4917]: I0318 08:04:04.797947 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed6008cc4613e8ae9143fb019e5a3eb28567200c3a10e1006f9d8a0b6e4859ca" Mar 18 08:04:04 crc kubenswrapper[4917]: I0318 08:04:04.797973 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563684-gvwjl" Mar 18 08:04:05 crc kubenswrapper[4917]: I0318 08:04:05.157873 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563678-n8lgk"] Mar 18 08:04:05 crc kubenswrapper[4917]: I0318 08:04:05.165472 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563678-n8lgk"] Mar 18 08:04:05 crc kubenswrapper[4917]: I0318 08:04:05.782183 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277b2e07-a273-498b-b9c6-05e0a1a2c297" path="/var/lib/kubelet/pods/277b2e07-a273-498b-b9c6-05e0a1a2c297/volumes" Mar 18 08:04:06 crc kubenswrapper[4917]: I0318 08:04:06.588343 4917 scope.go:117] "RemoveContainer" containerID="665f9315bf2bf2d4a05233390fd6e683f3ee5307d41ec139b23c07efda7d2697" Mar 18 08:04:07 crc kubenswrapper[4917]: I0318 08:04:07.886856 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-lrnfk" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.005893 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrnfk"] Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.155675 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lpwxl"] Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.155966 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lpwxl" podUID="51220ace-9206-4db4-86c5-752d63a97ae2" containerName="registry-server" containerID="cri-o://6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2" gracePeriod=2 Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.578992 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lpwxl" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.673071 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-utilities\") pod \"51220ace-9206-4db4-86c5-752d63a97ae2\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.673112 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-catalog-content\") pod \"51220ace-9206-4db4-86c5-752d63a97ae2\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.674186 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-utilities" (OuterVolumeSpecName: "utilities") pod "51220ace-9206-4db4-86c5-752d63a97ae2" (UID: 
"51220ace-9206-4db4-86c5-752d63a97ae2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.673495 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjswp\" (UniqueName: \"kubernetes.io/projected/51220ace-9206-4db4-86c5-752d63a97ae2-kube-api-access-vjswp\") pod \"51220ace-9206-4db4-86c5-752d63a97ae2\" (UID: \"51220ace-9206-4db4-86c5-752d63a97ae2\") " Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.678848 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.704575 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51220ace-9206-4db4-86c5-752d63a97ae2-kube-api-access-vjswp" (OuterVolumeSpecName: "kube-api-access-vjswp") pod "51220ace-9206-4db4-86c5-752d63a97ae2" (UID: "51220ace-9206-4db4-86c5-752d63a97ae2"). InnerVolumeSpecName "kube-api-access-vjswp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.726456 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51220ace-9206-4db4-86c5-752d63a97ae2" (UID: "51220ace-9206-4db4-86c5-752d63a97ae2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.782642 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51220ace-9206-4db4-86c5-752d63a97ae2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.782669 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjswp\" (UniqueName: \"kubernetes.io/projected/51220ace-9206-4db4-86c5-752d63a97ae2-kube-api-access-vjswp\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.839104 4917 generic.go:334] "Generic (PLEG): container finished" podID="51220ace-9206-4db4-86c5-752d63a97ae2" containerID="6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2" exitCode=0 Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.839197 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lpwxl" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.839190 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lpwxl" event={"ID":"51220ace-9206-4db4-86c5-752d63a97ae2","Type":"ContainerDied","Data":"6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2"} Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.839279 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lpwxl" event={"ID":"51220ace-9206-4db4-86c5-752d63a97ae2","Type":"ContainerDied","Data":"126546a9bae03fb26b37b4011caef93b0e96e76e0c8d4a3024b9f582c035f5cc"} Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.839311 4917 scope.go:117] "RemoveContainer" containerID="6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.858220 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-lpwxl"] Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.858733 4917 scope.go:117] "RemoveContainer" containerID="c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.873794 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lpwxl"] Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.880609 4917 scope.go:117] "RemoveContainer" containerID="ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.906049 4917 scope.go:117] "RemoveContainer" containerID="6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2" Mar 18 08:04:09 crc kubenswrapper[4917]: E0318 08:04:09.906614 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2\": container with ID starting with 6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2 not found: ID does not exist" containerID="6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.906664 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2"} err="failed to get container status \"6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2\": rpc error: code = NotFound desc = could not find container \"6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2\": container with ID starting with 6637daf13c65e8971c44a54ea2adb662c5573e8c973e91139227eae0f372fab2 not found: ID does not exist" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.906700 4917 scope.go:117] "RemoveContainer" 
containerID="c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793" Mar 18 08:04:09 crc kubenswrapper[4917]: E0318 08:04:09.907205 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793\": container with ID starting with c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793 not found: ID does not exist" containerID="c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.907241 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793"} err="failed to get container status \"c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793\": rpc error: code = NotFound desc = could not find container \"c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793\": container with ID starting with c6c46fea8e8fd4052df5e106ef0f2b00060a0bd43aca69616acc60b80fec3793 not found: ID does not exist" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.907264 4917 scope.go:117] "RemoveContainer" containerID="ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90" Mar 18 08:04:09 crc kubenswrapper[4917]: E0318 08:04:09.907627 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90\": container with ID starting with ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90 not found: ID does not exist" containerID="ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90" Mar 18 08:04:09 crc kubenswrapper[4917]: I0318 08:04:09.907663 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90"} err="failed to get container status \"ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90\": rpc error: code = NotFound desc = could not find container \"ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90\": container with ID starting with ee0b3a0f0de3f8a0c127a7bcb2733a9f0d381d268083c3e833daf74add5b8c90 not found: ID does not exist" Mar 18 08:04:11 crc kubenswrapper[4917]: I0318 08:04:11.782182 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51220ace-9206-4db4-86c5-752d63a97ae2" path="/var/lib/kubelet/pods/51220ace-9206-4db4-86c5-752d63a97ae2/volumes" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.843964 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77fdfd445f-tbst4"] Mar 18 08:04:12 crc kubenswrapper[4917]: E0318 08:04:12.845196 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51220ace-9206-4db4-86c5-752d63a97ae2" containerName="extract-utilities" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.845272 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="51220ace-9206-4db4-86c5-752d63a97ae2" containerName="extract-utilities" Mar 18 08:04:12 crc kubenswrapper[4917]: E0318 08:04:12.845364 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51220ace-9206-4db4-86c5-752d63a97ae2" containerName="extract-content" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.845432 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="51220ace-9206-4db4-86c5-752d63a97ae2" containerName="extract-content" Mar 18 08:04:12 crc kubenswrapper[4917]: E0318 08:04:12.845492 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51220ace-9206-4db4-86c5-752d63a97ae2" containerName="registry-server" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.845578 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="51220ace-9206-4db4-86c5-752d63a97ae2" containerName="registry-server" Mar 18 08:04:12 crc kubenswrapper[4917]: E0318 08:04:12.845672 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4892baad-c970-4cbe-b214-aa0b925a071c" containerName="oc" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.845724 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4892baad-c970-4cbe-b214-aa0b925a071c" containerName="oc" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.845920 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="51220ace-9206-4db4-86c5-752d63a97ae2" containerName="registry-server" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.845987 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4892baad-c970-4cbe-b214-aa0b925a071c" containerName="oc" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.846840 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.848751 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.848915 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.849042 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.848747 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cdfcf6547-nwwbg"] Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.849239 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.850179 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.851131 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-fdbwr" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.876231 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cdfcf6547-nwwbg"] Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.897928 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77fdfd445f-tbst4"] Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.925872 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jssw5\" (UniqueName: \"kubernetes.io/projected/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-kube-api-access-jssw5\") pod \"dnsmasq-dns-cdfcf6547-nwwbg\" (UID: \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\") " pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.926118 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmvrq\" (UniqueName: \"kubernetes.io/projected/9471592b-6a9e-4b3b-817c-3dbaee4dde51-kube-api-access-tmvrq\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.926225 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-config\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.926311 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-config\") pod \"dnsmasq-dns-cdfcf6547-nwwbg\" (UID: \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\") " pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:12 crc kubenswrapper[4917]: I0318 08:04:12.926435 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-dns-svc\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.028123 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jssw5\" (UniqueName: \"kubernetes.io/projected/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-kube-api-access-jssw5\") pod \"dnsmasq-dns-cdfcf6547-nwwbg\" (UID: \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\") " pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.028434 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmvrq\" (UniqueName: \"kubernetes.io/projected/9471592b-6a9e-4b3b-817c-3dbaee4dde51-kube-api-access-tmvrq\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.028866 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-config\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.029686 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-config\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.029825 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-config\") pod \"dnsmasq-dns-cdfcf6547-nwwbg\" (UID: \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\") " pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.030114 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-dns-svc\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.030746 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-dns-svc\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.030829 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-config\") pod \"dnsmasq-dns-cdfcf6547-nwwbg\" (UID: \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\") " pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.050049 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmvrq\" (UniqueName: \"kubernetes.io/projected/9471592b-6a9e-4b3b-817c-3dbaee4dde51-kube-api-access-tmvrq\") pod \"dnsmasq-dns-77fdfd445f-tbst4\" (UID: 
\"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.051459 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jssw5\" (UniqueName: \"kubernetes.io/projected/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-kube-api-access-jssw5\") pod \"dnsmasq-dns-cdfcf6547-nwwbg\" (UID: \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\") " pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.163991 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.171027 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.436207 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77fdfd445f-tbst4"] Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.452079 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-647889bd4c-wzxzw"] Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.453661 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.460546 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647889bd4c-wzxzw"] Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.538726 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkl44\" (UniqueName: \"kubernetes.io/projected/7a0c15c5-7587-4953-b381-b37fa4fbee25-kube-api-access-kkl44\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.538791 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-dns-svc\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.538880 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-config\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.640232 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkl44\" (UniqueName: \"kubernetes.io/projected/7a0c15c5-7587-4953-b381-b37fa4fbee25-kube-api-access-kkl44\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.640278 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-dns-svc\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.640325 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-config\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.641113 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-config\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.641409 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-dns-svc\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.669616 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77fdfd445f-tbst4"] Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.677541 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkl44\" (UniqueName: \"kubernetes.io/projected/7a0c15c5-7587-4953-b381-b37fa4fbee25-kube-api-access-kkl44\") pod \"dnsmasq-dns-647889bd4c-wzxzw\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.710834 4917 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-cdfcf6547-nwwbg"] Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.746108 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dccc9c6df-872gm"] Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.747397 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.752211 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cdfcf6547-nwwbg"] Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.762030 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dccc9c6df-872gm"] Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.777313 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.847401 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-dns-svc\") pod \"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.847512 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-config\") pod \"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.847557 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlwlp\" (UniqueName: \"kubernetes.io/projected/632534e3-b259-4851-8d0c-13b538a945f8-kube-api-access-zlwlp\") pod 
\"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.889421 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" event={"ID":"f78dc3bc-6142-423a-a2f2-4af81d88ef9b","Type":"ContainerStarted","Data":"9cd22930a81854e210663a2a7f603c0cfa5721e13d4990d843f6460f14765161"} Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.894653 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" event={"ID":"9471592b-6a9e-4b3b-817c-3dbaee4dde51","Type":"ContainerStarted","Data":"33b4d3afeb46774790c5c41440bcaa88315b05a06829426b8b4b7d785f2ab0ba"} Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.962281 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-dns-svc\") pod \"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.962581 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-config\") pod \"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.962625 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlwlp\" (UniqueName: \"kubernetes.io/projected/632534e3-b259-4851-8d0c-13b538a945f8-kube-api-access-zlwlp\") pod \"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.963283 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-dns-svc\") pod \"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.963922 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-config\") pod \"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:13 crc kubenswrapper[4917]: I0318 08:04:13.980505 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlwlp\" (UniqueName: \"kubernetes.io/projected/632534e3-b259-4851-8d0c-13b538a945f8-kube-api-access-zlwlp\") pod \"dnsmasq-dns-5dccc9c6df-872gm\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.088985 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.210374 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-647889bd4c-wzxzw"] Mar 18 08:04:14 crc kubenswrapper[4917]: W0318 08:04:14.221554 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a0c15c5_7587_4953_b381_b37fa4fbee25.slice/crio-6d5227182f50c70ad14b0a563ac7234ffca5d9bf1c688fbef3a8cd2cf5bf0260 WatchSource:0}: Error finding container 6d5227182f50c70ad14b0a563ac7234ffca5d9bf1c688fbef3a8cd2cf5bf0260: Status 404 returned error can't find the container with id 6d5227182f50c70ad14b0a563ac7234ffca5d9bf1c688fbef3a8cd2cf5bf0260 Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.518832 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dccc9c6df-872gm"] Mar 18 08:04:14 crc kubenswrapper[4917]: W0318 08:04:14.528483 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod632534e3_b259_4851_8d0c_13b538a945f8.slice/crio-7aaf5d091e6399137ac598c92057c0dca8502066e91cfe342d8845e609ec65c5 WatchSource:0}: Error finding container 7aaf5d091e6399137ac598c92057c0dca8502066e91cfe342d8845e609ec65c5: Status 404 returned error can't find the container with id 7aaf5d091e6399137ac598c92057c0dca8502066e91cfe342d8845e609ec65c5 Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.598693 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.604403 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.607708 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.607938 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.607957 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.607963 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.608102 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.608135 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fxjm7" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.608302 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.612793 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.695459 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.695522 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.695550 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.695587 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.695638 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.695835 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.695899 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/13dca274-b8d1-439f-b3cc-a073f12bdc37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.696273 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.696352 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-config-data\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.696405 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13dca274-b8d1-439f-b3cc-a073f12bdc37-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.696505 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7j9q\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-kube-api-access-k7j9q\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798078 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/13dca274-b8d1-439f-b3cc-a073f12bdc37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798141 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798213 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-config-data\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798267 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13dca274-b8d1-439f-b3cc-a073f12bdc37-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798305 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7j9q\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-kube-api-access-k7j9q\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798371 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod 
\"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798414 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798435 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798486 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798511 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.798556 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.799450 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.800105 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.800306 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.801062 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-server-conf\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.803387 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-config-data\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.804833 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13dca274-b8d1-439f-b3cc-a073f12bdc37-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.804947 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.805642 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13dca274-b8d1-439f-b3cc-a073f12bdc37-pod-info\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.807140 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.810420 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.810448 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/28cec60ac3c0a43ee12064d068e25a8071e127eaeb21d18b0e80bb3c74ab3653/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.816404 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7j9q\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-kube-api-access-k7j9q\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.844837 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod \"rabbitmq-server-0\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " pod="openstack/rabbitmq-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.893749 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.894850 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.897759 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.898149 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.898210 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.898367 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.898543 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.900924 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.903630 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kmn7r" Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.907127 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" event={"ID":"632534e3-b259-4851-8d0c-13b538a945f8","Type":"ContainerStarted","Data":"7aaf5d091e6399137ac598c92057c0dca8502066e91cfe342d8845e609ec65c5"} Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.911170 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" event={"ID":"7a0c15c5-7587-4953-b381-b37fa4fbee25","Type":"ContainerStarted","Data":"6d5227182f50c70ad14b0a563ac7234ffca5d9bf1c688fbef3a8cd2cf5bf0260"} Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.919837 4917 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:04:14 crc kubenswrapper[4917]: I0318 08:04:14.931970 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.003747 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d758aedb-3012-4f5c-badd-725b4a4b8a42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.003937 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.003999 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.004030 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.004046 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-kube-api-access-8x885\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.004079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d758aedb-3012-4f5c-badd-725b4a4b8a42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.004131 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.004241 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.004270 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.004291 
4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.004313 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108267 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108321 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108341 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108360 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108393 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d758aedb-3012-4f5c-badd-725b4a4b8a42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108441 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108467 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108486 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108500 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x885\" 
(UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-kube-api-access-8x885\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108520 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d758aedb-3012-4f5c-badd-725b4a4b8a42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108546 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.108995 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.110069 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.110210 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-server-conf\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.110336 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.111404 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.113467 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.113565 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.117049 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d758aedb-3012-4f5c-badd-725b4a4b8a42-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc 
kubenswrapper[4917]: I0318 08:04:15.135969 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d758aedb-3012-4f5c-badd-725b4a4b8a42-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.136861 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-kube-api-access-8x885\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.169289 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.169334 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/51e6c584895ab848edd398d70774c4e05a3109af4de762115a2f273ac296e4ec/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.371229 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 
08:04:15.478983 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.516530 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.714887 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.723363 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.726448 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.728251 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.729997 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.730270 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-z2c2l" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.735273 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.737142 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.818716 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0524b90-39ac-4532-820a-23f804a96420-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " 
pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.818777 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-kolla-config\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.818899 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.818928 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0524b90-39ac-4532-820a-23f804a96420-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.819012 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8974s\" (UniqueName: \"kubernetes.io/projected/b0524b90-39ac-4532-820a-23f804a96420-kube-api-access-8974s\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.819055 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-98a6621c-a22e-4a3d-ad53-2d85800f7ea3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98a6621c-a22e-4a3d-ad53-2d85800f7ea3\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") 
" pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.819086 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-config-data-default\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.819116 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0524b90-39ac-4532-820a-23f804a96420-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.922093 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8974s\" (UniqueName: \"kubernetes.io/projected/b0524b90-39ac-4532-820a-23f804a96420-kube-api-access-8974s\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.922197 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-98a6621c-a22e-4a3d-ad53-2d85800f7ea3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98a6621c-a22e-4a3d-ad53-2d85800f7ea3\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.922546 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-config-data-default\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" 
Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.922689 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0524b90-39ac-4532-820a-23f804a96420-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.922779 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0524b90-39ac-4532-820a-23f804a96420-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.922857 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-kolla-config\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.923022 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.923075 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0524b90-39ac-4532-820a-23f804a96420-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.923546 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b0524b90-39ac-4532-820a-23f804a96420-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.924147 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-kolla-config\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.927119 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-config-data-default\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.929310 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b0524b90-39ac-4532-820a-23f804a96420-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.932016 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.932330 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-98a6621c-a22e-4a3d-ad53-2d85800f7ea3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98a6621c-a22e-4a3d-ad53-2d85800f7ea3\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/95f02d86839e41df8a148856aaea6ca95a4d1cc74a9b7a6f006e4d32081ca152/globalmount\"" pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.944918 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0524b90-39ac-4532-820a-23f804a96420-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.946167 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0524b90-39ac-4532-820a-23f804a96420-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.950329 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"13dca274-b8d1-439f-b3cc-a073f12bdc37","Type":"ContainerStarted","Data":"1a2fc1d8c51eb649f0b6c71b0fac9504ab7f2ed2db33f9a31a100afc66861d52"} Mar 18 08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.959250 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8974s\" (UniqueName: \"kubernetes.io/projected/b0524b90-39ac-4532-820a-23f804a96420-kube-api-access-8974s\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 
08:04:15 crc kubenswrapper[4917]: I0318 08:04:15.988724 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-98a6621c-a22e-4a3d-ad53-2d85800f7ea3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98a6621c-a22e-4a3d-ad53-2d85800f7ea3\") pod \"openstack-galera-0\" (UID: \"b0524b90-39ac-4532-820a-23f804a96420\") " pod="openstack/openstack-galera-0" Mar 18 08:04:16 crc kubenswrapper[4917]: I0318 08:04:16.013510 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:04:16 crc kubenswrapper[4917]: I0318 08:04:16.063652 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 18 08:04:16 crc kubenswrapper[4917]: I0318 08:04:16.632329 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 18 08:04:16 crc kubenswrapper[4917]: I0318 08:04:16.960865 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b0524b90-39ac-4532-820a-23f804a96420","Type":"ContainerStarted","Data":"1765ab76efab24c89a8f9626149388bdd8685629d919f12180eb7e6048f0bbb5"} Mar 18 08:04:16 crc kubenswrapper[4917]: I0318 08:04:16.964198 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d758aedb-3012-4f5c-badd-725b4a4b8a42","Type":"ContainerStarted","Data":"619daaf98b4bc47a0d75901e694bc34e5c6f99a15ddacc7ad24c57efd7d77e5b"} Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.207835 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.209192 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.213871 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.213987 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.214849 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-vdksl" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.217256 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.233984 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.252750 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.252805 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff386b88-0ec0-4ff3-8579-84af54562ab6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.252838 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.252917 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29063dc1-8348-46cc-883b-ee5aa3613f89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29063dc1-8348-46cc-883b-ee5aa3613f89\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.252954 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ff386b88-0ec0-4ff3-8579-84af54562ab6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.253014 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.253079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff386b88-0ec0-4ff3-8579-84af54562ab6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.253098 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcdb6\" (UniqueName: \"kubernetes.io/projected/ff386b88-0ec0-4ff3-8579-84af54562ab6-kube-api-access-wcdb6\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.354337 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff386b88-0ec0-4ff3-8579-84af54562ab6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.354385 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcdb6\" (UniqueName: \"kubernetes.io/projected/ff386b88-0ec0-4ff3-8579-84af54562ab6-kube-api-access-wcdb6\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.354436 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.354492 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff386b88-0ec0-4ff3-8579-84af54562ab6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.354516 4917 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.354550 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-29063dc1-8348-46cc-883b-ee5aa3613f89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29063dc1-8348-46cc-883b-ee5aa3613f89\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.354584 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ff386b88-0ec0-4ff3-8579-84af54562ab6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.354619 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.355966 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ff386b88-0ec0-4ff3-8579-84af54562ab6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.356258 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.356680 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.357872 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.357903 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29063dc1-8348-46cc-883b-ee5aa3613f89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29063dc1-8348-46cc-883b-ee5aa3613f89\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8a08093b71fbe37b7b39c7ae1201f951c3734e3af4d5bf3865a91a41de44c087/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.359113 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ff386b88-0ec0-4ff3-8579-84af54562ab6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.360528 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ff386b88-0ec0-4ff3-8579-84af54562ab6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.366932 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff386b88-0ec0-4ff3-8579-84af54562ab6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.373030 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcdb6\" (UniqueName: \"kubernetes.io/projected/ff386b88-0ec0-4ff3-8579-84af54562ab6-kube-api-access-wcdb6\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.412030 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-29063dc1-8348-46cc-883b-ee5aa3613f89\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29063dc1-8348-46cc-883b-ee5aa3613f89\") pod \"openstack-cell1-galera-0\" (UID: \"ff386b88-0ec0-4ff3-8579-84af54562ab6\") " pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.509828 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.510866 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.512908 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.513088 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-r5qsk" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.514765 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.520863 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.536285 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.660689 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t6km\" (UniqueName: \"kubernetes.io/projected/694c424e-6894-48b0-9724-22d72b167a8c-kube-api-access-6t6km\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.661040 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/694c424e-6894-48b0-9724-22d72b167a8c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.661062 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c424e-6894-48b0-9724-22d72b167a8c-config-data\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") 
" pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.661174 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694c424e-6894-48b0-9724-22d72b167a8c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.661191 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/694c424e-6894-48b0-9724-22d72b167a8c-kolla-config\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.762816 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694c424e-6894-48b0-9724-22d72b167a8c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.763602 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/694c424e-6894-48b0-9724-22d72b167a8c-kolla-config\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.763678 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t6km\" (UniqueName: \"kubernetes.io/projected/694c424e-6894-48b0-9724-22d72b167a8c-kube-api-access-6t6km\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.763730 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/694c424e-6894-48b0-9724-22d72b167a8c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.763751 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c424e-6894-48b0-9724-22d72b167a8c-config-data\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.765025 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c424e-6894-48b0-9724-22d72b167a8c-config-data\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.765487 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/694c424e-6894-48b0-9724-22d72b167a8c-kolla-config\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.766762 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/694c424e-6894-48b0-9724-22d72b167a8c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.768655 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/694c424e-6894-48b0-9724-22d72b167a8c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc 
kubenswrapper[4917]: I0318 08:04:17.780619 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t6km\" (UniqueName: \"kubernetes.io/projected/694c424e-6894-48b0-9724-22d72b167a8c-kube-api-access-6t6km\") pod \"memcached-0\" (UID: \"694c424e-6894-48b0-9724-22d72b167a8c\") " pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.954901 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 18 08:04:17 crc kubenswrapper[4917]: I0318 08:04:17.983151 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 18 08:04:17 crc kubenswrapper[4917]: W0318 08:04:17.986828 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff386b88_0ec0_4ff3_8579_84af54562ab6.slice/crio-9ebef5f705060f21e869ef5b454b8f7fcee6a80d5b03280b8edfddf55a27c25e WatchSource:0}: Error finding container 9ebef5f705060f21e869ef5b454b8f7fcee6a80d5b03280b8edfddf55a27c25e: Status 404 returned error can't find the container with id 9ebef5f705060f21e869ef5b454b8f7fcee6a80d5b03280b8edfddf55a27c25e Mar 18 08:04:18 crc kubenswrapper[4917]: I0318 08:04:18.385011 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 18 08:04:18 crc kubenswrapper[4917]: I0318 08:04:18.985952 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ff386b88-0ec0-4ff3-8579-84af54562ab6","Type":"ContainerStarted","Data":"9ebef5f705060f21e869ef5b454b8f7fcee6a80d5b03280b8edfddf55a27c25e"} Mar 18 08:04:18 crc kubenswrapper[4917]: I0318 08:04:18.989397 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"694c424e-6894-48b0-9724-22d72b167a8c","Type":"ContainerStarted","Data":"dca7f1e494555bbd8937e9abf688df205697cd2a0257ed640b0ec73b892d4df1"} Mar 18 08:04:32 crc 
kubenswrapper[4917]: I0318 08:04:32.929270 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:04:32 crc kubenswrapper[4917]: I0318 08:04:32.930009 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:04:32 crc kubenswrapper[4917]: I0318 08:04:32.930091 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:04:32 crc kubenswrapper[4917]: I0318 08:04:32.931075 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:04:32 crc kubenswrapper[4917]: I0318 08:04:32.931168 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" gracePeriod=600 Mar 18 08:04:33 crc kubenswrapper[4917]: I0318 08:04:33.109011 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" 
containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" exitCode=0 Mar 18 08:04:33 crc kubenswrapper[4917]: I0318 08:04:33.109057 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b"} Mar 18 08:04:33 crc kubenswrapper[4917]: I0318 08:04:33.109089 4917 scope.go:117] "RemoveContainer" containerID="c5ab49504cd4a3cbf79368c9f693de03d31b8e9c6679f12fadae736ae2434a51" Mar 18 08:04:34 crc kubenswrapper[4917]: E0318 08:04:34.039267 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:04:34 crc kubenswrapper[4917]: I0318 08:04:34.115991 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:04:34 crc kubenswrapper[4917]: E0318 08:04:34.116281 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:04:44 crc kubenswrapper[4917]: E0318 08:04:44.637900 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:44 crc kubenswrapper[4917]: E0318 08:04:44.638463 4917 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:44 crc kubenswrapper[4917]: E0318 08:04:44.638634 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:059169826d1e668c44c01b5bb9959b22,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wcdb6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Readi
nessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-cell1-galera-0_openstack(ff386b88-0ec0-4ff3-8579-84af54562ab6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 08:04:44 crc kubenswrapper[4917]: E0318 08:04:44.639906 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="ff386b88-0ec0-4ff3-8579-84af54562ab6" Mar 18 08:04:44 crc kubenswrapper[4917]: E0318 08:04:44.749617 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:44 crc kubenswrapper[4917]: E0318 08:04:44.749709 4917 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:44 crc kubenswrapper[4917]: E0318 08:04:44.749905 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:mysql-bootstrap,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:059169826d1e668c44c01b5bb9959b22,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8974s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil
,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(b0524b90-39ac-4532-820a-23f804a96420): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 08:04:44 crc kubenswrapper[4917]: E0318 08:04:44.751147 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="b0524b90-39ac-4532-820a-23f804a96420" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.205432 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:059169826d1e668c44c01b5bb9959b22\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="ff386b88-0ec0-4ff3-8579-84af54562ab6" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.206260 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-mariadb:059169826d1e668c44c01b5bb9959b22\\\"\"" pod="openstack/openstack-galera-0" podUID="b0524b90-39ac-4532-820a-23f804a96420" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.378688 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.378908 4917 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.379049 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n647h57bh695h68dh54fhf5hc5h67h5d4hb6h696h685h54ch6h599h5c5h679h74h689h644h5c8h64ch555h5c6h5dh569h698h59fh66ch57bh5b9hb7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jssw5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,
},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-cdfcf6547-nwwbg_openstack(f78dc3bc-6142-423a-a2f2-4af81d88ef9b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.380225 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" podUID="f78dc3bc-6142-423a-a2f2-4af81d88ef9b" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.395663 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.395734 4917 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.395885 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kkl44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-647889bd4c-wzxzw_openstack(7a0c15c5-7587-4953-b381-b37fa4fbee25): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.397024 4917 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.418109 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.418493 4917 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.418721 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h564h676h699hcdh67bh66hfdh569h545h648h94h546h696h668h89h96h667h575h595h5d9h584h8dhbdh697h54bhb7h58fh5c9hd8h5cdh5c7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zlwlp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5dccc9c6df-872gm_openstack(632534e3-b259-4851-8d0c-13b538a945f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.420005 4917 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" podUID="632534e3-b259-4851-8d0c-13b538a945f8" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.426896 4917 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.426978 4917 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.427204 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h54dhb7h666h69h76h59ch55ch65ch596h8h79h5c8h57hc8hfch5d7h697h79h698h5fch644hf9h54chbfh655hfchcbh5f8h646h5f7h89q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tmvrq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77fdfd445f-tbst4_openstack(9471592b-6a9e-4b3b-817c-3dbaee4dde51): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.428823 4917 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" podUID="9471592b-6a9e-4b3b-817c-3dbaee4dde51" Mar 18 08:04:45 crc kubenswrapper[4917]: I0318 08:04:45.781945 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:04:45 crc kubenswrapper[4917]: E0318 08:04:45.782501 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:04:46 crc kubenswrapper[4917]: I0318 08:04:46.603158 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"694c424e-6894-48b0-9724-22d72b167a8c","Type":"ContainerStarted","Data":"f8fa932e527ec3d1e90de00c8fcd2ceaf40e33f443712fc987a03da82dc4d02e"} Mar 18 08:04:46 crc kubenswrapper[4917]: I0318 08:04:46.604240 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 18 08:04:46 crc kubenswrapper[4917]: E0318 08:04:46.604491 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22\\\"\"" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" Mar 18 08:04:46 crc kubenswrapper[4917]: E0318 08:04:46.606840 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:059169826d1e668c44c01b5bb9959b22\\\"\"" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" podUID="632534e3-b259-4851-8d0c-13b538a945f8" Mar 18 08:04:46 crc kubenswrapper[4917]: I0318 08:04:46.659850 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.704255441 podStartE2EDuration="29.65982949s" podCreationTimestamp="2026-03-18 08:04:17 +0000 UTC" firstStartedPulling="2026-03-18 08:04:18.397768501 +0000 UTC m=+4643.338923215" lastFinishedPulling="2026-03-18 08:04:45.35334255 +0000 UTC m=+4670.294497264" observedRunningTime="2026-03-18 08:04:46.651481589 +0000 UTC m=+4671.592636333" watchObservedRunningTime="2026-03-18 08:04:46.65982949 +0000 UTC m=+4671.600984224" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.319350 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.383207 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.494595 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-config\") pod \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\" (UID: \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\") " Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.494677 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmvrq\" (UniqueName: \"kubernetes.io/projected/9471592b-6a9e-4b3b-817c-3dbaee4dde51-kube-api-access-tmvrq\") pod \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.494805 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jssw5\" (UniqueName: \"kubernetes.io/projected/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-kube-api-access-jssw5\") pod \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\" (UID: \"f78dc3bc-6142-423a-a2f2-4af81d88ef9b\") " Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.494900 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-dns-svc\") pod \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.494924 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-config\") pod \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\" (UID: \"9471592b-6a9e-4b3b-817c-3dbaee4dde51\") " Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.495547 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-config" (OuterVolumeSpecName: "config") pod "f78dc3bc-6142-423a-a2f2-4af81d88ef9b" (UID: "f78dc3bc-6142-423a-a2f2-4af81d88ef9b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.495679 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9471592b-6a9e-4b3b-817c-3dbaee4dde51" (UID: "9471592b-6a9e-4b3b-817c-3dbaee4dde51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.495722 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-config" (OuterVolumeSpecName: "config") pod "9471592b-6a9e-4b3b-817c-3dbaee4dde51" (UID: "9471592b-6a9e-4b3b-817c-3dbaee4dde51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.502509 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9471592b-6a9e-4b3b-817c-3dbaee4dde51-kube-api-access-tmvrq" (OuterVolumeSpecName: "kube-api-access-tmvrq") pod "9471592b-6a9e-4b3b-817c-3dbaee4dde51" (UID: "9471592b-6a9e-4b3b-817c-3dbaee4dde51"). InnerVolumeSpecName "kube-api-access-tmvrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.503557 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-kube-api-access-jssw5" (OuterVolumeSpecName: "kube-api-access-jssw5") pod "f78dc3bc-6142-423a-a2f2-4af81d88ef9b" (UID: "f78dc3bc-6142-423a-a2f2-4af81d88ef9b"). InnerVolumeSpecName "kube-api-access-jssw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.596908 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jssw5\" (UniqueName: \"kubernetes.io/projected/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-kube-api-access-jssw5\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.596970 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.596992 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9471592b-6a9e-4b3b-817c-3dbaee4dde51-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.597014 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f78dc3bc-6142-423a-a2f2-4af81d88ef9b-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.597032 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmvrq\" (UniqueName: \"kubernetes.io/projected/9471592b-6a9e-4b3b-817c-3dbaee4dde51-kube-api-access-tmvrq\") on node \"crc\" DevicePath \"\"" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.614342 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"13dca274-b8d1-439f-b3cc-a073f12bdc37","Type":"ContainerStarted","Data":"d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a"} Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.617350 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.617368 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77fdfd445f-tbst4" event={"ID":"9471592b-6a9e-4b3b-817c-3dbaee4dde51","Type":"ContainerDied","Data":"33b4d3afeb46774790c5c41440bcaa88315b05a06829426b8b4b7d785f2ab0ba"} Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.621086 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d758aedb-3012-4f5c-badd-725b4a4b8a42","Type":"ContainerStarted","Data":"dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c"} Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.625003 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.625008 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cdfcf6547-nwwbg" event={"ID":"f78dc3bc-6142-423a-a2f2-4af81d88ef9b","Type":"ContainerDied","Data":"9cd22930a81854e210663a2a7f603c0cfa5721e13d4990d843f6460f14765161"} Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.769946 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cdfcf6547-nwwbg"] Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.781070 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cdfcf6547-nwwbg"] Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.798626 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77fdfd445f-tbst4"] Mar 18 08:04:47 crc kubenswrapper[4917]: I0318 08:04:47.805324 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77fdfd445f-tbst4"] Mar 18 08:04:49 crc kubenswrapper[4917]: I0318 08:04:49.788909 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9471592b-6a9e-4b3b-817c-3dbaee4dde51" path="/var/lib/kubelet/pods/9471592b-6a9e-4b3b-817c-3dbaee4dde51/volumes" Mar 18 08:04:49 crc kubenswrapper[4917]: I0318 08:04:49.789957 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78dc3bc-6142-423a-a2f2-4af81d88ef9b" path="/var/lib/kubelet/pods/f78dc3bc-6142-423a-a2f2-4af81d88ef9b/volumes" Mar 18 08:04:52 crc kubenswrapper[4917]: I0318 08:04:52.957130 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 18 08:04:56 crc kubenswrapper[4917]: I0318 08:04:56.701397 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ff386b88-0ec0-4ff3-8579-84af54562ab6","Type":"ContainerStarted","Data":"82e8291003a98df27069c81914075d9161e4dc9d1e83921ffae3e9a9ab5dfce3"} Mar 18 08:04:58 crc kubenswrapper[4917]: I0318 08:04:58.773143 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:04:58 crc kubenswrapper[4917]: E0318 08:04:58.773900 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:04:59 crc kubenswrapper[4917]: I0318 08:04:59.728831 4917 generic.go:334] "Generic (PLEG): container finished" podID="ff386b88-0ec0-4ff3-8579-84af54562ab6" containerID="82e8291003a98df27069c81914075d9161e4dc9d1e83921ffae3e9a9ab5dfce3" exitCode=0 Mar 18 08:04:59 crc kubenswrapper[4917]: I0318 08:04:59.728900 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"ff386b88-0ec0-4ff3-8579-84af54562ab6","Type":"ContainerDied","Data":"82e8291003a98df27069c81914075d9161e4dc9d1e83921ffae3e9a9ab5dfce3"} Mar 18 08:05:00 crc kubenswrapper[4917]: I0318 08:05:00.738450 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"ff386b88-0ec0-4ff3-8579-84af54562ab6","Type":"ContainerStarted","Data":"a41076b50d34f30638ed22a9f214833f99328dd90480f9a07c0c18be02e87eff"} Mar 18 08:05:00 crc kubenswrapper[4917]: I0318 08:05:00.740394 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b0524b90-39ac-4532-820a-23f804a96420","Type":"ContainerStarted","Data":"4af02212d8c2b388ffe4cf592e637365554f7b69fa301249f111505e740d5451"} Mar 18 08:05:00 crc kubenswrapper[4917]: I0318 08:05:00.768396 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=6.760092137 podStartE2EDuration="44.768380407s" podCreationTimestamp="2026-03-18 08:04:16 +0000 UTC" firstStartedPulling="2026-03-18 08:04:17.992104596 +0000 UTC m=+4642.933259310" lastFinishedPulling="2026-03-18 08:04:56.000392796 +0000 UTC m=+4680.941547580" observedRunningTime="2026-03-18 08:05:00.764671167 +0000 UTC m=+4685.705825891" watchObservedRunningTime="2026-03-18 08:05:00.768380407 +0000 UTC m=+4685.709535131" Mar 18 08:05:01 crc kubenswrapper[4917]: E0318 08:05:01.799302 4917 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.184:36946->38.102.83.184:33891: write tcp 38.102.83.184:36946->38.102.83.184:33891: write: broken pipe Mar 18 08:05:02 crc kubenswrapper[4917]: I0318 08:05:02.768090 4917 generic.go:334] "Generic (PLEG): container finished" podID="632534e3-b259-4851-8d0c-13b538a945f8" containerID="bd7ff41208e62b87739117b39a29a47ef59ec14c5e923a113a868a5ca5782a1e" exitCode=0 Mar 18 08:05:02 crc kubenswrapper[4917]: I0318 08:05:02.768552 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" event={"ID":"632534e3-b259-4851-8d0c-13b538a945f8","Type":"ContainerDied","Data":"bd7ff41208e62b87739117b39a29a47ef59ec14c5e923a113a868a5ca5782a1e"} Mar 18 08:05:02 crc kubenswrapper[4917]: I0318 08:05:02.774762 4917 generic.go:334] "Generic (PLEG): container finished" podID="7a0c15c5-7587-4953-b381-b37fa4fbee25" containerID="022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7" exitCode=0 Mar 18 08:05:02 crc kubenswrapper[4917]: I0318 08:05:02.777013 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" event={"ID":"7a0c15c5-7587-4953-b381-b37fa4fbee25","Type":"ContainerDied","Data":"022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7"} Mar 18 08:05:03 crc kubenswrapper[4917]: I0318 08:05:03.791411 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" event={"ID":"7a0c15c5-7587-4953-b381-b37fa4fbee25","Type":"ContainerStarted","Data":"f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe"} Mar 18 08:05:03 crc kubenswrapper[4917]: I0318 08:05:03.791710 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:05:03 crc kubenswrapper[4917]: I0318 08:05:03.795738 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" event={"ID":"632534e3-b259-4851-8d0c-13b538a945f8","Type":"ContainerStarted","Data":"77958d7c96a2eccf500d43f2b02194fcca45347280aee7120a60bf439d67bf42"} Mar 18 08:05:03 crc kubenswrapper[4917]: I0318 08:05:03.796542 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:05:03 crc kubenswrapper[4917]: I0318 08:05:03.836094 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" podStartSLOduration=-9223371986.018705 
podStartE2EDuration="50.83607106s" podCreationTimestamp="2026-03-18 08:04:13 +0000 UTC" firstStartedPulling="2026-03-18 08:04:14.538211049 +0000 UTC m=+4639.479365753" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:05:03.831410357 +0000 UTC m=+4688.772565101" watchObservedRunningTime="2026-03-18 08:05:03.83607106 +0000 UTC m=+4688.777225804" Mar 18 08:05:03 crc kubenswrapper[4917]: I0318 08:05:03.838465 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" podStartSLOduration=3.443442536 podStartE2EDuration="50.838453447s" podCreationTimestamp="2026-03-18 08:04:13 +0000 UTC" firstStartedPulling="2026-03-18 08:04:14.225942894 +0000 UTC m=+4639.167097608" lastFinishedPulling="2026-03-18 08:05:01.620953775 +0000 UTC m=+4686.562108519" observedRunningTime="2026-03-18 08:05:03.814240791 +0000 UTC m=+4688.755395545" watchObservedRunningTime="2026-03-18 08:05:03.838453447 +0000 UTC m=+4688.779608191" Mar 18 08:05:04 crc kubenswrapper[4917]: I0318 08:05:04.811358 4917 generic.go:334] "Generic (PLEG): container finished" podID="b0524b90-39ac-4532-820a-23f804a96420" containerID="4af02212d8c2b388ffe4cf592e637365554f7b69fa301249f111505e740d5451" exitCode=0 Mar 18 08:05:04 crc kubenswrapper[4917]: I0318 08:05:04.811481 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b0524b90-39ac-4532-820a-23f804a96420","Type":"ContainerDied","Data":"4af02212d8c2b388ffe4cf592e637365554f7b69fa301249f111505e740d5451"} Mar 18 08:05:05 crc kubenswrapper[4917]: I0318 08:05:05.823088 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b0524b90-39ac-4532-820a-23f804a96420","Type":"ContainerStarted","Data":"15401f7981a4eb2336236d4a17daeab807835c827f7d4525fca0430c57a94647"} Mar 18 08:05:05 crc kubenswrapper[4917]: I0318 08:05:05.859555 4917 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371984.995247 podStartE2EDuration="51.859528067s" podCreationTimestamp="2026-03-18 08:04:14 +0000 UTC" firstStartedPulling="2026-03-18 08:04:16.649916452 +0000 UTC m=+4641.591071166" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:05:05.854411244 +0000 UTC m=+4690.795566008" watchObservedRunningTime="2026-03-18 08:05:05.859528067 +0000 UTC m=+4690.800682811" Mar 18 08:05:06 crc kubenswrapper[4917]: I0318 08:05:06.065631 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 18 08:05:06 crc kubenswrapper[4917]: I0318 08:05:06.065710 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 18 08:05:07 crc kubenswrapper[4917]: I0318 08:05:07.537882 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 18 08:05:07 crc kubenswrapper[4917]: I0318 08:05:07.538386 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 18 08:05:07 crc kubenswrapper[4917]: I0318 08:05:07.622311 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 18 08:05:07 crc kubenswrapper[4917]: I0318 08:05:07.903870 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 18 08:05:08 crc kubenswrapper[4917]: I0318 08:05:08.782786 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.091823 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.136764 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-647889bd4c-wzxzw"] Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.137008 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" containerName="dnsmasq-dns" containerID="cri-o://f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe" gracePeriod=10 Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.579031 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.686333 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-dns-svc\") pod \"7a0c15c5-7587-4953-b381-b37fa4fbee25\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.686408 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkl44\" (UniqueName: \"kubernetes.io/projected/7a0c15c5-7587-4953-b381-b37fa4fbee25-kube-api-access-kkl44\") pod \"7a0c15c5-7587-4953-b381-b37fa4fbee25\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.686510 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-config\") pod \"7a0c15c5-7587-4953-b381-b37fa4fbee25\" (UID: \"7a0c15c5-7587-4953-b381-b37fa4fbee25\") " Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.691801 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a0c15c5-7587-4953-b381-b37fa4fbee25-kube-api-access-kkl44" (OuterVolumeSpecName: "kube-api-access-kkl44") pod "7a0c15c5-7587-4953-b381-b37fa4fbee25" (UID: 
"7a0c15c5-7587-4953-b381-b37fa4fbee25"). InnerVolumeSpecName "kube-api-access-kkl44". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.722749 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a0c15c5-7587-4953-b381-b37fa4fbee25" (UID: "7a0c15c5-7587-4953-b381-b37fa4fbee25"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.723856 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-config" (OuterVolumeSpecName: "config") pod "7a0c15c5-7587-4953-b381-b37fa4fbee25" (UID: "7a0c15c5-7587-4953-b381-b37fa4fbee25"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.788789 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.788828 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a0c15c5-7587-4953-b381-b37fa4fbee25-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.788843 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkl44\" (UniqueName: \"kubernetes.io/projected/7a0c15c5-7587-4953-b381-b37fa4fbee25-kube-api-access-kkl44\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.855334 4917 generic.go:334] "Generic (PLEG): container finished" podID="7a0c15c5-7587-4953-b381-b37fa4fbee25" 
containerID="f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe" exitCode=0 Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.855393 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" event={"ID":"7a0c15c5-7587-4953-b381-b37fa4fbee25","Type":"ContainerDied","Data":"f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe"} Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.855430 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" event={"ID":"7a0c15c5-7587-4953-b381-b37fa4fbee25","Type":"ContainerDied","Data":"6d5227182f50c70ad14b0a563ac7234ffca5d9bf1c688fbef3a8cd2cf5bf0260"} Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.855457 4917 scope.go:117] "RemoveContainer" containerID="f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.855455 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-647889bd4c-wzxzw" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.875096 4917 scope.go:117] "RemoveContainer" containerID="022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.880478 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-647889bd4c-wzxzw"] Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.894720 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-647889bd4c-wzxzw"] Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.902058 4917 scope.go:117] "RemoveContainer" containerID="f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe" Mar 18 08:05:09 crc kubenswrapper[4917]: E0318 08:05:09.902906 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe\": container with ID starting with f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe not found: ID does not exist" containerID="f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.902962 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe"} err="failed to get container status \"f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe\": rpc error: code = NotFound desc = could not find container \"f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe\": container with ID starting with f30cb1b783ad8a6bcb44317f01bf7ab494265322bd30bbabddfa8fdaa59e4efe not found: ID does not exist" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.902997 4917 scope.go:117] "RemoveContainer" containerID="022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7" Mar 18 
08:05:09 crc kubenswrapper[4917]: E0318 08:05:09.903372 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7\": container with ID starting with 022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7 not found: ID does not exist" containerID="022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7" Mar 18 08:05:09 crc kubenswrapper[4917]: I0318 08:05:09.903418 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7"} err="failed to get container status \"022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7\": rpc error: code = NotFound desc = could not find container \"022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7\": container with ID starting with 022bd9db9af2a4df9236b72b5f2aa5dd5f85b158e84e33593df3d30aec27bef7 not found: ID does not exist" Mar 18 08:05:10 crc kubenswrapper[4917]: I0318 08:05:10.189356 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 18 08:05:10 crc kubenswrapper[4917]: I0318 08:05:10.311566 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 18 08:05:10 crc kubenswrapper[4917]: I0318 08:05:10.772850 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:05:10 crc kubenswrapper[4917]: E0318 08:05:10.773746 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:05:11 crc kubenswrapper[4917]: I0318 08:05:11.788051 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" path="/var/lib/kubelet/pods/7a0c15c5-7587-4953-b381-b37fa4fbee25/volumes" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.695018 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7rh72"] Mar 18 08:05:14 crc kubenswrapper[4917]: E0318 08:05:14.695463 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" containerName="dnsmasq-dns" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.695484 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" containerName="dnsmasq-dns" Mar 18 08:05:14 crc kubenswrapper[4917]: E0318 08:05:14.695497 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" containerName="init" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.695506 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" containerName="init" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.695779 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a0c15c5-7587-4953-b381-b37fa4fbee25" containerName="dnsmasq-dns" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.696820 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.699152 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.711816 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7rh72"] Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.776192 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-operator-scripts\") pod \"root-account-create-update-7rh72\" (UID: \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\") " pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.776303 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5kk9\" (UniqueName: \"kubernetes.io/projected/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-kube-api-access-d5kk9\") pod \"root-account-create-update-7rh72\" (UID: \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\") " pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.878330 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5kk9\" (UniqueName: \"kubernetes.io/projected/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-kube-api-access-d5kk9\") pod \"root-account-create-update-7rh72\" (UID: \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\") " pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.878557 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-operator-scripts\") pod \"root-account-create-update-7rh72\" (UID: 
\"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\") " pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:14 crc kubenswrapper[4917]: I0318 08:05:14.879672 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-operator-scripts\") pod \"root-account-create-update-7rh72\" (UID: \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\") " pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:15 crc kubenswrapper[4917]: I0318 08:05:15.031782 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5kk9\" (UniqueName: \"kubernetes.io/projected/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-kube-api-access-d5kk9\") pod \"root-account-create-update-7rh72\" (UID: \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\") " pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:15 crc kubenswrapper[4917]: I0318 08:05:15.032279 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:15 crc kubenswrapper[4917]: I0318 08:05:15.324654 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7rh72"] Mar 18 08:05:15 crc kubenswrapper[4917]: W0318 08:05:15.332426 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c1a4f4f_3a96_4b03_b7c5_6c766dac3a0c.slice/crio-f09f8f7bc9e90006ce2fdb66f8dbc5c6f43a28774336dc425c3eeff61fe6383e WatchSource:0}: Error finding container f09f8f7bc9e90006ce2fdb66f8dbc5c6f43a28774336dc425c3eeff61fe6383e: Status 404 returned error can't find the container with id f09f8f7bc9e90006ce2fdb66f8dbc5c6f43a28774336dc425c3eeff61fe6383e Mar 18 08:05:15 crc kubenswrapper[4917]: I0318 08:05:15.917324 4917 generic.go:334] "Generic (PLEG): container finished" podID="3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c" containerID="4cc0be2c51a22908fbfc14e2bcd4cb64a04baa8286228c019cb76b36c3428d40" exitCode=0 Mar 18 08:05:15 crc kubenswrapper[4917]: I0318 08:05:15.917405 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7rh72" event={"ID":"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c","Type":"ContainerDied","Data":"4cc0be2c51a22908fbfc14e2bcd4cb64a04baa8286228c019cb76b36c3428d40"} Mar 18 08:05:15 crc kubenswrapper[4917]: I0318 08:05:15.917453 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7rh72" event={"ID":"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c","Type":"ContainerStarted","Data":"f09f8f7bc9e90006ce2fdb66f8dbc5c6f43a28774336dc425c3eeff61fe6383e"} Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.303244 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.423840 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5kk9\" (UniqueName: \"kubernetes.io/projected/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-kube-api-access-d5kk9\") pod \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\" (UID: \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\") " Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.424049 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-operator-scripts\") pod \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\" (UID: \"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c\") " Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.425266 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c" (UID: "3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.432935 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-kube-api-access-d5kk9" (OuterVolumeSpecName: "kube-api-access-d5kk9") pod "3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c" (UID: "3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c"). InnerVolumeSpecName "kube-api-access-d5kk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.526169 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.526202 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5kk9\" (UniqueName: \"kubernetes.io/projected/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c-kube-api-access-d5kk9\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.940854 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7rh72" event={"ID":"3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c","Type":"ContainerDied","Data":"f09f8f7bc9e90006ce2fdb66f8dbc5c6f43a28774336dc425c3eeff61fe6383e"} Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.940911 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09f8f7bc9e90006ce2fdb66f8dbc5c6f43a28774336dc425c3eeff61fe6383e" Mar 18 08:05:17 crc kubenswrapper[4917]: I0318 08:05:17.940936 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7rh72" Mar 18 08:05:18 crc kubenswrapper[4917]: I0318 08:05:18.829646 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-k64gl"] Mar 18 08:05:18 crc kubenswrapper[4917]: E0318 08:05:18.830027 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c" containerName="mariadb-account-create-update" Mar 18 08:05:18 crc kubenswrapper[4917]: I0318 08:05:18.830043 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c" containerName="mariadb-account-create-update" Mar 18 08:05:18 crc kubenswrapper[4917]: I0318 08:05:18.830252 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c" containerName="mariadb-account-create-update" Mar 18 08:05:18 crc kubenswrapper[4917]: I0318 08:05:18.831580 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:18 crc kubenswrapper[4917]: I0318 08:05:18.855489 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k64gl"] Mar 18 08:05:18 crc kubenswrapper[4917]: I0318 08:05:18.949518 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6946\" (UniqueName: \"kubernetes.io/projected/e1c73f4b-319d-4586-a76d-16aaa83eff0e-kube-api-access-c6946\") pod \"redhat-marketplace-k64gl\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:18 crc kubenswrapper[4917]: I0318 08:05:18.949572 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-utilities\") pod \"redhat-marketplace-k64gl\" (UID: 
\"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:18 crc kubenswrapper[4917]: I0318 08:05:18.949633 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-catalog-content\") pod \"redhat-marketplace-k64gl\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.050378 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6946\" (UniqueName: \"kubernetes.io/projected/e1c73f4b-319d-4586-a76d-16aaa83eff0e-kube-api-access-c6946\") pod \"redhat-marketplace-k64gl\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.050430 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-utilities\") pod \"redhat-marketplace-k64gl\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.050459 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-catalog-content\") pod \"redhat-marketplace-k64gl\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.051036 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-catalog-content\") pod \"redhat-marketplace-k64gl\" (UID: 
\"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.051336 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-utilities\") pod \"redhat-marketplace-k64gl\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.071835 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6946\" (UniqueName: \"kubernetes.io/projected/e1c73f4b-319d-4586-a76d-16aaa83eff0e-kube-api-access-c6946\") pod \"redhat-marketplace-k64gl\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.166715 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:19 crc kubenswrapper[4917]: W0318 08:05:19.662469 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1c73f4b_319d_4586_a76d_16aaa83eff0e.slice/crio-f320cee45170227ff3b4b09a1cfaf8a3913b76ad2c71929737f41276780eb8e7 WatchSource:0}: Error finding container f320cee45170227ff3b4b09a1cfaf8a3913b76ad2c71929737f41276780eb8e7: Status 404 returned error can't find the container with id f320cee45170227ff3b4b09a1cfaf8a3913b76ad2c71929737f41276780eb8e7 Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.664105 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-k64gl"] Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.962484 4917 generic.go:334] "Generic (PLEG): container finished" podID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" 
containerID="7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606" exitCode=0 Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.962544 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k64gl" event={"ID":"e1c73f4b-319d-4586-a76d-16aaa83eff0e","Type":"ContainerDied","Data":"7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606"} Mar 18 08:05:19 crc kubenswrapper[4917]: I0318 08:05:19.962608 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k64gl" event={"ID":"e1c73f4b-319d-4586-a76d-16aaa83eff0e","Type":"ContainerStarted","Data":"f320cee45170227ff3b4b09a1cfaf8a3913b76ad2c71929737f41276780eb8e7"} Mar 18 08:05:20 crc kubenswrapper[4917]: I0318 08:05:20.974580 4917 generic.go:334] "Generic (PLEG): container finished" podID="d758aedb-3012-4f5c-badd-725b4a4b8a42" containerID="dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c" exitCode=0 Mar 18 08:05:20 crc kubenswrapper[4917]: I0318 08:05:20.974679 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d758aedb-3012-4f5c-badd-725b4a4b8a42","Type":"ContainerDied","Data":"dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c"} Mar 18 08:05:20 crc kubenswrapper[4917]: I0318 08:05:20.977381 4917 generic.go:334] "Generic (PLEG): container finished" podID="13dca274-b8d1-439f-b3cc-a073f12bdc37" containerID="d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a" exitCode=0 Mar 18 08:05:20 crc kubenswrapper[4917]: I0318 08:05:20.977477 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"13dca274-b8d1-439f-b3cc-a073f12bdc37","Type":"ContainerDied","Data":"d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a"} Mar 18 08:05:20 crc kubenswrapper[4917]: I0318 08:05:20.993679 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-k64gl" event={"ID":"e1c73f4b-319d-4586-a76d-16aaa83eff0e","Type":"ContainerStarted","Data":"65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89"} Mar 18 08:05:21 crc kubenswrapper[4917]: I0318 08:05:21.181651 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7rh72"] Mar 18 08:05:21 crc kubenswrapper[4917]: I0318 08:05:21.188753 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7rh72"] Mar 18 08:05:21 crc kubenswrapper[4917]: I0318 08:05:21.772529 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:05:21 crc kubenswrapper[4917]: E0318 08:05:21.773114 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:05:21 crc kubenswrapper[4917]: I0318 08:05:21.782066 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c" path="/var/lib/kubelet/pods/3c1a4f4f-3a96-4b03-b7c5-6c766dac3a0c/volumes" Mar 18 08:05:22 crc kubenswrapper[4917]: I0318 08:05:22.002745 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"13dca274-b8d1-439f-b3cc-a073f12bdc37","Type":"ContainerStarted","Data":"a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab"} Mar 18 08:05:22 crc kubenswrapper[4917]: I0318 08:05:22.003843 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 08:05:22 crc kubenswrapper[4917]: I0318 08:05:22.006569 
4917 generic.go:334] "Generic (PLEG): container finished" podID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerID="65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89" exitCode=0 Mar 18 08:05:22 crc kubenswrapper[4917]: I0318 08:05:22.006621 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k64gl" event={"ID":"e1c73f4b-319d-4586-a76d-16aaa83eff0e","Type":"ContainerDied","Data":"65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89"} Mar 18 08:05:22 crc kubenswrapper[4917]: I0318 08:05:22.009440 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d758aedb-3012-4f5c-badd-725b4a4b8a42","Type":"ContainerStarted","Data":"a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7"} Mar 18 08:05:22 crc kubenswrapper[4917]: I0318 08:05:22.009646 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:22 crc kubenswrapper[4917]: I0318 08:05:22.044913 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.160654183 podStartE2EDuration="1m9.044893641s" podCreationTimestamp="2026-03-18 08:04:13 +0000 UTC" firstStartedPulling="2026-03-18 08:04:15.501429884 +0000 UTC m=+4640.442584598" lastFinishedPulling="2026-03-18 08:04:45.385669342 +0000 UTC m=+4670.326824056" observedRunningTime="2026-03-18 08:05:22.036446127 +0000 UTC m=+4706.977600851" watchObservedRunningTime="2026-03-18 08:05:22.044893641 +0000 UTC m=+4706.986048375" Mar 18 08:05:22 crc kubenswrapper[4917]: I0318 08:05:22.100913 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.770336333 podStartE2EDuration="1m9.100888526s" podCreationTimestamp="2026-03-18 08:04:13 +0000 UTC" firstStartedPulling="2026-03-18 08:04:16.03446008 +0000 UTC m=+4640.975614794" 
lastFinishedPulling="2026-03-18 08:04:45.365012273 +0000 UTC m=+4670.306166987" observedRunningTime="2026-03-18 08:05:22.090970616 +0000 UTC m=+4707.032125370" watchObservedRunningTime="2026-03-18 08:05:22.100888526 +0000 UTC m=+4707.042043280" Mar 18 08:05:23 crc kubenswrapper[4917]: I0318 08:05:23.020287 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k64gl" event={"ID":"e1c73f4b-319d-4586-a76d-16aaa83eff0e","Type":"ContainerStarted","Data":"bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40"} Mar 18 08:05:23 crc kubenswrapper[4917]: I0318 08:05:23.046134 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-k64gl" podStartSLOduration=2.564385539 podStartE2EDuration="5.046110455s" podCreationTimestamp="2026-03-18 08:05:18 +0000 UTC" firstStartedPulling="2026-03-18 08:05:19.965970981 +0000 UTC m=+4704.907125735" lastFinishedPulling="2026-03-18 08:05:22.447695917 +0000 UTC m=+4707.388850651" observedRunningTime="2026-03-18 08:05:23.040044599 +0000 UTC m=+4707.981199313" watchObservedRunningTime="2026-03-18 08:05:23.046110455 +0000 UTC m=+4707.987265169" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.187703 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jnk6v"] Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.189614 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.193076 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.194501 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jnk6v"] Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.378782 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfghk\" (UniqueName: \"kubernetes.io/projected/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-kube-api-access-rfghk\") pod \"root-account-create-update-jnk6v\" (UID: \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\") " pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.378905 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-operator-scripts\") pod \"root-account-create-update-jnk6v\" (UID: \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\") " pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.481385 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfghk\" (UniqueName: \"kubernetes.io/projected/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-kube-api-access-rfghk\") pod \"root-account-create-update-jnk6v\" (UID: \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\") " pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.482030 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-operator-scripts\") pod \"root-account-create-update-jnk6v\" (UID: 
\"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\") " pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.482940 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-operator-scripts\") pod \"root-account-create-update-jnk6v\" (UID: \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\") " pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.511680 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfghk\" (UniqueName: \"kubernetes.io/projected/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-kube-api-access-rfghk\") pod \"root-account-create-update-jnk6v\" (UID: \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\") " pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:26 crc kubenswrapper[4917]: I0318 08:05:26.517305 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:27 crc kubenswrapper[4917]: I0318 08:05:27.078433 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jnk6v"] Mar 18 08:05:27 crc kubenswrapper[4917]: W0318 08:05:27.236609 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd2030a7_9b6f_4481_8f9f_ac6bb054e97b.slice/crio-2f4d2b6d97efc50f40c5befa4640eb4a1c5fc762cc23c0ba3938277a6f15a143 WatchSource:0}: Error finding container 2f4d2b6d97efc50f40c5befa4640eb4a1c5fc762cc23c0ba3938277a6f15a143: Status 404 returned error can't find the container with id 2f4d2b6d97efc50f40c5befa4640eb4a1c5fc762cc23c0ba3938277a6f15a143 Mar 18 08:05:28 crc kubenswrapper[4917]: I0318 08:05:28.059775 4917 generic.go:334] "Generic (PLEG): container finished" podID="bd2030a7-9b6f-4481-8f9f-ac6bb054e97b" containerID="4784dd8b1b5bae5ddbbb1c54f5f48045e21a8b542035602e8b0fa07f0ff00001" exitCode=0 Mar 18 08:05:28 crc kubenswrapper[4917]: I0318 08:05:28.059895 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jnk6v" event={"ID":"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b","Type":"ContainerDied","Data":"4784dd8b1b5bae5ddbbb1c54f5f48045e21a8b542035602e8b0fa07f0ff00001"} Mar 18 08:05:28 crc kubenswrapper[4917]: I0318 08:05:28.060200 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jnk6v" event={"ID":"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b","Type":"ContainerStarted","Data":"2f4d2b6d97efc50f40c5befa4640eb4a1c5fc762cc23c0ba3938277a6f15a143"} Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.167969 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.168027 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.221307 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.389225 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.539915 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-operator-scripts\") pod \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\" (UID: \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\") " Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.540703 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfghk\" (UniqueName: \"kubernetes.io/projected/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-kube-api-access-rfghk\") pod \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\" (UID: \"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b\") " Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.540646 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bd2030a7-9b6f-4481-8f9f-ac6bb054e97b" (UID: "bd2030a7-9b6f-4481-8f9f-ac6bb054e97b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.541752 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.549445 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-kube-api-access-rfghk" (OuterVolumeSpecName: "kube-api-access-rfghk") pod "bd2030a7-9b6f-4481-8f9f-ac6bb054e97b" (UID: "bd2030a7-9b6f-4481-8f9f-ac6bb054e97b"). InnerVolumeSpecName "kube-api-access-rfghk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:29 crc kubenswrapper[4917]: I0318 08:05:29.642538 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfghk\" (UniqueName: \"kubernetes.io/projected/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b-kube-api-access-rfghk\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:30 crc kubenswrapper[4917]: I0318 08:05:30.079996 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jnk6v" Mar 18 08:05:30 crc kubenswrapper[4917]: I0318 08:05:30.079987 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jnk6v" event={"ID":"bd2030a7-9b6f-4481-8f9f-ac6bb054e97b","Type":"ContainerDied","Data":"2f4d2b6d97efc50f40c5befa4640eb4a1c5fc762cc23c0ba3938277a6f15a143"} Mar 18 08:05:30 crc kubenswrapper[4917]: I0318 08:05:30.080184 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4d2b6d97efc50f40c5befa4640eb4a1c5fc762cc23c0ba3938277a6f15a143" Mar 18 08:05:30 crc kubenswrapper[4917]: I0318 08:05:30.157763 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:30 crc kubenswrapper[4917]: I0318 08:05:30.224685 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k64gl"] Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.103960 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-k64gl" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerName="registry-server" containerID="cri-o://bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40" gracePeriod=2 Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.628662 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.706544 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-utilities\") pod \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.706756 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6946\" (UniqueName: \"kubernetes.io/projected/e1c73f4b-319d-4586-a76d-16aaa83eff0e-kube-api-access-c6946\") pod \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.706912 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-catalog-content\") pod \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\" (UID: \"e1c73f4b-319d-4586-a76d-16aaa83eff0e\") " Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.708268 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-utilities" (OuterVolumeSpecName: "utilities") pod "e1c73f4b-319d-4586-a76d-16aaa83eff0e" (UID: "e1c73f4b-319d-4586-a76d-16aaa83eff0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.712187 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1c73f4b-319d-4586-a76d-16aaa83eff0e-kube-api-access-c6946" (OuterVolumeSpecName: "kube-api-access-c6946") pod "e1c73f4b-319d-4586-a76d-16aaa83eff0e" (UID: "e1c73f4b-319d-4586-a76d-16aaa83eff0e"). InnerVolumeSpecName "kube-api-access-c6946". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.733143 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1c73f4b-319d-4586-a76d-16aaa83eff0e" (UID: "e1c73f4b-319d-4586-a76d-16aaa83eff0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.773498 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:05:32 crc kubenswrapper[4917]: E0318 08:05:32.773932 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.809076 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6946\" (UniqueName: \"kubernetes.io/projected/e1c73f4b-319d-4586-a76d-16aaa83eff0e-kube-api-access-c6946\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.809149 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:32 crc kubenswrapper[4917]: I0318 08:05:32.809197 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1c73f4b-319d-4586-a76d-16aaa83eff0e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:33 
crc kubenswrapper[4917]: I0318 08:05:33.118695 4917 generic.go:334] "Generic (PLEG): container finished" podID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerID="bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40" exitCode=0 Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.118763 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k64gl" event={"ID":"e1c73f4b-319d-4586-a76d-16aaa83eff0e","Type":"ContainerDied","Data":"bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40"} Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.118809 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-k64gl" event={"ID":"e1c73f4b-319d-4586-a76d-16aaa83eff0e","Type":"ContainerDied","Data":"f320cee45170227ff3b4b09a1cfaf8a3913b76ad2c71929737f41276780eb8e7"} Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.118827 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-k64gl" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.118839 4917 scope.go:117] "RemoveContainer" containerID="bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.149345 4917 scope.go:117] "RemoveContainer" containerID="65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.185112 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-k64gl"] Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.194232 4917 scope.go:117] "RemoveContainer" containerID="7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.196998 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-k64gl"] Mar 18 08:05:33 crc kubenswrapper[4917]: 
I0318 08:05:33.219112 4917 scope.go:117] "RemoveContainer" containerID="bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40" Mar 18 08:05:33 crc kubenswrapper[4917]: E0318 08:05:33.219564 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40\": container with ID starting with bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40 not found: ID does not exist" containerID="bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.219622 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40"} err="failed to get container status \"bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40\": rpc error: code = NotFound desc = could not find container \"bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40\": container with ID starting with bcc72babce03e763945e47e58a253ade972ebddea442e2f85a8e8380fecfdd40 not found: ID does not exist" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.219647 4917 scope.go:117] "RemoveContainer" containerID="65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89" Mar 18 08:05:33 crc kubenswrapper[4917]: E0318 08:05:33.220178 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89\": container with ID starting with 65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89 not found: ID does not exist" containerID="65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.220243 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89"} err="failed to get container status \"65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89\": rpc error: code = NotFound desc = could not find container \"65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89\": container with ID starting with 65e4ff733b213615e1dd797147c939b29f8190235c3a5f93ebcac6fbaf9abf89 not found: ID does not exist" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.220288 4917 scope.go:117] "RemoveContainer" containerID="7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606" Mar 18 08:05:33 crc kubenswrapper[4917]: E0318 08:05:33.220714 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606\": container with ID starting with 7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606 not found: ID does not exist" containerID="7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.220739 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606"} err="failed to get container status \"7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606\": rpc error: code = NotFound desc = could not find container \"7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606\": container with ID starting with 7c7928e61cda56abfabd60b15bae46448d83cf3046027c7437080310a6553606 not found: ID does not exist" Mar 18 08:05:33 crc kubenswrapper[4917]: I0318 08:05:33.785506 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" path="/var/lib/kubelet/pods/e1c73f4b-319d-4586-a76d-16aaa83eff0e/volumes" Mar 18 08:05:34 crc kubenswrapper[4917]: I0318 
08:05:34.934862 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 08:05:35 crc kubenswrapper[4917]: I0318 08:05:35.519761 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.652317 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74754f857-2tf2v"] Mar 18 08:05:40 crc kubenswrapper[4917]: E0318 08:05:40.653270 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd2030a7-9b6f-4481-8f9f-ac6bb054e97b" containerName="mariadb-account-create-update" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.653286 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd2030a7-9b6f-4481-8f9f-ac6bb054e97b" containerName="mariadb-account-create-update" Mar 18 08:05:40 crc kubenswrapper[4917]: E0318 08:05:40.653313 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerName="extract-content" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.653323 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerName="extract-content" Mar 18 08:05:40 crc kubenswrapper[4917]: E0318 08:05:40.653335 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerName="registry-server" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.653343 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerName="registry-server" Mar 18 08:05:40 crc kubenswrapper[4917]: E0318 08:05:40.653363 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerName="extract-utilities" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.653374 4917 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerName="extract-utilities" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.653561 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1c73f4b-319d-4586-a76d-16aaa83eff0e" containerName="registry-server" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.653624 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd2030a7-9b6f-4481-8f9f-ac6bb054e97b" containerName="mariadb-account-create-update" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.654716 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.666955 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74754f857-2tf2v"] Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.839886 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-dns-svc\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.840034 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-config\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.840068 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbsvs\" (UniqueName: \"kubernetes.io/projected/58a2777d-8ff4-4eb1-9d32-5410377794c7-kube-api-access-sbsvs\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: 
\"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.941826 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-config\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.941908 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbsvs\" (UniqueName: \"kubernetes.io/projected/58a2777d-8ff4-4eb1-9d32-5410377794c7-kube-api-access-sbsvs\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.942563 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-dns-svc\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.943151 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-config\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.944014 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-dns-svc\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc 
kubenswrapper[4917]: I0318 08:05:40.962082 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbsvs\" (UniqueName: \"kubernetes.io/projected/58a2777d-8ff4-4eb1-9d32-5410377794c7-kube-api-access-sbsvs\") pod \"dnsmasq-dns-74754f857-2tf2v\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:40 crc kubenswrapper[4917]: I0318 08:05:40.984569 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:41 crc kubenswrapper[4917]: I0318 08:05:41.446715 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74754f857-2tf2v"] Mar 18 08:05:42 crc kubenswrapper[4917]: I0318 08:05:42.045314 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:05:42 crc kubenswrapper[4917]: I0318 08:05:42.176268 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:05:42 crc kubenswrapper[4917]: I0318 08:05:42.195821 4917 generic.go:334] "Generic (PLEG): container finished" podID="58a2777d-8ff4-4eb1-9d32-5410377794c7" containerID="e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1" exitCode=0 Mar 18 08:05:42 crc kubenswrapper[4917]: I0318 08:05:42.195862 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74754f857-2tf2v" event={"ID":"58a2777d-8ff4-4eb1-9d32-5410377794c7","Type":"ContainerDied","Data":"e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1"} Mar 18 08:05:42 crc kubenswrapper[4917]: I0318 08:05:42.195886 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74754f857-2tf2v" event={"ID":"58a2777d-8ff4-4eb1-9d32-5410377794c7","Type":"ContainerStarted","Data":"d812f733e4cd6337d765b24ef725376fc2ea8334c4a75ad320e3dd0eeba0b1b3"} Mar 18 08:05:43 crc kubenswrapper[4917]: I0318 08:05:43.205332 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74754f857-2tf2v" event={"ID":"58a2777d-8ff4-4eb1-9d32-5410377794c7","Type":"ContainerStarted","Data":"b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2"} Mar 18 08:05:43 crc kubenswrapper[4917]: I0318 08:05:43.205771 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:43 crc kubenswrapper[4917]: I0318 08:05:43.225007 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74754f857-2tf2v" podStartSLOduration=3.224991372 podStartE2EDuration="3.224991372s" podCreationTimestamp="2026-03-18 08:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:05:43.222065152 +0000 UTC m=+4728.163219866" watchObservedRunningTime="2026-03-18 08:05:43.224991372 +0000 UTC m=+4728.166146086" Mar 18 08:05:44 crc kubenswrapper[4917]: I0318 08:05:44.772665 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:05:44 crc kubenswrapper[4917]: E0318 08:05:44.773359 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:05:45 crc kubenswrapper[4917]: I0318 08:05:45.994253 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="13dca274-b8d1-439f-b3cc-a073f12bdc37" containerName="rabbitmq" containerID="cri-o://a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab" gracePeriod=604797 Mar 
18 08:05:46 crc kubenswrapper[4917]: I0318 08:05:46.000919 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="d758aedb-3012-4f5c-badd-725b4a4b8a42" containerName="rabbitmq" containerID="cri-o://a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7" gracePeriod=604797 Mar 18 08:05:50 crc kubenswrapper[4917]: I0318 08:05:50.986816 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.050233 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dccc9c6df-872gm"] Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.050575 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" podUID="632534e3-b259-4851-8d0c-13b538a945f8" containerName="dnsmasq-dns" containerID="cri-o://77958d7c96a2eccf500d43f2b02194fcca45347280aee7120a60bf439d67bf42" gracePeriod=10 Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.279969 4917 generic.go:334] "Generic (PLEG): container finished" podID="632534e3-b259-4851-8d0c-13b538a945f8" containerID="77958d7c96a2eccf500d43f2b02194fcca45347280aee7120a60bf439d67bf42" exitCode=0 Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.280012 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" event={"ID":"632534e3-b259-4851-8d0c-13b538a945f8","Type":"ContainerDied","Data":"77958d7c96a2eccf500d43f2b02194fcca45347280aee7120a60bf439d67bf42"} Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.498647 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.687027 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlwlp\" (UniqueName: \"kubernetes.io/projected/632534e3-b259-4851-8d0c-13b538a945f8-kube-api-access-zlwlp\") pod \"632534e3-b259-4851-8d0c-13b538a945f8\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.687255 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-dns-svc\") pod \"632534e3-b259-4851-8d0c-13b538a945f8\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.687390 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-config\") pod \"632534e3-b259-4851-8d0c-13b538a945f8\" (UID: \"632534e3-b259-4851-8d0c-13b538a945f8\") " Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.693227 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/632534e3-b259-4851-8d0c-13b538a945f8-kube-api-access-zlwlp" (OuterVolumeSpecName: "kube-api-access-zlwlp") pod "632534e3-b259-4851-8d0c-13b538a945f8" (UID: "632534e3-b259-4851-8d0c-13b538a945f8"). InnerVolumeSpecName "kube-api-access-zlwlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.721365 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "632534e3-b259-4851-8d0c-13b538a945f8" (UID: "632534e3-b259-4851-8d0c-13b538a945f8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.789724 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlwlp\" (UniqueName: \"kubernetes.io/projected/632534e3-b259-4851-8d0c-13b538a945f8-kube-api-access-zlwlp\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:51 crc kubenswrapper[4917]: I0318 08:05:51.789763 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.294246 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" event={"ID":"632534e3-b259-4851-8d0c-13b538a945f8","Type":"ContainerDied","Data":"7aaf5d091e6399137ac598c92057c0dca8502066e91cfe342d8845e609ec65c5"} Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.294308 4917 scope.go:117] "RemoveContainer" containerID="77958d7c96a2eccf500d43f2b02194fcca45347280aee7120a60bf439d67bf42" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.294359 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dccc9c6df-872gm" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.432269 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-config" (OuterVolumeSpecName: "config") pod "632534e3-b259-4851-8d0c-13b538a945f8" (UID: "632534e3-b259-4851-8d0c-13b538a945f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.503748 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/632534e3-b259-4851-8d0c-13b538a945f8-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.550787 4917 scope.go:117] "RemoveContainer" containerID="bd7ff41208e62b87739117b39a29a47ef59ec14c5e923a113a868a5ca5782a1e" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.632100 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dccc9c6df-872gm"] Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.651068 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dccc9c6df-872gm"] Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.788846 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.801941 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.815057 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-server-conf\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.815472 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d758aedb-3012-4f5c-badd-725b4a4b8a42-erlang-cookie-secret\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.815628 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-confd\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.815666 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d758aedb-3012-4f5c-badd-725b4a4b8a42-pod-info\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.815697 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-erlang-cookie\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.815732 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-confd\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.821819 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d758aedb-3012-4f5c-badd-725b4a4b8a42-pod-info" (OuterVolumeSpecName: "pod-info") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.821932 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.823510 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d758aedb-3012-4f5c-badd-725b4a4b8a42-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919561 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-plugins-conf\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919641 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13dca274-b8d1-439f-b3cc-a073f12bdc37-pod-info\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919668 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-server-conf\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919697 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-tls\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919723 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-erlang-cookie\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919753 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-config-data\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919767 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-plugins-conf\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919791 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13dca274-b8d1-439f-b3cc-a073f12bdc37-erlang-cookie-secret\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919813 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-plugins\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919851 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-plugins\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919874 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-tls\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 
08:05:52.919898 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7j9q\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-kube-api-access-k7j9q\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.919920 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-kube-api-access-8x885\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.920097 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod \"13dca274-b8d1-439f-b3cc-a073f12bdc37\" (UID: \"13dca274-b8d1-439f-b3cc-a073f12bdc37\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.920116 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-config-data\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.920174 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"d758aedb-3012-4f5c-badd-725b4a4b8a42\" (UID: \"d758aedb-3012-4f5c-badd-725b4a4b8a42\") " Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.920460 4917 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/d758aedb-3012-4f5c-badd-725b4a4b8a42-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.920490 4917 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d758aedb-3012-4f5c-badd-725b4a4b8a42-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.920500 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.921170 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.921292 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.921402 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.921709 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.921918 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.932258 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/13dca274-b8d1-439f-b3cc-a073f12bdc37-pod-info" (OuterVolumeSpecName: "pod-info") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.932258 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13dca274-b8d1-439f-b3cc-a073f12bdc37-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.932276 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-kube-api-access-8x885" (OuterVolumeSpecName: "kube-api-access-8x885") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "kube-api-access-8x885". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.932433 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-kube-api-access-k7j9q" (OuterVolumeSpecName: "kube-api-access-k7j9q") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "kube-api-access-k7j9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.932829 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.934457 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9" (OuterVolumeSpecName: "persistence") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "pvc-f7f690ba-6c8a-4406-b038-3359896866d9". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.935650 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32" (OuterVolumeSpecName: "persistence") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "pvc-29cdb22a-389d-4f28-b65a-194b060dea32". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.940429 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.943812 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-config-data" (OuterVolumeSpecName: "config-data") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.944338 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-config-data" (OuterVolumeSpecName: "config-data") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.954249 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-server-conf" (OuterVolumeSpecName: "server-conf") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.964369 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-server-conf" (OuterVolumeSpecName: "server-conf") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:05:52 crc kubenswrapper[4917]: I0318 08:05:52.994418 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "13dca274-b8d1-439f-b3cc-a073f12bdc37" (UID: "13dca274-b8d1-439f-b3cc-a073f12bdc37"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.008768 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d758aedb-3012-4f5c-badd-725b4a4b8a42" (UID: "d758aedb-3012-4f5c-badd-725b4a4b8a42"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.021814 4917 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022029 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022105 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022204 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") on node \"crc\" " Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022270 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022341 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") on node \"crc\" " Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022438 4917 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d758aedb-3012-4f5c-badd-725b4a4b8a42-plugins-conf\") on node 
\"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022511 4917 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/13dca274-b8d1-439f-b3cc-a073f12bdc37-pod-info\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022566 4917 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-server-conf\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022640 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022707 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022768 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022843 4917 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/13dca274-b8d1-439f-b3cc-a073f12bdc37-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022927 4917 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/13dca274-b8d1-439f-b3cc-a073f12bdc37-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.022990 4917 reconciler_common.go:293] "Volume 
detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.023045 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d758aedb-3012-4f5c-badd-725b4a4b8a42-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.023099 4917 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.023160 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7j9q\" (UniqueName: \"kubernetes.io/projected/13dca274-b8d1-439f-b3cc-a073f12bdc37-kube-api-access-k7j9q\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.023231 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x885\" (UniqueName: \"kubernetes.io/projected/d758aedb-3012-4f5c-badd-725b4a4b8a42-kube-api-access-8x885\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.036113 4917 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.036244 4917 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-f7f690ba-6c8a-4406-b038-3359896866d9" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9") on node "crc" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.047322 4917 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.047471 4917 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-29cdb22a-389d-4f28-b65a-194b060dea32" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32") on node "crc" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.124837 4917 reconciler_common.go:293] "Volume detached for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.125089 4917 reconciler_common.go:293] "Volume detached for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") on node \"crc\" DevicePath \"\"" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.304926 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.305375 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"13dca274-b8d1-439f-b3cc-a073f12bdc37","Type":"ContainerDied","Data":"a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab"} Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.305461 4917 scope.go:117] "RemoveContainer" containerID="a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.305327 4917 generic.go:334] "Generic (PLEG): container finished" podID="13dca274-b8d1-439f-b3cc-a073f12bdc37" containerID="a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab" exitCode=0 Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.305822 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"13dca274-b8d1-439f-b3cc-a073f12bdc37","Type":"ContainerDied","Data":"1a2fc1d8c51eb649f0b6c71b0fac9504ab7f2ed2db33f9a31a100afc66861d52"} Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.308300 4917 generic.go:334] "Generic (PLEG): container finished" podID="d758aedb-3012-4f5c-badd-725b4a4b8a42" containerID="a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7" exitCode=0 Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.308335 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d758aedb-3012-4f5c-badd-725b4a4b8a42","Type":"ContainerDied","Data":"a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7"} Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.308357 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"d758aedb-3012-4f5c-badd-725b4a4b8a42","Type":"ContainerDied","Data":"619daaf98b4bc47a0d75901e694bc34e5c6f99a15ddacc7ad24c57efd7d77e5b"} Mar 18 08:05:53 crc 
kubenswrapper[4917]: I0318 08:05:53.308510 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.321805 4917 scope.go:117] "RemoveContainer" containerID="d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.341770 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.357277 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.363568 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.364851 4917 scope.go:117] "RemoveContainer" containerID="a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.366858 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab\": container with ID starting with a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab not found: ID does not exist" containerID="a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.366903 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab"} err="failed to get container status \"a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab\": rpc error: code = NotFound desc = could not find container \"a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab\": container with ID starting with 
a2c8de320ba8eb8cfc106a1cd860859571a16359aafeba38cd92aef93329b1ab not found: ID does not exist" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.366935 4917 scope.go:117] "RemoveContainer" containerID="d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.368230 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a\": container with ID starting with d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a not found: ID does not exist" containerID="d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.368391 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a"} err="failed to get container status \"d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a\": rpc error: code = NotFound desc = could not find container \"d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a\": container with ID starting with d2f6e8839b0bc77e9d85f96586cdb4624d6a99d9476c044e6a439d18eb42d52a not found: ID does not exist" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.368539 4917 scope.go:117] "RemoveContainer" containerID="a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.372224 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.394658 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.395357 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632534e3-b259-4851-8d0c-13b538a945f8" 
containerName="dnsmasq-dns" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.395372 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="632534e3-b259-4851-8d0c-13b538a945f8" containerName="dnsmasq-dns" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.395386 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dca274-b8d1-439f-b3cc-a073f12bdc37" containerName="setup-container" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.395394 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dca274-b8d1-439f-b3cc-a073f12bdc37" containerName="setup-container" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.395411 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="632534e3-b259-4851-8d0c-13b538a945f8" containerName="init" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.395419 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="632534e3-b259-4851-8d0c-13b538a945f8" containerName="init" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.395443 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13dca274-b8d1-439f-b3cc-a073f12bdc37" containerName="rabbitmq" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.395451 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dca274-b8d1-439f-b3cc-a073f12bdc37" containerName="rabbitmq" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.395467 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d758aedb-3012-4f5c-badd-725b4a4b8a42" containerName="rabbitmq" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.395476 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d758aedb-3012-4f5c-badd-725b4a4b8a42" containerName="rabbitmq" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.395490 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d758aedb-3012-4f5c-badd-725b4a4b8a42" containerName="setup-container" Mar 18 08:05:53 crc kubenswrapper[4917]: 
I0318 08:05:53.395497 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d758aedb-3012-4f5c-badd-725b4a4b8a42" containerName="setup-container" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.395753 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d758aedb-3012-4f5c-badd-725b4a4b8a42" containerName="rabbitmq" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.395780 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="632534e3-b259-4851-8d0c-13b538a945f8" containerName="dnsmasq-dns" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.395839 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="13dca274-b8d1-439f-b3cc-a073f12bdc37" containerName="rabbitmq" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.396897 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.402105 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.402445 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.402635 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.402806 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.402934 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-fxjm7" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.403687 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.406070 4917 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.419141 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.420294 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.423699 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.423830 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.423895 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.424018 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.424036 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-kmn7r" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.424171 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.424217 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.432140 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.438919 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:05:53 crc 
kubenswrapper[4917]: I0318 08:05:53.446278 4917 scope.go:117] "RemoveContainer" containerID="dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.479026 4917 scope.go:117] "RemoveContainer" containerID="a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.479391 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7\": container with ID starting with a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7 not found: ID does not exist" containerID="a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.479427 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7"} err="failed to get container status \"a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7\": rpc error: code = NotFound desc = could not find container \"a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7\": container with ID starting with a439f31fda8790f75de9cd10b111f0dbd274e21cfec0ea97e9d0e861aac74dd7 not found: ID does not exist" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.479451 4917 scope.go:117] "RemoveContainer" containerID="dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c" Mar 18 08:05:53 crc kubenswrapper[4917]: E0318 08:05:53.479791 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c\": container with ID starting with dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c not found: ID does not exist" 
containerID="dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.479812 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c"} err="failed to get container status \"dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c\": rpc error: code = NotFound desc = could not find container \"dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c\": container with ID starting with dfffc77504db3dc30cc2fe0dca6158abff42f429b80a64009871da355a99c96c not found: ID does not exist" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.529796 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.529862 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.529890 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.529912 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/669c480a-39bf-4e91-985b-528c87fa0129-pod-info\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.529947 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.529976 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhk9r\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-kube-api-access-lhk9r\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530008 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530033 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530054 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530077 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-server-conf\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530110 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530136 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530177 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530206 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530227 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530253 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530279 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530302 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530333 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/669c480a-39bf-4e91-985b-528c87fa0129-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530355 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530397 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxbq8\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-kube-api-access-kxbq8\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.530420 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-config-data\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631639 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631680 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631699 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631720 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/669c480a-39bf-4e91-985b-528c87fa0129-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631735 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631773 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxbq8\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-kube-api-access-kxbq8\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631797 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-config-data\") pod \"rabbitmq-server-0\" 
(UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631820 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631844 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631861 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631878 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/669c480a-39bf-4e91-985b-528c87fa0129-pod-info\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631906 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc 
kubenswrapper[4917]: I0318 08:05:53.631926 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhk9r\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-kube-api-access-lhk9r\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631947 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631967 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631983 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.631999 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-server-conf\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.632031 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.632051 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.632066 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.632082 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.632095 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.632421 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.632702 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.633094 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.634292 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.634521 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.634610 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-config-data\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " 
pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.634783 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.635476 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.635960 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.636019 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/669c480a-39bf-4e91-985b-528c87fa0129-server-conf\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.637798 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.637828 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/28cec60ac3c0a43ee12064d068e25a8071e127eaeb21d18b0e80bb3c74ab3653/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.638075 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.638137 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/51e6c584895ab848edd398d70774c4e05a3109af4de762115a2f273ac296e4ec/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.781967 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13dca274-b8d1-439f-b3cc-a073f12bdc37" path="/var/lib/kubelet/pods/13dca274-b8d1-439f-b3cc-a073f12bdc37/volumes" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.782505 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="632534e3-b259-4851-8d0c-13b538a945f8" path="/var/lib/kubelet/pods/632534e3-b259-4851-8d0c-13b538a945f8/volumes" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.783801 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d758aedb-3012-4f5c-badd-725b4a4b8a42" path="/var/lib/kubelet/pods/d758aedb-3012-4f5c-badd-725b4a4b8a42/volumes" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.831830 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.831884 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/669c480a-39bf-4e91-985b-528c87fa0129-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.831927 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/669c480a-39bf-4e91-985b-528c87fa0129-pod-info\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.832420 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.834741 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 
08:05:53.836636 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.838468 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.838598 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.840166 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhk9r\" (UniqueName: \"kubernetes.io/projected/669c480a-39bf-4e91-985b-528c87fa0129-kube-api-access-lhk9r\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.840544 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxbq8\" (UniqueName: \"kubernetes.io/projected/0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72-kube-api-access-kxbq8\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.860061 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-29cdb22a-389d-4f28-b65a-194b060dea32\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-29cdb22a-389d-4f28-b65a-194b060dea32\") pod \"rabbitmq-server-0\" (UID: \"669c480a-39bf-4e91-985b-528c87fa0129\") " pod="openstack/rabbitmq-server-0" Mar 18 08:05:53 crc kubenswrapper[4917]: I0318 08:05:53.860101 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f7f690ba-6c8a-4406-b038-3359896866d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f7f690ba-6c8a-4406-b038-3359896866d9\") pod \"rabbitmq-cell1-server-0\" (UID: \"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72\") " pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:54 crc kubenswrapper[4917]: I0318 08:05:54.110855 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 18 08:05:54 crc kubenswrapper[4917]: I0318 08:05:54.122840 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:05:54 crc kubenswrapper[4917]: I0318 08:05:54.455266 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 18 08:05:54 crc kubenswrapper[4917]: W0318 08:05:54.462117 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f5f6d6e_ea66_4dc9_a78a_d59c5dbdbb72.slice/crio-13b65cbd1e21cef25fe7a4b44ad223720b2e12653ded8eeaa5cbc6e5767f9be0 WatchSource:0}: Error finding container 13b65cbd1e21cef25fe7a4b44ad223720b2e12653ded8eeaa5cbc6e5767f9be0: Status 404 returned error can't find the container with id 13b65cbd1e21cef25fe7a4b44ad223720b2e12653ded8eeaa5cbc6e5767f9be0 Mar 18 08:05:54 crc kubenswrapper[4917]: I0318 08:05:54.598123 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 18 08:05:54 crc kubenswrapper[4917]: W0318 08:05:54.602655 4917 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod669c480a_39bf_4e91_985b_528c87fa0129.slice/crio-6b7bcaff719abe416c173191b9218808b4609bda9271eaf0075215f34c79b65e WatchSource:0}: Error finding container 6b7bcaff719abe416c173191b9218808b4609bda9271eaf0075215f34c79b65e: Status 404 returned error can't find the container with id 6b7bcaff719abe416c173191b9218808b4609bda9271eaf0075215f34c79b65e Mar 18 08:05:55 crc kubenswrapper[4917]: I0318 08:05:55.331495 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"669c480a-39bf-4e91-985b-528c87fa0129","Type":"ContainerStarted","Data":"6b7bcaff719abe416c173191b9218808b4609bda9271eaf0075215f34c79b65e"} Mar 18 08:05:55 crc kubenswrapper[4917]: I0318 08:05:55.333980 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72","Type":"ContainerStarted","Data":"13b65cbd1e21cef25fe7a4b44ad223720b2e12653ded8eeaa5cbc6e5767f9be0"} Mar 18 08:05:57 crc kubenswrapper[4917]: I0318 08:05:57.359229 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"669c480a-39bf-4e91-985b-528c87fa0129","Type":"ContainerStarted","Data":"8c41653e9be8466ee957f9c7f203e23f7ea1cb5017333737a8b73d42599e7725"} Mar 18 08:05:57 crc kubenswrapper[4917]: I0318 08:05:57.361971 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72","Type":"ContainerStarted","Data":"6ff88b70cdd299bfe738630f9732fe1aebbc1c9ea0d79d7ac3ec23bd9487376d"} Mar 18 08:05:57 crc kubenswrapper[4917]: I0318 08:05:57.773393 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:05:57 crc kubenswrapper[4917]: E0318 08:05:57.773749 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.162191 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563686-s8hgz"] Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.164697 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563686-s8hgz" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.169307 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.169340 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.170891 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563686-s8hgz"] Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.176395 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.267688 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4k9m\" (UniqueName: \"kubernetes.io/projected/7061c831-cc15-4ad1-8748-2327a5bea84e-kube-api-access-s4k9m\") pod \"auto-csr-approver-29563686-s8hgz\" (UID: \"7061c831-cc15-4ad1-8748-2327a5bea84e\") " pod="openshift-infra/auto-csr-approver-29563686-s8hgz" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.370148 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s4k9m\" (UniqueName: \"kubernetes.io/projected/7061c831-cc15-4ad1-8748-2327a5bea84e-kube-api-access-s4k9m\") pod \"auto-csr-approver-29563686-s8hgz\" (UID: \"7061c831-cc15-4ad1-8748-2327a5bea84e\") " pod="openshift-infra/auto-csr-approver-29563686-s8hgz" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.403935 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4k9m\" (UniqueName: \"kubernetes.io/projected/7061c831-cc15-4ad1-8748-2327a5bea84e-kube-api-access-s4k9m\") pod \"auto-csr-approver-29563686-s8hgz\" (UID: \"7061c831-cc15-4ad1-8748-2327a5bea84e\") " pod="openshift-infra/auto-csr-approver-29563686-s8hgz" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.500289 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563686-s8hgz" Mar 18 08:06:00 crc kubenswrapper[4917]: I0318 08:06:00.976063 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563686-s8hgz"] Mar 18 08:06:01 crc kubenswrapper[4917]: I0318 08:06:01.408919 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563686-s8hgz" event={"ID":"7061c831-cc15-4ad1-8748-2327a5bea84e","Type":"ContainerStarted","Data":"7133c0ff9bcbffa1ae92acec3824870b465302ca5c5ceb732f865baec99e9232"} Mar 18 08:06:03 crc kubenswrapper[4917]: I0318 08:06:03.427950 4917 generic.go:334] "Generic (PLEG): container finished" podID="7061c831-cc15-4ad1-8748-2327a5bea84e" containerID="fe111208ba0414c8f247d915a0bc05e53a8a3c35d4fda2007b4cb160eb94872f" exitCode=0 Mar 18 08:06:03 crc kubenswrapper[4917]: I0318 08:06:03.428306 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563686-s8hgz" event={"ID":"7061c831-cc15-4ad1-8748-2327a5bea84e","Type":"ContainerDied","Data":"fe111208ba0414c8f247d915a0bc05e53a8a3c35d4fda2007b4cb160eb94872f"} Mar 18 08:06:04 crc kubenswrapper[4917]: I0318 
08:06:04.869466 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563686-s8hgz" Mar 18 08:06:04 crc kubenswrapper[4917]: I0318 08:06:04.947387 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4k9m\" (UniqueName: \"kubernetes.io/projected/7061c831-cc15-4ad1-8748-2327a5bea84e-kube-api-access-s4k9m\") pod \"7061c831-cc15-4ad1-8748-2327a5bea84e\" (UID: \"7061c831-cc15-4ad1-8748-2327a5bea84e\") " Mar 18 08:06:04 crc kubenswrapper[4917]: I0318 08:06:04.955603 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7061c831-cc15-4ad1-8748-2327a5bea84e-kube-api-access-s4k9m" (OuterVolumeSpecName: "kube-api-access-s4k9m") pod "7061c831-cc15-4ad1-8748-2327a5bea84e" (UID: "7061c831-cc15-4ad1-8748-2327a5bea84e"). InnerVolumeSpecName "kube-api-access-s4k9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:06:05 crc kubenswrapper[4917]: I0318 08:06:05.049026 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4k9m\" (UniqueName: \"kubernetes.io/projected/7061c831-cc15-4ad1-8748-2327a5bea84e-kube-api-access-s4k9m\") on node \"crc\" DevicePath \"\"" Mar 18 08:06:05 crc kubenswrapper[4917]: I0318 08:06:05.453641 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563686-s8hgz" event={"ID":"7061c831-cc15-4ad1-8748-2327a5bea84e","Type":"ContainerDied","Data":"7133c0ff9bcbffa1ae92acec3824870b465302ca5c5ceb732f865baec99e9232"} Mar 18 08:06:05 crc kubenswrapper[4917]: I0318 08:06:05.453702 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7133c0ff9bcbffa1ae92acec3824870b465302ca5c5ceb732f865baec99e9232" Mar 18 08:06:05 crc kubenswrapper[4917]: I0318 08:06:05.453703 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563686-s8hgz" Mar 18 08:06:05 crc kubenswrapper[4917]: I0318 08:06:05.976074 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563680-blp4p"] Mar 18 08:06:05 crc kubenswrapper[4917]: I0318 08:06:05.984833 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563680-blp4p"] Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.712557 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7k2mb"] Mar 18 08:06:06 crc kubenswrapper[4917]: E0318 08:06:06.713437 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7061c831-cc15-4ad1-8748-2327a5bea84e" containerName="oc" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.713468 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7061c831-cc15-4ad1-8748-2327a5bea84e" containerName="oc" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.713772 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7061c831-cc15-4ad1-8748-2327a5bea84e" containerName="oc" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.718219 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.732437 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7k2mb"] Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.781247 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvx8t\" (UniqueName: \"kubernetes.io/projected/4f62924b-e50b-4fd9-ab16-65074a1f062f-kube-api-access-fvx8t\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.781284 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-catalog-content\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.781315 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-utilities\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.882903 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvx8t\" (UniqueName: \"kubernetes.io/projected/4f62924b-e50b-4fd9-ab16-65074a1f062f-kube-api-access-fvx8t\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.882962 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-catalog-content\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.883015 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-utilities\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.883691 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-utilities\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.883942 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-catalog-content\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:06 crc kubenswrapper[4917]: I0318 08:06:06.909106 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvx8t\" (UniqueName: \"kubernetes.io/projected/4f62924b-e50b-4fd9-ab16-65074a1f062f-kube-api-access-fvx8t\") pod \"redhat-operators-7k2mb\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:07 crc kubenswrapper[4917]: I0318 08:06:07.084942 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:07 crc kubenswrapper[4917]: I0318 08:06:07.544788 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7k2mb"] Mar 18 08:06:07 crc kubenswrapper[4917]: W0318 08:06:07.559836 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f62924b_e50b_4fd9_ab16_65074a1f062f.slice/crio-30959a376bf7a8c4c2f6125420e4219a2b6cde81b2eb0e295451a2c9a23f511e WatchSource:0}: Error finding container 30959a376bf7a8c4c2f6125420e4219a2b6cde81b2eb0e295451a2c9a23f511e: Status 404 returned error can't find the container with id 30959a376bf7a8c4c2f6125420e4219a2b6cde81b2eb0e295451a2c9a23f511e Mar 18 08:06:07 crc kubenswrapper[4917]: I0318 08:06:07.780325 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a5bb6b9-4df3-4211-8e19-7077e71eb072" path="/var/lib/kubelet/pods/1a5bb6b9-4df3-4211-8e19-7077e71eb072/volumes" Mar 18 08:06:08 crc kubenswrapper[4917]: I0318 08:06:08.486887 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerID="db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8" exitCode=0 Mar 18 08:06:08 crc kubenswrapper[4917]: I0318 08:06:08.486981 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k2mb" event={"ID":"4f62924b-e50b-4fd9-ab16-65074a1f062f","Type":"ContainerDied","Data":"db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8"} Mar 18 08:06:08 crc kubenswrapper[4917]: I0318 08:06:08.487194 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k2mb" event={"ID":"4f62924b-e50b-4fd9-ab16-65074a1f062f","Type":"ContainerStarted","Data":"30959a376bf7a8c4c2f6125420e4219a2b6cde81b2eb0e295451a2c9a23f511e"} Mar 18 08:06:09 crc kubenswrapper[4917]: I0318 08:06:09.500337 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k2mb" event={"ID":"4f62924b-e50b-4fd9-ab16-65074a1f062f","Type":"ContainerStarted","Data":"2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24"} Mar 18 08:06:09 crc kubenswrapper[4917]: I0318 08:06:09.772301 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:06:09 crc kubenswrapper[4917]: E0318 08:06:09.772633 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:06:10 crc kubenswrapper[4917]: I0318 08:06:10.514298 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerID="2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24" exitCode=0 Mar 18 08:06:10 crc kubenswrapper[4917]: I0318 08:06:10.514365 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k2mb" event={"ID":"4f62924b-e50b-4fd9-ab16-65074a1f062f","Type":"ContainerDied","Data":"2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24"} Mar 18 08:06:11 crc kubenswrapper[4917]: I0318 08:06:11.523978 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k2mb" event={"ID":"4f62924b-e50b-4fd9-ab16-65074a1f062f","Type":"ContainerStarted","Data":"72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d"} Mar 18 08:06:11 crc kubenswrapper[4917]: I0318 08:06:11.546288 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-7k2mb" podStartSLOduration=3.076298124 podStartE2EDuration="5.546269805s" podCreationTimestamp="2026-03-18 08:06:06 +0000 UTC" firstStartedPulling="2026-03-18 08:06:08.489471995 +0000 UTC m=+4753.430626719" lastFinishedPulling="2026-03-18 08:06:10.959443676 +0000 UTC m=+4755.900598400" observedRunningTime="2026-03-18 08:06:11.542482823 +0000 UTC m=+4756.483637537" watchObservedRunningTime="2026-03-18 08:06:11.546269805 +0000 UTC m=+4756.487424519" Mar 18 08:06:17 crc kubenswrapper[4917]: I0318 08:06:17.085362 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:17 crc kubenswrapper[4917]: I0318 08:06:17.086063 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:18 crc kubenswrapper[4917]: I0318 08:06:18.147075 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7k2mb" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="registry-server" probeResult="failure" output=< Mar 18 08:06:18 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:06:18 crc kubenswrapper[4917]: > Mar 18 08:06:20 crc kubenswrapper[4917]: I0318 08:06:20.773472 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:06:20 crc kubenswrapper[4917]: E0318 08:06:20.774282 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:06:27 crc kubenswrapper[4917]: 
I0318 08:06:27.161012 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:27 crc kubenswrapper[4917]: I0318 08:06:27.247562 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:27 crc kubenswrapper[4917]: I0318 08:06:27.418299 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7k2mb"] Mar 18 08:06:28 crc kubenswrapper[4917]: I0318 08:06:28.704885 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7k2mb" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="registry-server" containerID="cri-o://72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d" gracePeriod=2 Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.217812 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.411999 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvx8t\" (UniqueName: \"kubernetes.io/projected/4f62924b-e50b-4fd9-ab16-65074a1f062f-kube-api-access-fvx8t\") pod \"4f62924b-e50b-4fd9-ab16-65074a1f062f\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.412839 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-utilities\") pod \"4f62924b-e50b-4fd9-ab16-65074a1f062f\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.412903 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-catalog-content\") pod \"4f62924b-e50b-4fd9-ab16-65074a1f062f\" (UID: \"4f62924b-e50b-4fd9-ab16-65074a1f062f\") " Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.413821 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-utilities" (OuterVolumeSpecName: "utilities") pod "4f62924b-e50b-4fd9-ab16-65074a1f062f" (UID: "4f62924b-e50b-4fd9-ab16-65074a1f062f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.418614 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f62924b-e50b-4fd9-ab16-65074a1f062f-kube-api-access-fvx8t" (OuterVolumeSpecName: "kube-api-access-fvx8t") pod "4f62924b-e50b-4fd9-ab16-65074a1f062f" (UID: "4f62924b-e50b-4fd9-ab16-65074a1f062f"). InnerVolumeSpecName "kube-api-access-fvx8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.515985 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.516107 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvx8t\" (UniqueName: \"kubernetes.io/projected/4f62924b-e50b-4fd9-ab16-65074a1f062f-kube-api-access-fvx8t\") on node \"crc\" DevicePath \"\"" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.562092 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f62924b-e50b-4fd9-ab16-65074a1f062f" (UID: "4f62924b-e50b-4fd9-ab16-65074a1f062f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.617119 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f62924b-e50b-4fd9-ab16-65074a1f062f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.717152 4917 generic.go:334] "Generic (PLEG): container finished" podID="0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72" containerID="6ff88b70cdd299bfe738630f9732fe1aebbc1c9ea0d79d7ac3ec23bd9487376d" exitCode=0 Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.717274 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72","Type":"ContainerDied","Data":"6ff88b70cdd299bfe738630f9732fe1aebbc1c9ea0d79d7ac3ec23bd9487376d"} Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.726692 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerID="72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d" exitCode=0 Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.726745 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k2mb" event={"ID":"4f62924b-e50b-4fd9-ab16-65074a1f062f","Type":"ContainerDied","Data":"72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d"} Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.726787 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7k2mb" event={"ID":"4f62924b-e50b-4fd9-ab16-65074a1f062f","Type":"ContainerDied","Data":"30959a376bf7a8c4c2f6125420e4219a2b6cde81b2eb0e295451a2c9a23f511e"} Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.726791 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7k2mb" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.726816 4917 scope.go:117] "RemoveContainer" containerID="72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.782513 4917 scope.go:117] "RemoveContainer" containerID="2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.802518 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7k2mb"] Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.809865 4917 scope.go:117] "RemoveContainer" containerID="db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.816833 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7k2mb"] Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.828648 4917 scope.go:117] "RemoveContainer" containerID="72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d" Mar 18 08:06:29 crc kubenswrapper[4917]: E0318 08:06:29.829130 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d\": container with ID starting with 72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d not found: ID does not exist" containerID="72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.829158 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d"} err="failed to get container status \"72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d\": rpc error: code = NotFound desc = could not find container 
\"72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d\": container with ID starting with 72774252f3dc3c2f9851555a54dbdfd6d7cc12388f3be11fbde78f4984eab53d not found: ID does not exist" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.829180 4917 scope.go:117] "RemoveContainer" containerID="2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24" Mar 18 08:06:29 crc kubenswrapper[4917]: E0318 08:06:29.829916 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24\": container with ID starting with 2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24 not found: ID does not exist" containerID="2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.829938 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24"} err="failed to get container status \"2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24\": rpc error: code = NotFound desc = could not find container \"2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24\": container with ID starting with 2c6937da2c027522187de73472998baa77f69f32c375697c386175fe80baee24 not found: ID does not exist" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.829951 4917 scope.go:117] "RemoveContainer" containerID="db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8" Mar 18 08:06:29 crc kubenswrapper[4917]: E0318 08:06:29.830595 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8\": container with ID starting with db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8 not found: ID does not exist" 
containerID="db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8" Mar 18 08:06:29 crc kubenswrapper[4917]: I0318 08:06:29.830642 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8"} err="failed to get container status \"db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8\": rpc error: code = NotFound desc = could not find container \"db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8\": container with ID starting with db5042abf7ca551ede0ee3c493c6646d3cf1adc104a2dda81a94ccb3068e61f8 not found: ID does not exist" Mar 18 08:06:30 crc kubenswrapper[4917]: I0318 08:06:30.744487 4917 generic.go:334] "Generic (PLEG): container finished" podID="669c480a-39bf-4e91-985b-528c87fa0129" containerID="8c41653e9be8466ee957f9c7f203e23f7ea1cb5017333737a8b73d42599e7725" exitCode=0 Mar 18 08:06:30 crc kubenswrapper[4917]: I0318 08:06:30.744662 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"669c480a-39bf-4e91-985b-528c87fa0129","Type":"ContainerDied","Data":"8c41653e9be8466ee957f9c7f203e23f7ea1cb5017333737a8b73d42599e7725"} Mar 18 08:06:30 crc kubenswrapper[4917]: I0318 08:06:30.750794 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72","Type":"ContainerStarted","Data":"92df4fdd3f20025aba3d0fa1d865b32d22db576b4eee9c610e2f15b26f3ab08e"} Mar 18 08:06:30 crc kubenswrapper[4917]: I0318 08:06:30.751118 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:06:30 crc kubenswrapper[4917]: I0318 08:06:30.821767 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.821748083 podStartE2EDuration="37.821748083s" podCreationTimestamp="2026-03-18 08:05:53 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:06:30.80881155 +0000 UTC m=+4775.749966314" watchObservedRunningTime="2026-03-18 08:06:30.821748083 +0000 UTC m=+4775.762902807" Mar 18 08:06:31 crc kubenswrapper[4917]: I0318 08:06:31.763169 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"669c480a-39bf-4e91-985b-528c87fa0129","Type":"ContainerStarted","Data":"be7f52dd0ca06d69997fed294ad34242d58999e6fc97e899be6ffef8b546ec73"} Mar 18 08:06:31 crc kubenswrapper[4917]: I0318 08:06:31.763782 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 18 08:06:31 crc kubenswrapper[4917]: I0318 08:06:31.782524 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" path="/var/lib/kubelet/pods/4f62924b-e50b-4fd9-ab16-65074a1f062f/volumes" Mar 18 08:06:31 crc kubenswrapper[4917]: I0318 08:06:31.790782 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.790761938 podStartE2EDuration="38.790761938s" podCreationTimestamp="2026-03-18 08:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:06:31.78629972 +0000 UTC m=+4776.727454444" watchObservedRunningTime="2026-03-18 08:06:31.790761938 +0000 UTC m=+4776.731916652" Mar 18 08:06:33 crc kubenswrapper[4917]: I0318 08:06:33.772610 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:06:33 crc kubenswrapper[4917]: E0318 08:06:33.772872 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:06:44 crc kubenswrapper[4917]: I0318 08:06:44.116901 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 18 08:06:44 crc kubenswrapper[4917]: I0318 08:06:44.126854 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.624419 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 08:06:47 crc kubenswrapper[4917]: E0318 08:06:47.625196 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="extract-utilities" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.625209 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="extract-utilities" Mar 18 08:06:47 crc kubenswrapper[4917]: E0318 08:06:47.625223 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="extract-content" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.625232 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="extract-content" Mar 18 08:06:47 crc kubenswrapper[4917]: E0318 08:06:47.625262 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="registry-server" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.625268 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="registry-server" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 
08:06:47.625409 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f62924b-e50b-4fd9-ab16-65074a1f062f" containerName="registry-server" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.625931 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.630692 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-c568t" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.641862 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.704037 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tb2d\" (UniqueName: \"kubernetes.io/projected/958f92e0-2ce4-4f01-b854-def140d0a23e-kube-api-access-9tb2d\") pod \"mariadb-client\" (UID: \"958f92e0-2ce4-4f01-b854-def140d0a23e\") " pod="openstack/mariadb-client" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.773276 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:06:47 crc kubenswrapper[4917]: E0318 08:06:47.773801 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.806132 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tb2d\" (UniqueName: \"kubernetes.io/projected/958f92e0-2ce4-4f01-b854-def140d0a23e-kube-api-access-9tb2d\") pod 
\"mariadb-client\" (UID: \"958f92e0-2ce4-4f01-b854-def140d0a23e\") " pod="openstack/mariadb-client" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.833765 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tb2d\" (UniqueName: \"kubernetes.io/projected/958f92e0-2ce4-4f01-b854-def140d0a23e-kube-api-access-9tb2d\") pod \"mariadb-client\" (UID: \"958f92e0-2ce4-4f01-b854-def140d0a23e\") " pod="openstack/mariadb-client" Mar 18 08:06:47 crc kubenswrapper[4917]: I0318 08:06:47.956774 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:06:48 crc kubenswrapper[4917]: I0318 08:06:48.576286 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:06:48 crc kubenswrapper[4917]: W0318 08:06:48.579499 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod958f92e0_2ce4_4f01_b854_def140d0a23e.slice/crio-55f72a8d0ed400eea7779e6bb7e21680cb6e3ed3b96758ea61b7c046b75b9f23 WatchSource:0}: Error finding container 55f72a8d0ed400eea7779e6bb7e21680cb6e3ed3b96758ea61b7c046b75b9f23: Status 404 returned error can't find the container with id 55f72a8d0ed400eea7779e6bb7e21680cb6e3ed3b96758ea61b7c046b75b9f23 Mar 18 08:06:48 crc kubenswrapper[4917]: I0318 08:06:48.922888 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"958f92e0-2ce4-4f01-b854-def140d0a23e","Type":"ContainerStarted","Data":"55f72a8d0ed400eea7779e6bb7e21680cb6e3ed3b96758ea61b7c046b75b9f23"} Mar 18 08:06:49 crc kubenswrapper[4917]: I0318 08:06:49.933774 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"958f92e0-2ce4-4f01-b854-def140d0a23e","Type":"ContainerStarted","Data":"24fd371ad43a46eac5352ccc44df51e7a629b836a1acc620d39235ea332bb7e7"} Mar 18 08:06:49 crc kubenswrapper[4917]: I0318 08:06:49.956991 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.485708407 podStartE2EDuration="2.956956539s" podCreationTimestamp="2026-03-18 08:06:47 +0000 UTC" firstStartedPulling="2026-03-18 08:06:48.582620397 +0000 UTC m=+4793.523775131" lastFinishedPulling="2026-03-18 08:06:49.053868519 +0000 UTC m=+4793.995023263" observedRunningTime="2026-03-18 08:06:49.9487433 +0000 UTC m=+4794.889898034" watchObservedRunningTime="2026-03-18 08:06:49.956956539 +0000 UTC m=+4794.898111283" Mar 18 08:07:00 crc kubenswrapper[4917]: I0318 08:07:00.772791 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:07:00 crc kubenswrapper[4917]: E0318 08:07:00.773683 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:07:01 crc kubenswrapper[4917]: I0318 08:07:01.532144 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:07:01 crc kubenswrapper[4917]: I0318 08:07:01.532427 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="958f92e0-2ce4-4f01-b854-def140d0a23e" containerName="mariadb-client" containerID="cri-o://24fd371ad43a46eac5352ccc44df51e7a629b836a1acc620d39235ea332bb7e7" gracePeriod=30 Mar 18 08:07:02 crc kubenswrapper[4917]: I0318 08:07:02.046271 4917 generic.go:334] "Generic (PLEG): container finished" podID="958f92e0-2ce4-4f01-b854-def140d0a23e" containerID="24fd371ad43a46eac5352ccc44df51e7a629b836a1acc620d39235ea332bb7e7" exitCode=143 Mar 18 08:07:02 crc 
kubenswrapper[4917]: I0318 08:07:02.046754 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"958f92e0-2ce4-4f01-b854-def140d0a23e","Type":"ContainerDied","Data":"24fd371ad43a46eac5352ccc44df51e7a629b836a1acc620d39235ea332bb7e7"} Mar 18 08:07:02 crc kubenswrapper[4917]: I0318 08:07:02.046798 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"958f92e0-2ce4-4f01-b854-def140d0a23e","Type":"ContainerDied","Data":"55f72a8d0ed400eea7779e6bb7e21680cb6e3ed3b96758ea61b7c046b75b9f23"} Mar 18 08:07:02 crc kubenswrapper[4917]: I0318 08:07:02.046819 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55f72a8d0ed400eea7779e6bb7e21680cb6e3ed3b96758ea61b7c046b75b9f23" Mar 18 08:07:02 crc kubenswrapper[4917]: I0318 08:07:02.090528 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:07:02 crc kubenswrapper[4917]: I0318 08:07:02.144670 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tb2d\" (UniqueName: \"kubernetes.io/projected/958f92e0-2ce4-4f01-b854-def140d0a23e-kube-api-access-9tb2d\") pod \"958f92e0-2ce4-4f01-b854-def140d0a23e\" (UID: \"958f92e0-2ce4-4f01-b854-def140d0a23e\") " Mar 18 08:07:02 crc kubenswrapper[4917]: I0318 08:07:02.151810 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958f92e0-2ce4-4f01-b854-def140d0a23e-kube-api-access-9tb2d" (OuterVolumeSpecName: "kube-api-access-9tb2d") pod "958f92e0-2ce4-4f01-b854-def140d0a23e" (UID: "958f92e0-2ce4-4f01-b854-def140d0a23e"). InnerVolumeSpecName "kube-api-access-9tb2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:07:02 crc kubenswrapper[4917]: I0318 08:07:02.246986 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tb2d\" (UniqueName: \"kubernetes.io/projected/958f92e0-2ce4-4f01-b854-def140d0a23e-kube-api-access-9tb2d\") on node \"crc\" DevicePath \"\"" Mar 18 08:07:03 crc kubenswrapper[4917]: I0318 08:07:03.058295 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:07:03 crc kubenswrapper[4917]: I0318 08:07:03.114493 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:07:03 crc kubenswrapper[4917]: I0318 08:07:03.125004 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:07:03 crc kubenswrapper[4917]: I0318 08:07:03.789793 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="958f92e0-2ce4-4f01-b854-def140d0a23e" path="/var/lib/kubelet/pods/958f92e0-2ce4-4f01-b854-def140d0a23e/volumes" Mar 18 08:07:06 crc kubenswrapper[4917]: I0318 08:07:06.854090 4917 scope.go:117] "RemoveContainer" containerID="294971aa5e589627de978c972b9332a4c63c8c1bc1d3ae87458055cdf6a87cd7" Mar 18 08:07:15 crc kubenswrapper[4917]: I0318 08:07:15.782875 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:07:15 crc kubenswrapper[4917]: E0318 08:07:15.784172 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:07:28 crc kubenswrapper[4917]: I0318 08:07:28.772845 4917 scope.go:117] 
"RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:07:28 crc kubenswrapper[4917]: E0318 08:07:28.774039 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.319307 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fjnzj"] Mar 18 08:07:29 crc kubenswrapper[4917]: E0318 08:07:29.320259 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958f92e0-2ce4-4f01-b854-def140d0a23e" containerName="mariadb-client" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.320294 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="958f92e0-2ce4-4f01-b854-def140d0a23e" containerName="mariadb-client" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.320702 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="958f92e0-2ce4-4f01-b854-def140d0a23e" containerName="mariadb-client" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.323352 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.331618 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fjnzj"] Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.391286 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-utilities\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.391353 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjrjj\" (UniqueName: \"kubernetes.io/projected/e3786ffa-9e57-449e-8156-549b011d7fea-kube-api-access-sjrjj\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.391483 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-catalog-content\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.492629 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-catalog-content\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.492755 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-utilities\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.492790 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjrjj\" (UniqueName: \"kubernetes.io/projected/e3786ffa-9e57-449e-8156-549b011d7fea-kube-api-access-sjrjj\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.493389 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-catalog-content\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.493773 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-utilities\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.521643 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjrjj\" (UniqueName: \"kubernetes.io/projected/e3786ffa-9e57-449e-8156-549b011d7fea-kube-api-access-sjrjj\") pod \"certified-operators-fjnzj\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:29 crc kubenswrapper[4917]: I0318 08:07:29.655179 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:30 crc kubenswrapper[4917]: I0318 08:07:30.142814 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fjnzj"] Mar 18 08:07:30 crc kubenswrapper[4917]: I0318 08:07:30.334686 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjnzj" event={"ID":"e3786ffa-9e57-449e-8156-549b011d7fea","Type":"ContainerStarted","Data":"4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151"} Mar 18 08:07:30 crc kubenswrapper[4917]: I0318 08:07:30.334937 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjnzj" event={"ID":"e3786ffa-9e57-449e-8156-549b011d7fea","Type":"ContainerStarted","Data":"fd377a089114afee9198550bd492c8f3cca3b9f37b367da62ae036802167bf6c"} Mar 18 08:07:31 crc kubenswrapper[4917]: I0318 08:07:31.346332 4917 generic.go:334] "Generic (PLEG): container finished" podID="e3786ffa-9e57-449e-8156-549b011d7fea" containerID="4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151" exitCode=0 Mar 18 08:07:31 crc kubenswrapper[4917]: I0318 08:07:31.346453 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjnzj" event={"ID":"e3786ffa-9e57-449e-8156-549b011d7fea","Type":"ContainerDied","Data":"4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151"} Mar 18 08:07:35 crc kubenswrapper[4917]: I0318 08:07:35.390304 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjnzj" event={"ID":"e3786ffa-9e57-449e-8156-549b011d7fea","Type":"ContainerStarted","Data":"295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487"} Mar 18 08:07:36 crc kubenswrapper[4917]: I0318 08:07:36.402210 4917 generic.go:334] "Generic (PLEG): container finished" podID="e3786ffa-9e57-449e-8156-549b011d7fea" 
containerID="295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487" exitCode=0 Mar 18 08:07:36 crc kubenswrapper[4917]: I0318 08:07:36.402279 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjnzj" event={"ID":"e3786ffa-9e57-449e-8156-549b011d7fea","Type":"ContainerDied","Data":"295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487"} Mar 18 08:07:37 crc kubenswrapper[4917]: I0318 08:07:37.416382 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjnzj" event={"ID":"e3786ffa-9e57-449e-8156-549b011d7fea","Type":"ContainerStarted","Data":"6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a"} Mar 18 08:07:37 crc kubenswrapper[4917]: I0318 08:07:37.449830 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fjnzj" podStartSLOduration=1.8892895539999999 podStartE2EDuration="8.449812816s" podCreationTimestamp="2026-03-18 08:07:29 +0000 UTC" firstStartedPulling="2026-03-18 08:07:30.336165261 +0000 UTC m=+4835.277319975" lastFinishedPulling="2026-03-18 08:07:36.896688503 +0000 UTC m=+4841.837843237" observedRunningTime="2026-03-18 08:07:37.444400125 +0000 UTC m=+4842.385554859" watchObservedRunningTime="2026-03-18 08:07:37.449812816 +0000 UTC m=+4842.390967530" Mar 18 08:07:39 crc kubenswrapper[4917]: I0318 08:07:39.656430 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:39 crc kubenswrapper[4917]: I0318 08:07:39.656869 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:39 crc kubenswrapper[4917]: I0318 08:07:39.728340 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:40 crc kubenswrapper[4917]: I0318 
08:07:40.773296 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:07:40 crc kubenswrapper[4917]: E0318 08:07:40.773773 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:07:49 crc kubenswrapper[4917]: I0318 08:07:49.724861 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:49 crc kubenswrapper[4917]: I0318 08:07:49.784150 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fjnzj"] Mar 18 08:07:50 crc kubenswrapper[4917]: I0318 08:07:50.536681 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fjnzj" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" containerName="registry-server" containerID="cri-o://6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a" gracePeriod=2 Mar 18 08:07:50 crc kubenswrapper[4917]: I0318 08:07:50.996287 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.140065 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-utilities\") pod \"e3786ffa-9e57-449e-8156-549b011d7fea\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.140165 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjrjj\" (UniqueName: \"kubernetes.io/projected/e3786ffa-9e57-449e-8156-549b011d7fea-kube-api-access-sjrjj\") pod \"e3786ffa-9e57-449e-8156-549b011d7fea\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.140346 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-catalog-content\") pod \"e3786ffa-9e57-449e-8156-549b011d7fea\" (UID: \"e3786ffa-9e57-449e-8156-549b011d7fea\") " Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.142084 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-utilities" (OuterVolumeSpecName: "utilities") pod "e3786ffa-9e57-449e-8156-549b011d7fea" (UID: "e3786ffa-9e57-449e-8156-549b011d7fea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.153356 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3786ffa-9e57-449e-8156-549b011d7fea-kube-api-access-sjrjj" (OuterVolumeSpecName: "kube-api-access-sjrjj") pod "e3786ffa-9e57-449e-8156-549b011d7fea" (UID: "e3786ffa-9e57-449e-8156-549b011d7fea"). InnerVolumeSpecName "kube-api-access-sjrjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.208271 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3786ffa-9e57-449e-8156-549b011d7fea" (UID: "e3786ffa-9e57-449e-8156-549b011d7fea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.243230 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.243278 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3786ffa-9e57-449e-8156-549b011d7fea-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.243297 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjrjj\" (UniqueName: \"kubernetes.io/projected/e3786ffa-9e57-449e-8156-549b011d7fea-kube-api-access-sjrjj\") on node \"crc\" DevicePath \"\"" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.550212 4917 generic.go:334] "Generic (PLEG): container finished" podID="e3786ffa-9e57-449e-8156-549b011d7fea" containerID="6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a" exitCode=0 Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.550307 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fjnzj" event={"ID":"e3786ffa-9e57-449e-8156-549b011d7fea","Type":"ContainerDied","Data":"6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a"} Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.550631 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-fjnzj" event={"ID":"e3786ffa-9e57-449e-8156-549b011d7fea","Type":"ContainerDied","Data":"fd377a089114afee9198550bd492c8f3cca3b9f37b367da62ae036802167bf6c"} Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.550672 4917 scope.go:117] "RemoveContainer" containerID="6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.550340 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fjnzj" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.580191 4917 scope.go:117] "RemoveContainer" containerID="295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.614086 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fjnzj"] Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.620457 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fjnzj"] Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.635721 4917 scope.go:117] "RemoveContainer" containerID="4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.665506 4917 scope.go:117] "RemoveContainer" containerID="6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a" Mar 18 08:07:51 crc kubenswrapper[4917]: E0318 08:07:51.666204 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a\": container with ID starting with 6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a not found: ID does not exist" containerID="6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 
08:07:51.666255 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a"} err="failed to get container status \"6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a\": rpc error: code = NotFound desc = could not find container \"6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a\": container with ID starting with 6ea9fbed4ece243bd778d01ff930fca4b259902d8e376be88c155d090b9a075a not found: ID does not exist" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.666287 4917 scope.go:117] "RemoveContainer" containerID="295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487" Mar 18 08:07:51 crc kubenswrapper[4917]: E0318 08:07:51.667222 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487\": container with ID starting with 295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487 not found: ID does not exist" containerID="295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.667316 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487"} err="failed to get container status \"295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487\": rpc error: code = NotFound desc = could not find container \"295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487\": container with ID starting with 295bcb3b5ab0d1060ff35ffc9761def3ea9e8c2bd8e869f97ca0ceaecc69d487 not found: ID does not exist" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.667386 4917 scope.go:117] "RemoveContainer" containerID="4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151" Mar 18 08:07:51 crc 
kubenswrapper[4917]: E0318 08:07:51.669029 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151\": container with ID starting with 4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151 not found: ID does not exist" containerID="4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.669072 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151"} err="failed to get container status \"4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151\": rpc error: code = NotFound desc = could not find container \"4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151\": container with ID starting with 4575a0f35d464fc8d6cd27ee4ef2ea059700e4c00a6780c80d973f06747c4151 not found: ID does not exist" Mar 18 08:07:51 crc kubenswrapper[4917]: I0318 08:07:51.788772 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" path="/var/lib/kubelet/pods/e3786ffa-9e57-449e-8156-549b011d7fea/volumes" Mar 18 08:07:52 crc kubenswrapper[4917]: I0318 08:07:52.773283 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:07:52 crc kubenswrapper[4917]: E0318 08:07:52.773881 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:08:00 crc 
kubenswrapper[4917]: I0318 08:08:00.152351 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563688-526t7"] Mar 18 08:08:00 crc kubenswrapper[4917]: E0318 08:08:00.153215 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" containerName="extract-utilities" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.153231 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" containerName="extract-utilities" Mar 18 08:08:00 crc kubenswrapper[4917]: E0318 08:08:00.153287 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" containerName="registry-server" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.153297 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" containerName="registry-server" Mar 18 08:08:00 crc kubenswrapper[4917]: E0318 08:08:00.153317 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" containerName="extract-content" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.153325 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" containerName="extract-content" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.153502 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3786ffa-9e57-449e-8156-549b011d7fea" containerName="registry-server" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.154257 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563688-526t7" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.157446 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.157494 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.157835 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.162095 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563688-526t7"] Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.298205 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86nhk\" (UniqueName: \"kubernetes.io/projected/730d4203-a2f5-4f9d-82aa-7d06e92c6520-kube-api-access-86nhk\") pod \"auto-csr-approver-29563688-526t7\" (UID: \"730d4203-a2f5-4f9d-82aa-7d06e92c6520\") " pod="openshift-infra/auto-csr-approver-29563688-526t7" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.399386 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86nhk\" (UniqueName: \"kubernetes.io/projected/730d4203-a2f5-4f9d-82aa-7d06e92c6520-kube-api-access-86nhk\") pod \"auto-csr-approver-29563688-526t7\" (UID: \"730d4203-a2f5-4f9d-82aa-7d06e92c6520\") " pod="openshift-infra/auto-csr-approver-29563688-526t7" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.432354 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86nhk\" (UniqueName: \"kubernetes.io/projected/730d4203-a2f5-4f9d-82aa-7d06e92c6520-kube-api-access-86nhk\") pod \"auto-csr-approver-29563688-526t7\" (UID: \"730d4203-a2f5-4f9d-82aa-7d06e92c6520\") " 
pod="openshift-infra/auto-csr-approver-29563688-526t7" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.470866 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563688-526t7" Mar 18 08:08:00 crc kubenswrapper[4917]: I0318 08:08:00.792072 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563688-526t7"] Mar 18 08:08:01 crc kubenswrapper[4917]: I0318 08:08:01.654703 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563688-526t7" event={"ID":"730d4203-a2f5-4f9d-82aa-7d06e92c6520","Type":"ContainerStarted","Data":"685d9ba59592127e943d0c2b693d988f662dc0d25abd255b28688e2cd905e35c"} Mar 18 08:08:02 crc kubenswrapper[4917]: I0318 08:08:02.668049 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563688-526t7" event={"ID":"730d4203-a2f5-4f9d-82aa-7d06e92c6520","Type":"ContainerStarted","Data":"2d8c9f61990d5db44b16d3ca04a1ab5996fae555fa549183e3785c24ea2e8fb0"} Mar 18 08:08:02 crc kubenswrapper[4917]: I0318 08:08:02.690175 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563688-526t7" podStartSLOduration=1.276633881 podStartE2EDuration="2.690143472s" podCreationTimestamp="2026-03-18 08:08:00 +0000 UTC" firstStartedPulling="2026-03-18 08:08:00.801796486 +0000 UTC m=+4865.742951210" lastFinishedPulling="2026-03-18 08:08:02.215306057 +0000 UTC m=+4867.156460801" observedRunningTime="2026-03-18 08:08:02.68403658 +0000 UTC m=+4867.625191324" watchObservedRunningTime="2026-03-18 08:08:02.690143472 +0000 UTC m=+4867.631298226" Mar 18 08:08:03 crc kubenswrapper[4917]: I0318 08:08:03.675951 4917 generic.go:334] "Generic (PLEG): container finished" podID="730d4203-a2f5-4f9d-82aa-7d06e92c6520" containerID="2d8c9f61990d5db44b16d3ca04a1ab5996fae555fa549183e3785c24ea2e8fb0" exitCode=0 Mar 18 08:08:03 crc 
kubenswrapper[4917]: I0318 08:08:03.676044 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563688-526t7" event={"ID":"730d4203-a2f5-4f9d-82aa-7d06e92c6520","Type":"ContainerDied","Data":"2d8c9f61990d5db44b16d3ca04a1ab5996fae555fa549183e3785c24ea2e8fb0"} Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.072152 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563688-526t7" Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.174434 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86nhk\" (UniqueName: \"kubernetes.io/projected/730d4203-a2f5-4f9d-82aa-7d06e92c6520-kube-api-access-86nhk\") pod \"730d4203-a2f5-4f9d-82aa-7d06e92c6520\" (UID: \"730d4203-a2f5-4f9d-82aa-7d06e92c6520\") " Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.184858 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/730d4203-a2f5-4f9d-82aa-7d06e92c6520-kube-api-access-86nhk" (OuterVolumeSpecName: "kube-api-access-86nhk") pod "730d4203-a2f5-4f9d-82aa-7d06e92c6520" (UID: "730d4203-a2f5-4f9d-82aa-7d06e92c6520"). InnerVolumeSpecName "kube-api-access-86nhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.276890 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86nhk\" (UniqueName: \"kubernetes.io/projected/730d4203-a2f5-4f9d-82aa-7d06e92c6520-kube-api-access-86nhk\") on node \"crc\" DevicePath \"\"" Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.705044 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563688-526t7" event={"ID":"730d4203-a2f5-4f9d-82aa-7d06e92c6520","Type":"ContainerDied","Data":"685d9ba59592127e943d0c2b693d988f662dc0d25abd255b28688e2cd905e35c"} Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.705090 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="685d9ba59592127e943d0c2b693d988f662dc0d25abd255b28688e2cd905e35c" Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.705153 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563688-526t7" Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.755951 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563682-mgm58"] Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.760929 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563682-mgm58"] Mar 18 08:08:05 crc kubenswrapper[4917]: I0318 08:08:05.780967 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b289f9a-33e9-449f-9ddb-5fdd1d92b79f" path="/var/lib/kubelet/pods/7b289f9a-33e9-449f-9ddb-5fdd1d92b79f/volumes" Mar 18 08:08:06 crc kubenswrapper[4917]: I0318 08:08:06.985250 4917 scope.go:117] "RemoveContainer" containerID="44ccfa5f00f482cb279c279053272c502518cb70f2a28df13579dcd6ec0c22ac" Mar 18 08:08:07 crc kubenswrapper[4917]: I0318 08:08:07.008575 4917 scope.go:117] "RemoveContainer" 
containerID="b760d7516f5496e065370afe109fa37cdb48790ec80da757afa71f0d60e4e792" Mar 18 08:08:07 crc kubenswrapper[4917]: I0318 08:08:07.773645 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:08:07 crc kubenswrapper[4917]: E0318 08:08:07.774271 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:08:22 crc kubenswrapper[4917]: I0318 08:08:22.773877 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:08:22 crc kubenswrapper[4917]: E0318 08:08:22.775090 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:08:37 crc kubenswrapper[4917]: I0318 08:08:37.773032 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:08:37 crc kubenswrapper[4917]: E0318 08:08:37.774444 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:08:51 crc kubenswrapper[4917]: I0318 08:08:51.773497 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:08:51 crc kubenswrapper[4917]: E0318 08:08:51.774562 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:09:02 crc kubenswrapper[4917]: I0318 08:09:02.772526 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:09:02 crc kubenswrapper[4917]: E0318 08:09:02.773416 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:09:17 crc kubenswrapper[4917]: I0318 08:09:17.773698 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:09:17 crc kubenswrapper[4917]: E0318 08:09:17.774762 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:09:32 crc kubenswrapper[4917]: I0318 08:09:32.772332 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:09:32 crc kubenswrapper[4917]: E0318 08:09:32.773294 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:09:46 crc kubenswrapper[4917]: I0318 08:09:46.772551 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:09:47 crc kubenswrapper[4917]: I0318 08:09:47.652057 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"4857c0f586b258498a2f24a9ee2cbd7165480a218adc673893556e562f68bc3b"} Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.164665 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563690-lfwvt"] Mar 18 08:10:00 crc kubenswrapper[4917]: E0318 08:10:00.167276 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730d4203-a2f5-4f9d-82aa-7d06e92c6520" containerName="oc" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.167441 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="730d4203-a2f5-4f9d-82aa-7d06e92c6520" containerName="oc" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 
08:10:00.167856 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="730d4203-a2f5-4f9d-82aa-7d06e92c6520" containerName="oc" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.168793 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.171963 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.180416 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.180446 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.181516 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563690-lfwvt"] Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.318367 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fvw\" (UniqueName: \"kubernetes.io/projected/f213ddc0-297e-4e41-98bb-92ce80a5f87e-kube-api-access-f6fvw\") pod \"auto-csr-approver-29563690-lfwvt\" (UID: \"f213ddc0-297e-4e41-98bb-92ce80a5f87e\") " pod="openshift-infra/auto-csr-approver-29563690-lfwvt" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.420961 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fvw\" (UniqueName: \"kubernetes.io/projected/f213ddc0-297e-4e41-98bb-92ce80a5f87e-kube-api-access-f6fvw\") pod \"auto-csr-approver-29563690-lfwvt\" (UID: \"f213ddc0-297e-4e41-98bb-92ce80a5f87e\") " pod="openshift-infra/auto-csr-approver-29563690-lfwvt" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.454263 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f6fvw\" (UniqueName: \"kubernetes.io/projected/f213ddc0-297e-4e41-98bb-92ce80a5f87e-kube-api-access-f6fvw\") pod \"auto-csr-approver-29563690-lfwvt\" (UID: \"f213ddc0-297e-4e41-98bb-92ce80a5f87e\") " pod="openshift-infra/auto-csr-approver-29563690-lfwvt" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.504864 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.865554 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563690-lfwvt"] Mar 18 08:10:00 crc kubenswrapper[4917]: I0318 08:10:00.889426 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:10:01 crc kubenswrapper[4917]: I0318 08:10:01.792513 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" event={"ID":"f213ddc0-297e-4e41-98bb-92ce80a5f87e","Type":"ContainerStarted","Data":"78945be6cce6f489de727ece706306be07f48d30477ab4f6ed141cb661a483e4"} Mar 18 08:10:02 crc kubenswrapper[4917]: I0318 08:10:02.806259 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" event={"ID":"f213ddc0-297e-4e41-98bb-92ce80a5f87e","Type":"ContainerStarted","Data":"b2b9ee93147aaef7294948f17ddf6b8ca090121f8e263aa9ced6ebaf2b0c2fa6"} Mar 18 08:10:02 crc kubenswrapper[4917]: I0318 08:10:02.839504 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" podStartSLOduration=1.636600767 podStartE2EDuration="2.839472692s" podCreationTimestamp="2026-03-18 08:10:00 +0000 UTC" firstStartedPulling="2026-03-18 08:10:00.889035006 +0000 UTC m=+4985.830189760" lastFinishedPulling="2026-03-18 08:10:02.091906961 +0000 UTC m=+4987.033061685" observedRunningTime="2026-03-18 
08:10:02.831076858 +0000 UTC m=+4987.772231602" watchObservedRunningTime="2026-03-18 08:10:02.839472692 +0000 UTC m=+4987.780627456" Mar 18 08:10:02 crc kubenswrapper[4917]: I0318 08:10:02.866780 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 08:10:02 crc kubenswrapper[4917]: I0318 08:10:02.868824 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 08:10:02 crc kubenswrapper[4917]: I0318 08:10:02.874056 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 08:10:02 crc kubenswrapper[4917]: I0318 08:10:02.874216 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-c568t" Mar 18 08:10:02 crc kubenswrapper[4917]: I0318 08:10:02.964099 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\") pod \"mariadb-copy-data\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") " pod="openstack/mariadb-copy-data" Mar 18 08:10:02 crc kubenswrapper[4917]: I0318 08:10:02.964161 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znh9b\" (UniqueName: \"kubernetes.io/projected/8b77e881-8795-4449-bedf-625bdc184ff2-kube-api-access-znh9b\") pod \"mariadb-copy-data\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") " pod="openstack/mariadb-copy-data" Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.066048 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\") pod \"mariadb-copy-data\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") " 
pod="openstack/mariadb-copy-data" Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.066144 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znh9b\" (UniqueName: \"kubernetes.io/projected/8b77e881-8795-4449-bedf-625bdc184ff2-kube-api-access-znh9b\") pod \"mariadb-copy-data\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") " pod="openstack/mariadb-copy-data" Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.069894 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.069951 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\") pod \"mariadb-copy-data\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bcb131c187ae8cb472b24426a6796abd5573eb50db3614ec83024aa1b78cdd70/globalmount\"" pod="openstack/mariadb-copy-data" Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.133669 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znh9b\" (UniqueName: \"kubernetes.io/projected/8b77e881-8795-4449-bedf-625bdc184ff2-kube-api-access-znh9b\") pod \"mariadb-copy-data\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") " pod="openstack/mariadb-copy-data" Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.152825 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\") pod \"mariadb-copy-data\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") " pod="openstack/mariadb-copy-data" Mar 18 08:10:03 crc 
kubenswrapper[4917]: I0318 08:10:03.195205 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.817282 4917 generic.go:334] "Generic (PLEG): container finished" podID="f213ddc0-297e-4e41-98bb-92ce80a5f87e" containerID="b2b9ee93147aaef7294948f17ddf6b8ca090121f8e263aa9ced6ebaf2b0c2fa6" exitCode=0 Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.818257 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" event={"ID":"f213ddc0-297e-4e41-98bb-92ce80a5f87e","Type":"ContainerDied","Data":"b2b9ee93147aaef7294948f17ddf6b8ca090121f8e263aa9ced6ebaf2b0c2fa6"} Mar 18 08:10:03 crc kubenswrapper[4917]: I0318 08:10:03.882565 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 08:10:04 crc kubenswrapper[4917]: I0318 08:10:04.829771 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"8b77e881-8795-4449-bedf-625bdc184ff2","Type":"ContainerStarted","Data":"572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559"} Mar 18 08:10:04 crc kubenswrapper[4917]: I0318 08:10:04.830362 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"8b77e881-8795-4449-bedf-625bdc184ff2","Type":"ContainerStarted","Data":"4ee370d0eae53aef4c597b8f76da6d72e0fd7406a13dc61fc452f98bd7715a1e"} Mar 18 08:10:04 crc kubenswrapper[4917]: I0318 08:10:04.864260 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.864226973 podStartE2EDuration="3.864226973s" podCreationTimestamp="2026-03-18 08:10:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:10:04.856093964 +0000 UTC m=+4989.797248768" 
watchObservedRunningTime="2026-03-18 08:10:04.864226973 +0000 UTC m=+4989.805381777" Mar 18 08:10:05 crc kubenswrapper[4917]: I0318 08:10:05.431866 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" Mar 18 08:10:05 crc kubenswrapper[4917]: I0318 08:10:05.604869 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6fvw\" (UniqueName: \"kubernetes.io/projected/f213ddc0-297e-4e41-98bb-92ce80a5f87e-kube-api-access-f6fvw\") pod \"f213ddc0-297e-4e41-98bb-92ce80a5f87e\" (UID: \"f213ddc0-297e-4e41-98bb-92ce80a5f87e\") " Mar 18 08:10:05 crc kubenswrapper[4917]: I0318 08:10:05.610651 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f213ddc0-297e-4e41-98bb-92ce80a5f87e-kube-api-access-f6fvw" (OuterVolumeSpecName: "kube-api-access-f6fvw") pod "f213ddc0-297e-4e41-98bb-92ce80a5f87e" (UID: "f213ddc0-297e-4e41-98bb-92ce80a5f87e"). InnerVolumeSpecName "kube-api-access-f6fvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:10:05 crc kubenswrapper[4917]: I0318 08:10:05.707449 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6fvw\" (UniqueName: \"kubernetes.io/projected/f213ddc0-297e-4e41-98bb-92ce80a5f87e-kube-api-access-f6fvw\") on node \"crc\" DevicePath \"\"" Mar 18 08:10:05 crc kubenswrapper[4917]: I0318 08:10:05.842955 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" event={"ID":"f213ddc0-297e-4e41-98bb-92ce80a5f87e","Type":"ContainerDied","Data":"78945be6cce6f489de727ece706306be07f48d30477ab4f6ed141cb661a483e4"} Mar 18 08:10:05 crc kubenswrapper[4917]: I0318 08:10:05.842991 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78945be6cce6f489de727ece706306be07f48d30477ab4f6ed141cb661a483e4" Mar 18 08:10:05 crc kubenswrapper[4917]: I0318 08:10:05.845078 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563690-lfwvt" Mar 18 08:10:06 crc kubenswrapper[4917]: I0318 08:10:06.507008 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563684-gvwjl"] Mar 18 08:10:06 crc kubenswrapper[4917]: I0318 08:10:06.519058 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563684-gvwjl"] Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.434345 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:07 crc kubenswrapper[4917]: E0318 08:10:07.435949 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f213ddc0-297e-4e41-98bb-92ce80a5f87e" containerName="oc" Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.436149 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f213ddc0-297e-4e41-98bb-92ce80a5f87e" containerName="oc" Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 
08:10:07.439319 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f213ddc0-297e-4e41-98bb-92ce80a5f87e" containerName="oc" Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.440426 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.447664 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.540678 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqhq\" (UniqueName: \"kubernetes.io/projected/acb4fe96-cb49-44ff-8e8e-71cd2e14d300-kube-api-access-8wqhq\") pod \"mariadb-client\" (UID: \"acb4fe96-cb49-44ff-8e8e-71cd2e14d300\") " pod="openstack/mariadb-client" Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.643821 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqhq\" (UniqueName: \"kubernetes.io/projected/acb4fe96-cb49-44ff-8e8e-71cd2e14d300-kube-api-access-8wqhq\") pod \"mariadb-client\" (UID: \"acb4fe96-cb49-44ff-8e8e-71cd2e14d300\") " pod="openstack/mariadb-client" Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.669653 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqhq\" (UniqueName: \"kubernetes.io/projected/acb4fe96-cb49-44ff-8e8e-71cd2e14d300-kube-api-access-8wqhq\") pod \"mariadb-client\" (UID: \"acb4fe96-cb49-44ff-8e8e-71cd2e14d300\") " pod="openstack/mariadb-client" Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.787398 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4892baad-c970-4cbe-b214-aa0b925a071c" path="/var/lib/kubelet/pods/4892baad-c970-4cbe-b214-aa0b925a071c/volumes" Mar 18 08:10:07 crc kubenswrapper[4917]: I0318 08:10:07.794631 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:10:08 crc kubenswrapper[4917]: I0318 08:10:08.297200 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:08 crc kubenswrapper[4917]: I0318 08:10:08.882111 4917 generic.go:334] "Generic (PLEG): container finished" podID="acb4fe96-cb49-44ff-8e8e-71cd2e14d300" containerID="2e2c7ee69e1792ddb81da77d1ab7f978e0fa504fc36a5afe536ecba60f615e3c" exitCode=0 Mar 18 08:10:08 crc kubenswrapper[4917]: I0318 08:10:08.882154 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"acb4fe96-cb49-44ff-8e8e-71cd2e14d300","Type":"ContainerDied","Data":"2e2c7ee69e1792ddb81da77d1ab7f978e0fa504fc36a5afe536ecba60f615e3c"} Mar 18 08:10:08 crc kubenswrapper[4917]: I0318 08:10:08.882184 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"acb4fe96-cb49-44ff-8e8e-71cd2e14d300","Type":"ContainerStarted","Data":"4758a93e53468f4a9930f88576fcc586e904f63d5fb9d2f1be873838206e522c"} Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.323037 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.348207 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_acb4fe96-cb49-44ff-8e8e-71cd2e14d300/mariadb-client/0.log" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.373220 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.377936 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.385969 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wqhq\" (UniqueName: \"kubernetes.io/projected/acb4fe96-cb49-44ff-8e8e-71cd2e14d300-kube-api-access-8wqhq\") pod \"acb4fe96-cb49-44ff-8e8e-71cd2e14d300\" (UID: \"acb4fe96-cb49-44ff-8e8e-71cd2e14d300\") " Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.391863 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb4fe96-cb49-44ff-8e8e-71cd2e14d300-kube-api-access-8wqhq" (OuterVolumeSpecName: "kube-api-access-8wqhq") pod "acb4fe96-cb49-44ff-8e8e-71cd2e14d300" (UID: "acb4fe96-cb49-44ff-8e8e-71cd2e14d300"). InnerVolumeSpecName "kube-api-access-8wqhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.487716 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wqhq\" (UniqueName: \"kubernetes.io/projected/acb4fe96-cb49-44ff-8e8e-71cd2e14d300-kube-api-access-8wqhq\") on node \"crc\" DevicePath \"\"" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.488836 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:10 crc kubenswrapper[4917]: E0318 08:10:10.489156 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb4fe96-cb49-44ff-8e8e-71cd2e14d300" containerName="mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.489177 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb4fe96-cb49-44ff-8e8e-71cd2e14d300" containerName="mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.489374 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb4fe96-cb49-44ff-8e8e-71cd2e14d300" containerName="mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.489876 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.495875 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.589206 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lng9k\" (UniqueName: \"kubernetes.io/projected/29f70c95-2693-453a-a278-72f470d223c4-kube-api-access-lng9k\") pod \"mariadb-client\" (UID: \"29f70c95-2693-453a-a278-72f470d223c4\") " pod="openstack/mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.690213 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lng9k\" (UniqueName: \"kubernetes.io/projected/29f70c95-2693-453a-a278-72f470d223c4-kube-api-access-lng9k\") pod \"mariadb-client\" (UID: \"29f70c95-2693-453a-a278-72f470d223c4\") " pod="openstack/mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.718421 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lng9k\" (UniqueName: \"kubernetes.io/projected/29f70c95-2693-453a-a278-72f470d223c4-kube-api-access-lng9k\") pod \"mariadb-client\" (UID: \"29f70c95-2693-453a-a278-72f470d223c4\") " pod="openstack/mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.858929 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.903149 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4758a93e53468f4a9930f88576fcc586e904f63d5fb9d2f1be873838206e522c" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.903230 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:10:10 crc kubenswrapper[4917]: I0318 08:10:10.948437 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="acb4fe96-cb49-44ff-8e8e-71cd2e14d300" podUID="29f70c95-2693-453a-a278-72f470d223c4" Mar 18 08:10:11 crc kubenswrapper[4917]: I0318 08:10:11.158620 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:11 crc kubenswrapper[4917]: W0318 08:10:11.163427 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29f70c95_2693_453a_a278_72f470d223c4.slice/crio-01fc94efcb59c9610bcc22a27cba7c6c6faa0616d932a60c7655fb0a31b05cf5 WatchSource:0}: Error finding container 01fc94efcb59c9610bcc22a27cba7c6c6faa0616d932a60c7655fb0a31b05cf5: Status 404 returned error can't find the container with id 01fc94efcb59c9610bcc22a27cba7c6c6faa0616d932a60c7655fb0a31b05cf5 Mar 18 08:10:11 crc kubenswrapper[4917]: I0318 08:10:11.783066 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb4fe96-cb49-44ff-8e8e-71cd2e14d300" path="/var/lib/kubelet/pods/acb4fe96-cb49-44ff-8e8e-71cd2e14d300/volumes" Mar 18 08:10:11 crc kubenswrapper[4917]: I0318 08:10:11.931822 4917 generic.go:334] "Generic (PLEG): container finished" podID="29f70c95-2693-453a-a278-72f470d223c4" containerID="93479ceea2d23c8398d3da7eb6b61ee203c528ffb8e6945f5299b5e0ad33c678" exitCode=0 Mar 18 08:10:11 crc kubenswrapper[4917]: I0318 08:10:11.931948 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"29f70c95-2693-453a-a278-72f470d223c4","Type":"ContainerDied","Data":"93479ceea2d23c8398d3da7eb6b61ee203c528ffb8e6945f5299b5e0ad33c678"} Mar 18 08:10:11 crc kubenswrapper[4917]: I0318 08:10:11.932291 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"29f70c95-2693-453a-a278-72f470d223c4","Type":"ContainerStarted","Data":"01fc94efcb59c9610bcc22a27cba7c6c6faa0616d932a60c7655fb0a31b05cf5"} Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.316963 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.337961 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_29f70c95-2693-453a-a278-72f470d223c4/mariadb-client/0.log" Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.366342 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.372295 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.438480 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lng9k\" (UniqueName: \"kubernetes.io/projected/29f70c95-2693-453a-a278-72f470d223c4-kube-api-access-lng9k\") pod \"29f70c95-2693-453a-a278-72f470d223c4\" (UID: \"29f70c95-2693-453a-a278-72f470d223c4\") " Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.445034 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f70c95-2693-453a-a278-72f470d223c4-kube-api-access-lng9k" (OuterVolumeSpecName: "kube-api-access-lng9k") pod "29f70c95-2693-453a-a278-72f470d223c4" (UID: "29f70c95-2693-453a-a278-72f470d223c4"). InnerVolumeSpecName "kube-api-access-lng9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.541174 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lng9k\" (UniqueName: \"kubernetes.io/projected/29f70c95-2693-453a-a278-72f470d223c4-kube-api-access-lng9k\") on node \"crc\" DevicePath \"\"" Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.784055 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f70c95-2693-453a-a278-72f470d223c4" path="/var/lib/kubelet/pods/29f70c95-2693-453a-a278-72f470d223c4/volumes" Mar 18 08:10:13 crc kubenswrapper[4917]: E0318 08:10:13.946471 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29f70c95_2693_453a_a278_72f470d223c4.slice\": RecentStats: unable to find data in memory cache]" Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.952826 4917 scope.go:117] "RemoveContainer" containerID="93479ceea2d23c8398d3da7eb6b61ee203c528ffb8e6945f5299b5e0ad33c678" Mar 18 08:10:13 crc kubenswrapper[4917]: I0318 08:10:13.952912 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.030341 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 08:10:47 crc kubenswrapper[4917]: E0318 08:10:47.032782 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f70c95-2693-453a-a278-72f470d223c4" containerName="mariadb-client" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.032827 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f70c95-2693-453a-a278-72f470d223c4" containerName="mariadb-client" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.033173 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="29f70c95-2693-453a-a278-72f470d223c4" containerName="mariadb-client" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.034830 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.041751 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.042106 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jtsll" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.042114 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.042385 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.042506 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.050293 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 08:10:47 
crc kubenswrapper[4917]: I0318 08:10:47.053458 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.070148 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.085790 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.089476 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.116153 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.142842 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-86a6d6ac-55a5-42d2-9323-17f88d5f8708\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86a6d6ac-55a5-42d2-9323-17f88d5f8708\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.142901 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/648fc31e-cfb7-4852-89ab-88296601d71d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.142939 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/648fc31e-cfb7-4852-89ab-88296601d71d-config\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 
08:10:47.142988 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143028 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c5f1b26e-d478-410d-93a1-4bd22ce2358a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5f1b26e-d478-410d-93a1-4bd22ce2358a\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143052 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143077 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143107 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55gt\" (UniqueName: \"kubernetes.io/projected/9023a312-69d0-4d07-bd00-aadc2d360175-kube-api-access-k55gt\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc 
kubenswrapper[4917]: I0318 08:10:47.143137 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143159 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9023a312-69d0-4d07-bd00-aadc2d360175-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143220 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9023a312-69d0-4d07-bd00-aadc2d360175-config\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143268 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/648fc31e-cfb7-4852-89ab-88296601d71d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143295 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143327 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9023a312-69d0-4d07-bd00-aadc2d360175-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143379 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.143452 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6vc\" (UniqueName: \"kubernetes.io/projected/648fc31e-cfb7-4852-89ab-88296601d71d-kube-api-access-lk6vc\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.147800 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.244972 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.245368 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c5f1b26e-d478-410d-93a1-4bd22ce2358a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5f1b26e-d478-410d-93a1-4bd22ce2358a\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " 
pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.245701 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.246639 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.246788 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e25a8b13-2060-4331-ae83-11d45548dce0-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.246962 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k55gt\" (UniqueName: \"kubernetes.io/projected/9023a312-69d0-4d07-bd00-aadc2d360175-kube-api-access-k55gt\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.247121 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.247262 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.247380 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9023a312-69d0-4d07-bd00-aadc2d360175-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.247486 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9023a312-69d0-4d07-bd00-aadc2d360175-config\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.247606 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/648fc31e-cfb7-4852-89ab-88296601d71d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.247714 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.247824 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9023a312-69d0-4d07-bd00-aadc2d360175-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.247935 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1b113a27-09d7-4ac7-8094-647dfef5d01c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b113a27-09d7-4ac7-8094-647dfef5d01c\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.248029 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.248126 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.248624 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9wr\" (UniqueName: \"kubernetes.io/projected/e25a8b13-2060-4331-ae83-11d45548dce0-kube-api-access-4g9wr\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.248772 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25a8b13-2060-4331-ae83-11d45548dce0-config\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.248879 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9023a312-69d0-4d07-bd00-aadc2d360175-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.248956 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/648fc31e-cfb7-4852-89ab-88296601d71d-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.248901 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6vc\" (UniqueName: \"kubernetes.io/projected/648fc31e-cfb7-4852-89ab-88296601d71d-kube-api-access-lk6vc\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.249184 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-86a6d6ac-55a5-42d2-9323-17f88d5f8708\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86a6d6ac-55a5-42d2-9323-17f88d5f8708\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.249311 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e25a8b13-2060-4331-ae83-11d45548dce0-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.249412 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/648fc31e-cfb7-4852-89ab-88296601d71d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.249527 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.257894 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/648fc31e-cfb7-4852-89ab-88296601d71d-config\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.249473 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.258164 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c5f1b26e-d478-410d-93a1-4bd22ce2358a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5f1b26e-d478-410d-93a1-4bd22ce2358a\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/df595502358a5489362d4790d21416e83d731c75d32478edc42eee71b9c90c77/globalmount\"" pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.254254 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9023a312-69d0-4d07-bd00-aadc2d360175-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.250416 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/648fc31e-cfb7-4852-89ab-88296601d71d-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.257350 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.257771 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.249571 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9023a312-69d0-4d07-bd00-aadc2d360175-config\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.256985 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.258273 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-86a6d6ac-55a5-42d2-9323-17f88d5f8708\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86a6d6ac-55a5-42d2-9323-17f88d5f8708\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e85a3f25bfc6439d98eec333216bc7dce4493bb28c1fe859f996dc4b50a07837/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.259063 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/648fc31e-cfb7-4852-89ab-88296601d71d-config\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.269188 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.269777 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk6vc\" (UniqueName: \"kubernetes.io/projected/648fc31e-cfb7-4852-89ab-88296601d71d-kube-api-access-lk6vc\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.271749 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k55gt\" (UniqueName: \"kubernetes.io/projected/9023a312-69d0-4d07-bd00-aadc2d360175-kube-api-access-k55gt\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.276859 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.280736 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/648fc31e-cfb7-4852-89ab-88296601d71d-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.283797 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9023a312-69d0-4d07-bd00-aadc2d360175-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.360577 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e25a8b13-2060-4331-ae83-11d45548dce0-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.360724 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.360771 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1b113a27-09d7-4ac7-8094-647dfef5d01c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b113a27-09d7-4ac7-8094-647dfef5d01c\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.360792 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.360818 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9wr\" (UniqueName: \"kubernetes.io/projected/e25a8b13-2060-4331-ae83-11d45548dce0-kube-api-access-4g9wr\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.360834 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25a8b13-2060-4331-ae83-11d45548dce0-config\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.360865 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e25a8b13-2060-4331-ae83-11d45548dce0-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.360889 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.361807 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e25a8b13-2060-4331-ae83-11d45548dce0-config\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.361862 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e25a8b13-2060-4331-ae83-11d45548dce0-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.362072 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e25a8b13-2060-4331-ae83-11d45548dce0-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.365999 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.366150 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.368629 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e25a8b13-2060-4331-ae83-11d45548dce0-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.377958 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9wr\" (UniqueName: \"kubernetes.io/projected/e25a8b13-2060-4331-ae83-11d45548dce0-kube-api-access-4g9wr\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.495845 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.495897 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1b113a27-09d7-4ac7-8094-647dfef5d01c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b113a27-09d7-4ac7-8094-647dfef5d01c\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c2c9f7fa8101b09646524e138d32d3987c3a02ba0b897c1f07e3e21f17c22c9a/globalmount\"" pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.515761 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-86a6d6ac-55a5-42d2-9323-17f88d5f8708\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-86a6d6ac-55a5-42d2-9323-17f88d5f8708\") pod \"ovsdbserver-nb-0\" (UID: \"9023a312-69d0-4d07-bd00-aadc2d360175\") " pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.522290 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c5f1b26e-d478-410d-93a1-4bd22ce2358a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c5f1b26e-d478-410d-93a1-4bd22ce2358a\") pod \"ovsdbserver-nb-1\" (UID: \"648fc31e-cfb7-4852-89ab-88296601d71d\") " pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.525510 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1b113a27-09d7-4ac7-8094-647dfef5d01c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1b113a27-09d7-4ac7-8094-647dfef5d01c\") pod \"ovsdbserver-nb-2\" (UID: \"e25a8b13-2060-4331-ae83-11d45548dce0\") " pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.695926 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.713298 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Mar 18 08:10:47 crc kubenswrapper[4917]: I0318 08:10:47.725836 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.134526 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.212730 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Mar 18 08:10:48 crc kubenswrapper[4917]: W0318 08:10:48.214125 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode25a8b13_2060_4331_ae83_11d45548dce0.slice/crio-00db7a5828d5475b8f12ce63da37dc24b4378a2accc414d48341e6bca79ce2bd WatchSource:0}: Error finding container 00db7a5828d5475b8f12ce63da37dc24b4378a2accc414d48341e6bca79ce2bd: Status 404 returned error can't find the container with id 00db7a5828d5475b8f12ce63da37dc24b4378a2accc414d48341e6bca79ce2bd
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.272529 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"e25a8b13-2060-4331-ae83-11d45548dce0","Type":"ContainerStarted","Data":"00db7a5828d5475b8f12ce63da37dc24b4378a2accc414d48341e6bca79ce2bd"}
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.274078 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"648fc31e-cfb7-4852-89ab-88296601d71d","Type":"ContainerStarted","Data":"18445694e74d1ed4901e9eae7e8f3f8f1f91039d12486a2345bbc76b0aeee6fd"}
Mar 18 08:10:48 crc kubenswrapper[4917]: W0318 08:10:48.306546 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9023a312_69d0_4d07_bd00_aadc2d360175.slice/crio-2f78d0e23f33c47c87048e3abc3f095cc146cd46fc0db883f22318e77f531221 WatchSource:0}: Error finding container 2f78d0e23f33c47c87048e3abc3f095cc146cd46fc0db883f22318e77f531221: Status 404 returned error can't find the container with id 2f78d0e23f33c47c87048e3abc3f095cc146cd46fc0db883f22318e77f531221
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.307639 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.590058 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.592904 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.597391 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.597666 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.597960 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8xjsp"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.598246 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.610987 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.618192 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.619904 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.631418 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.632988 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.642656 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.664265 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.690212 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4bb778c6-ea1a-4f5d-b0c4-7981d6897cbf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4bb778c6-ea1a-4f5d-b0c4-7981d6897cbf\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.690260 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd05943e-fff4-460d-b6a1-14635410027a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.690296 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd05943e-fff4-460d-b6a1-14635410027a-config\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.690318 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.690358 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd05943e-fff4-460d-b6a1-14635410027a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.690388 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ht2m\" (UniqueName: \"kubernetes.io/projected/cd05943e-fff4-460d-b6a1-14635410027a-kube-api-access-8ht2m\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.690446 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.690492 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792056 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792462 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792492 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792543 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792568 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxk5p\" (UniqueName: \"kubernetes.io/projected/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-kube-api-access-jxk5p\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792629 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-config\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792777 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4bb778c6-ea1a-4f5d-b0c4-7981d6897cbf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4bb778c6-ea1a-4f5d-b0c4-7981d6897cbf\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792820 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd05943e-fff4-460d-b6a1-14635410027a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792865 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd05943e-fff4-460d-b6a1-14635410027a-config\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792880 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.792979 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd05943e-fff4-460d-b6a1-14635410027a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793009 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793028 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5ba6a291-8534-4faf-843e-91e2bb5dedd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ba6a291-8534-4faf-843e-91e2bb5dedd3\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793054 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/255dc086-6870-42ef-9774-18e47840139d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793081 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ht2m\" (UniqueName: \"kubernetes.io/projected/cd05943e-fff4-460d-b6a1-14635410027a-kube-api-access-8ht2m\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793133 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793168 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255dc086-6870-42ef-9774-18e47840139d-config\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793205 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7xh4\" (UniqueName: \"kubernetes.io/projected/255dc086-6870-42ef-9774-18e47840139d-kube-api-access-w7xh4\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793227 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793257 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0"
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793286 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/255dc086-6870-42ef-9774-18e47840139d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2"
Mar 18 08:10:48
crc kubenswrapper[4917]: I0318 08:10:48.793318 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9a37c7b8-49c9-46af-86d0-aa1000eb841c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9a37c7b8-49c9-46af-86d0-aa1000eb841c\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793362 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.793393 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.796881 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd05943e-fff4-460d-b6a1-14635410027a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.797135 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.797172 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4bb778c6-ea1a-4f5d-b0c4-7981d6897cbf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4bb778c6-ea1a-4f5d-b0c4-7981d6897cbf\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0b6064ba9b11dc8446b370f868167b2748e45b23bfa3e8dd50ec61be7d338f92/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.797415 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd05943e-fff4-460d-b6a1-14635410027a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.799410 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd05943e-fff4-460d-b6a1-14635410027a-config\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.801155 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.804318 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0" 
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.811885 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd05943e-fff4-460d-b6a1-14635410027a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.812469 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ht2m\" (UniqueName: \"kubernetes.io/projected/cd05943e-fff4-460d-b6a1-14635410027a-kube-api-access-8ht2m\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.829842 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4bb778c6-ea1a-4f5d-b0c4-7981d6897cbf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4bb778c6-ea1a-4f5d-b0c4-7981d6897cbf\") pod \"ovsdbserver-sb-0\" (UID: \"cd05943e-fff4-460d-b6a1-14635410027a\") " pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894485 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894521 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5ba6a291-8534-4faf-843e-91e2bb5dedd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ba6a291-8534-4faf-843e-91e2bb5dedd3\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894547 
4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/255dc086-6870-42ef-9774-18e47840139d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894593 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894614 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255dc086-6870-42ef-9774-18e47840139d-config\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894636 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7xh4\" (UniqueName: \"kubernetes.io/projected/255dc086-6870-42ef-9774-18e47840139d-kube-api-access-w7xh4\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894654 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894689 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/255dc086-6870-42ef-9774-18e47840139d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894726 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9a37c7b8-49c9-46af-86d0-aa1000eb841c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9a37c7b8-49c9-46af-86d0-aa1000eb841c\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894743 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894771 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894788 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894819 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-scripts\") pod \"ovsdbserver-sb-1\" (UID: 
\"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894849 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894865 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxk5p\" (UniqueName: \"kubernetes.io/projected/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-kube-api-access-jxk5p\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.894902 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-config\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.895735 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-config\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.896472 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.897034 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/255dc086-6870-42ef-9774-18e47840139d-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.897389 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.897733 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/255dc086-6870-42ef-9774-18e47840139d-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.897936 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.897950 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/255dc086-6870-42ef-9774-18e47840139d-config\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.897971 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5ba6a291-8534-4faf-843e-91e2bb5dedd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ba6a291-8534-4faf-843e-91e2bb5dedd3\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7fe2e8aabe28a6db5b39dd339b88d91bb3f0ad9b7adfaabe018905a9a7dfb04e/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.899230 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.899259 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9a37c7b8-49c9-46af-86d0-aa1000eb841c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9a37c7b8-49c9-46af-86d0-aa1000eb841c\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a3ed39e9494915a66e6fd98f64efb8c860dbd5c0c99be8a3591a4d7bb90c7ec0/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.900188 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.902139 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.902289 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.902423 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: 
\"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.903845 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255dc086-6870-42ef-9774-18e47840139d-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.903932 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.915805 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7xh4\" (UniqueName: \"kubernetes.io/projected/255dc086-6870-42ef-9774-18e47840139d-kube-api-access-w7xh4\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.919637 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxk5p\" (UniqueName: \"kubernetes.io/projected/5e3b81ab-abf1-4b33-93c5-b4a86b14ab62-kube-api-access-jxk5p\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.926736 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.938442 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9a37c7b8-49c9-46af-86d0-aa1000eb841c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9a37c7b8-49c9-46af-86d0-aa1000eb841c\") pod \"ovsdbserver-sb-1\" (UID: \"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62\") " pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.942863 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5ba6a291-8534-4faf-843e-91e2bb5dedd3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5ba6a291-8534-4faf-843e-91e2bb5dedd3\") pod \"ovsdbserver-sb-2\" (UID: \"255dc086-6870-42ef-9774-18e47840139d\") " pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.957458 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:48 crc kubenswrapper[4917]: I0318 08:10:48.967354 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:49 crc kubenswrapper[4917]: I0318 08:10:49.287693 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9023a312-69d0-4d07-bd00-aadc2d360175","Type":"ContainerStarted","Data":"2f78d0e23f33c47c87048e3abc3f095cc146cd46fc0db883f22318e77f531221"} Mar 18 08:10:49 crc kubenswrapper[4917]: I0318 08:10:49.656726 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 18 08:10:49 crc kubenswrapper[4917]: W0318 08:10:49.668755 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255dc086_6870_42ef_9774_18e47840139d.slice/crio-ebdda7f8ca41318a6d9f69d7e3114d8366bca51830c5b716d08f52fb09980165 WatchSource:0}: Error finding container ebdda7f8ca41318a6d9f69d7e3114d8366bca51830c5b716d08f52fb09980165: Status 404 returned error can't find the container with id ebdda7f8ca41318a6d9f69d7e3114d8366bca51830c5b716d08f52fb09980165 Mar 18 08:10:49 crc kubenswrapper[4917]: I0318 08:10:49.745302 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 18 08:10:49 crc kubenswrapper[4917]: W0318 08:10:49.754270 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e3b81ab_abf1_4b33_93c5_b4a86b14ab62.slice/crio-b992ca04b0aeafa266227865e567a213a750ad4ab4ab92b4d269aab7fdc055e7 WatchSource:0}: Error finding container b992ca04b0aeafa266227865e567a213a750ad4ab4ab92b4d269aab7fdc055e7: Status 404 returned error can't find the container with id b992ca04b0aeafa266227865e567a213a750ad4ab4ab92b4d269aab7fdc055e7 Mar 18 08:10:50 crc kubenswrapper[4917]: I0318 08:10:50.246751 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 18 08:10:50 crc kubenswrapper[4917]: W0318 08:10:50.252097 4917 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd05943e_fff4_460d_b6a1_14635410027a.slice/crio-3bd74ff9fbff0390d27bc01428b9dee4c466743d8e04f654bf748be6593be678 WatchSource:0}: Error finding container 3bd74ff9fbff0390d27bc01428b9dee4c466743d8e04f654bf748be6593be678: Status 404 returned error can't find the container with id 3bd74ff9fbff0390d27bc01428b9dee4c466743d8e04f654bf748be6593be678 Mar 18 08:10:50 crc kubenswrapper[4917]: I0318 08:10:50.305472 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62","Type":"ContainerStarted","Data":"b992ca04b0aeafa266227865e567a213a750ad4ab4ab92b4d269aab7fdc055e7"} Mar 18 08:10:50 crc kubenswrapper[4917]: I0318 08:10:50.307000 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"255dc086-6870-42ef-9774-18e47840139d","Type":"ContainerStarted","Data":"ebdda7f8ca41318a6d9f69d7e3114d8366bca51830c5b716d08f52fb09980165"} Mar 18 08:10:50 crc kubenswrapper[4917]: I0318 08:10:50.308986 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cd05943e-fff4-460d-b6a1-14635410027a","Type":"ContainerStarted","Data":"3bd74ff9fbff0390d27bc01428b9dee4c466743d8e04f654bf748be6593be678"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.330141 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"648fc31e-cfb7-4852-89ab-88296601d71d","Type":"ContainerStarted","Data":"70f9da32c0feff19278b9ad853785f644a90fba4b05e5a16c398e2fc663019e0"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.330571 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"648fc31e-cfb7-4852-89ab-88296601d71d","Type":"ContainerStarted","Data":"2434c91b44fd294708a8394c1b6200a52caaf11fc070dc9c6250ae9d261b4b9c"} Mar 18 08:10:53 crc 
kubenswrapper[4917]: I0318 08:10:53.332164 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62","Type":"ContainerStarted","Data":"2b3908378567608d2764dfc4c345047cb177aaed03ec6ec87ec31cb2368d1d52"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.332188 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"5e3b81ab-abf1-4b33-93c5-b4a86b14ab62","Type":"ContainerStarted","Data":"1bb8f9f15c3fafa10c6e75eb4567e237aef7e5fd5739f8ba4d72d3029dc4ea0b"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.335066 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"e25a8b13-2060-4331-ae83-11d45548dce0","Type":"ContainerStarted","Data":"ba5fdd7906a51bb55ffd43be0ae51b3c98cf69dd0f9336b6cc9549310b82e669"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.335107 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"e25a8b13-2060-4331-ae83-11d45548dce0","Type":"ContainerStarted","Data":"29c73af16d08b4cde15b8a7e2a5e3daf0e3152f399c19cd37722ba588445e651"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.336931 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"255dc086-6870-42ef-9774-18e47840139d","Type":"ContainerStarted","Data":"bc45077e0b3248f2b1055491f46209670579227d6e765e6c77d6adb0716a7a8a"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.336969 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"255dc086-6870-42ef-9774-18e47840139d","Type":"ContainerStarted","Data":"8739b58aa61975c7f3673a56f29fda86ed31b02f134665e2d6eef055ac37a590"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.339322 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"cd05943e-fff4-460d-b6a1-14635410027a","Type":"ContainerStarted","Data":"cd52fcf15115b0e5074b06f3636e69ce10f8519b8950c0e77ad4fafbd2265d21"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.339354 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cd05943e-fff4-460d-b6a1-14635410027a","Type":"ContainerStarted","Data":"58fc8eeb4ea03486804f5ee62906c37f643ac340d88749b6105976a95adabe94"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.341417 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9023a312-69d0-4d07-bd00-aadc2d360175","Type":"ContainerStarted","Data":"b7a3b26b17073d1ddfd91c5b5e712b4602c6a6349a1f7546308fcfd0ed5b5d2d"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.341441 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9023a312-69d0-4d07-bd00-aadc2d360175","Type":"ContainerStarted","Data":"1dd5a0ddf9ea5aea08becf6f639c395dbc85cf8a45454fe5d6b877a7a62cbff2"} Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.358017 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=2.785736636 podStartE2EDuration="7.357997481s" podCreationTimestamp="2026-03-18 08:10:46 +0000 UTC" firstStartedPulling="2026-03-18 08:10:48.146269651 +0000 UTC m=+5033.087424355" lastFinishedPulling="2026-03-18 08:10:52.718530486 +0000 UTC m=+5037.659685200" observedRunningTime="2026-03-18 08:10:53.352324449 +0000 UTC m=+5038.293479163" watchObservedRunningTime="2026-03-18 08:10:53.357997481 +0000 UTC m=+5038.299152195" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.378116 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=2.9286793319999997 podStartE2EDuration="7.378101697s" podCreationTimestamp="2026-03-18 08:10:46 +0000 UTC" 
firstStartedPulling="2026-03-18 08:10:48.216616013 +0000 UTC m=+5033.157770737" lastFinishedPulling="2026-03-18 08:10:52.666038388 +0000 UTC m=+5037.607193102" observedRunningTime="2026-03-18 08:10:53.375254101 +0000 UTC m=+5038.316408825" watchObservedRunningTime="2026-03-18 08:10:53.378101697 +0000 UTC m=+5038.319256411" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.397206 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.824859339 podStartE2EDuration="6.39719146s" podCreationTimestamp="2026-03-18 08:10:47 +0000 UTC" firstStartedPulling="2026-03-18 08:10:50.254798154 +0000 UTC m=+5035.195952868" lastFinishedPulling="2026-03-18 08:10:52.827130275 +0000 UTC m=+5037.768284989" observedRunningTime="2026-03-18 08:10:53.393073864 +0000 UTC m=+5038.334228568" watchObservedRunningTime="2026-03-18 08:10:53.39719146 +0000 UTC m=+5038.338346174" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.415375 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.34297244 podStartE2EDuration="6.415352791s" podCreationTimestamp="2026-03-18 08:10:47 +0000 UTC" firstStartedPulling="2026-03-18 08:10:49.671562515 +0000 UTC m=+5034.612717229" lastFinishedPulling="2026-03-18 08:10:52.743942866 +0000 UTC m=+5037.685097580" observedRunningTime="2026-03-18 08:10:53.414869379 +0000 UTC m=+5038.356024093" watchObservedRunningTime="2026-03-18 08:10:53.415352791 +0000 UTC m=+5038.356507505" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.435247 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.420307894 podStartE2EDuration="6.435230042s" podCreationTimestamp="2026-03-18 08:10:47 +0000 UTC" firstStartedPulling="2026-03-18 08:10:49.757202361 +0000 UTC m=+5034.698357075" lastFinishedPulling="2026-03-18 08:10:52.772124509 +0000 UTC m=+5037.713279223" 
observedRunningTime="2026-03-18 08:10:53.431966736 +0000 UTC m=+5038.373121450" watchObservedRunningTime="2026-03-18 08:10:53.435230042 +0000 UTC m=+5038.376384756" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.451684 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.018895644 podStartE2EDuration="7.451663383s" podCreationTimestamp="2026-03-18 08:10:46 +0000 UTC" firstStartedPulling="2026-03-18 08:10:48.308615737 +0000 UTC m=+5033.249770451" lastFinishedPulling="2026-03-18 08:10:52.741383456 +0000 UTC m=+5037.682538190" observedRunningTime="2026-03-18 08:10:53.449850021 +0000 UTC m=+5038.391004735" watchObservedRunningTime="2026-03-18 08:10:53.451663383 +0000 UTC m=+5038.392818097" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.696952 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.714219 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.726486 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.927697 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.957664 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:53 crc kubenswrapper[4917]: I0318 08:10:53.967865 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:54 crc kubenswrapper[4917]: I0318 08:10:54.928262 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:54 crc 
kubenswrapper[4917]: I0318 08:10:54.957756 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:54 crc kubenswrapper[4917]: I0318 08:10:54.967693 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:56 crc kubenswrapper[4917]: I0318 08:10:56.754968 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:56 crc kubenswrapper[4917]: I0318 08:10:56.755885 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:56 crc kubenswrapper[4917]: I0318 08:10:56.777569 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:56 crc kubenswrapper[4917]: I0318 08:10:56.778179 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:56 crc kubenswrapper[4917]: I0318 08:10:56.779766 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 18 08:10:56 crc kubenswrapper[4917]: I0318 08:10:56.779989 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.430310 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.436789 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.455334 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.697714 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d4495699-8x99h"] 
Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.699367 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.702978 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.712177 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d4495699-8x99h"] Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.869674 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-dns-svc\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.870011 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-config\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.870118 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vc22\" (UniqueName: \"kubernetes.io/projected/99e44152-cf60-4d7b-b9dd-b781c2331928-kube-api-access-6vc22\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.870159 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-ovsdbserver-nb\") pod 
\"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.971833 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-config\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.971908 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vc22\" (UniqueName: \"kubernetes.io/projected/99e44152-cf60-4d7b-b9dd-b781c2331928-kube-api-access-6vc22\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.971939 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-ovsdbserver-nb\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.972056 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-dns-svc\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.972987 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-dns-svc\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " 
pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.972999 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-config\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.973209 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-ovsdbserver-nb\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.987441 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:57 crc kubenswrapper[4917]: I0318 08:10:57.999792 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vc22\" (UniqueName: \"kubernetes.io/projected/99e44152-cf60-4d7b-b9dd-b781c2331928-kube-api-access-6vc22\") pod \"dnsmasq-dns-86d4495699-8x99h\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.009923 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.020903 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.044305 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.055185 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.072023 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.093999 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.265160 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d4495699-8x99h"] Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.290968 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58858ff77c-7cz7s"] Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.294020 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.304155 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.309043 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58858ff77c-7cz7s"] Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.384284 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-dns-svc\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.385834 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-nb\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") 
" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.385906 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgvnw\" (UniqueName: \"kubernetes.io/projected/07f76cb1-2b75-4de2-aee9-538ac9cf035b-kube-api-access-tgvnw\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.385950 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-sb\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.385992 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-config\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.487252 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-nb\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.487429 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgvnw\" (UniqueName: \"kubernetes.io/projected/07f76cb1-2b75-4de2-aee9-538ac9cf035b-kube-api-access-tgvnw\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: 
\"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.487544 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-sb\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.487690 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-config\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.488023 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-dns-svc\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.490159 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-dns-svc\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.490418 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-sb\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc 
kubenswrapper[4917]: I0318 08:10:58.491040 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-nb\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.491123 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-config\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.503255 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgvnw\" (UniqueName: \"kubernetes.io/projected/07f76cb1-2b75-4de2-aee9-538ac9cf035b-kube-api-access-tgvnw\") pod \"dnsmasq-dns-58858ff77c-7cz7s\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.567043 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d4495699-8x99h"] Mar 18 08:10:58 crc kubenswrapper[4917]: W0318 08:10:58.569704 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99e44152_cf60_4d7b_b9dd_b781c2331928.slice/crio-3187246ff2155b9f0ff77004a09984df94c05b481ffd4d72c417a62215906db5 WatchSource:0}: Error finding container 3187246ff2155b9f0ff77004a09984df94c05b481ffd4d72c417a62215906db5: Status 404 returned error can't find the container with id 3187246ff2155b9f0ff77004a09984df94c05b481ffd4d72c417a62215906db5 Mar 18 08:10:58 crc kubenswrapper[4917]: I0318 08:10:58.626074 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.150399 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58858ff77c-7cz7s"] Mar 18 08:10:59 crc kubenswrapper[4917]: W0318 08:10:59.150404 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07f76cb1_2b75_4de2_aee9_538ac9cf035b.slice/crio-fcd86ed358c68aad60489b7a86930ab6d0f225764ca638ec883b512baf722089 WatchSource:0}: Error finding container fcd86ed358c68aad60489b7a86930ab6d0f225764ca638ec883b512baf722089: Status 404 returned error can't find the container with id fcd86ed358c68aad60489b7a86930ab6d0f225764ca638ec883b512baf722089 Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.394983 4917 generic.go:334] "Generic (PLEG): container finished" podID="99e44152-cf60-4d7b-b9dd-b781c2331928" containerID="9ae73b57ec60988d137bcedf52f23b24106c3e0942a415b1c192fb419dec2fbf" exitCode=0 Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.395053 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4495699-8x99h" event={"ID":"99e44152-cf60-4d7b-b9dd-b781c2331928","Type":"ContainerDied","Data":"9ae73b57ec60988d137bcedf52f23b24106c3e0942a415b1c192fb419dec2fbf"} Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.395094 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4495699-8x99h" event={"ID":"99e44152-cf60-4d7b-b9dd-b781c2331928","Type":"ContainerStarted","Data":"3187246ff2155b9f0ff77004a09984df94c05b481ffd4d72c417a62215906db5"} Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.398679 4917 generic.go:334] "Generic (PLEG): container finished" podID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerID="6cba96f313f4036bdd9e25ffd922a51ef0937deede9c4288eb2cc90ca6a16cce" exitCode=0 Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.398767 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" event={"ID":"07f76cb1-2b75-4de2-aee9-538ac9cf035b","Type":"ContainerDied","Data":"6cba96f313f4036bdd9e25ffd922a51ef0937deede9c4288eb2cc90ca6a16cce"} Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.398849 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" event={"ID":"07f76cb1-2b75-4de2-aee9-538ac9cf035b","Type":"ContainerStarted","Data":"fcd86ed358c68aad60489b7a86930ab6d0f225764ca638ec883b512baf722089"} Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.667645 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.718439 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-dns-svc\") pod \"99e44152-cf60-4d7b-b9dd-b781c2331928\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.718749 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-config\") pod \"99e44152-cf60-4d7b-b9dd-b781c2331928\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.718796 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-ovsdbserver-nb\") pod \"99e44152-cf60-4d7b-b9dd-b781c2331928\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.718832 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vc22\" (UniqueName: 
\"kubernetes.io/projected/99e44152-cf60-4d7b-b9dd-b781c2331928-kube-api-access-6vc22\") pod \"99e44152-cf60-4d7b-b9dd-b781c2331928\" (UID: \"99e44152-cf60-4d7b-b9dd-b781c2331928\") " Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.727036 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e44152-cf60-4d7b-b9dd-b781c2331928-kube-api-access-6vc22" (OuterVolumeSpecName: "kube-api-access-6vc22") pod "99e44152-cf60-4d7b-b9dd-b781c2331928" (UID: "99e44152-cf60-4d7b-b9dd-b781c2331928"). InnerVolumeSpecName "kube-api-access-6vc22". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.743021 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99e44152-cf60-4d7b-b9dd-b781c2331928" (UID: "99e44152-cf60-4d7b-b9dd-b781c2331928"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.749123 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99e44152-cf60-4d7b-b9dd-b781c2331928" (UID: "99e44152-cf60-4d7b-b9dd-b781c2331928"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.749655 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-config" (OuterVolumeSpecName: "config") pod "99e44152-cf60-4d7b-b9dd-b781c2331928" (UID: "99e44152-cf60-4d7b-b9dd-b781c2331928"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.821059 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.821083 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.821093 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vc22\" (UniqueName: \"kubernetes.io/projected/99e44152-cf60-4d7b-b9dd-b781c2331928-kube-api-access-6vc22\") on node \"crc\" DevicePath \"\"" Mar 18 08:10:59 crc kubenswrapper[4917]: I0318 08:10:59.821103 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99e44152-cf60-4d7b-b9dd-b781c2331928-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.414028 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d4495699-8x99h" event={"ID":"99e44152-cf60-4d7b-b9dd-b781c2331928","Type":"ContainerDied","Data":"3187246ff2155b9f0ff77004a09984df94c05b481ffd4d72c417a62215906db5"} Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.414077 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d4495699-8x99h" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.414131 4917 scope.go:117] "RemoveContainer" containerID="9ae73b57ec60988d137bcedf52f23b24106c3e0942a415b1c192fb419dec2fbf" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.419168 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" event={"ID":"07f76cb1-2b75-4de2-aee9-538ac9cf035b","Type":"ContainerStarted","Data":"208c281c357b755f3b243e20e5fc11a793fd5b9b136f7e64f4de69a2dba8fe1a"} Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.419556 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.459847 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" podStartSLOduration=2.459818995 podStartE2EDuration="2.459818995s" podCreationTimestamp="2026-03-18 08:10:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:11:00.449565576 +0000 UTC m=+5045.390720330" watchObservedRunningTime="2026-03-18 08:11:00.459818995 +0000 UTC m=+5045.400973749" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.488570 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 18 08:11:00 crc kubenswrapper[4917]: E0318 08:11:00.489200 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e44152-cf60-4d7b-b9dd-b781c2331928" containerName="init" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.489236 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e44152-cf60-4d7b-b9dd-b781c2331928" containerName="init" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.489803 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="99e44152-cf60-4d7b-b9dd-b781c2331928" containerName="init" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.490819 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.496124 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.527350 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.537753 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d4495699-8x99h"] Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.549508 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d4495699-8x99h"] Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.636368 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/3251a7ec-9729-4af2-9d5d-28812567f353-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.638063 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr59h\" (UniqueName: \"kubernetes.io/projected/3251a7ec-9729-4af2-9d5d-28812567f353-kube-api-access-rr59h\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.638149 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\") pod \"ovn-copy-data\" (UID: 
\"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.741104 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/3251a7ec-9729-4af2-9d5d-28812567f353-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.741313 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr59h\" (UniqueName: \"kubernetes.io/projected/3251a7ec-9729-4af2-9d5d-28812567f353-kube-api-access-rr59h\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.741374 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.745573 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.745705 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/899ffaebf8aebd2fb2ab7dc9dd0afc1f89a59ae67c90cc716b3188ac57a209b5/globalmount\"" pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.752170 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/3251a7ec-9729-4af2-9d5d-28812567f353-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.775138 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr59h\" (UniqueName: \"kubernetes.io/projected/3251a7ec-9729-4af2-9d5d-28812567f353-kube-api-access-rr59h\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.805631 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\") pod \"ovn-copy-data\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " pod="openstack/ovn-copy-data" Mar 18 08:11:00 crc kubenswrapper[4917]: I0318 08:11:00.828547 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 08:11:01 crc kubenswrapper[4917]: I0318 08:11:01.451818 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 08:11:01 crc kubenswrapper[4917]: W0318 08:11:01.458874 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3251a7ec_9729_4af2_9d5d_28812567f353.slice/crio-0b1471b1edc656e7bb37d9e430fed8c9356f9f8cdbf0c209826d218fedf1a8b6 WatchSource:0}: Error finding container 0b1471b1edc656e7bb37d9e430fed8c9356f9f8cdbf0c209826d218fedf1a8b6: Status 404 returned error can't find the container with id 0b1471b1edc656e7bb37d9e430fed8c9356f9f8cdbf0c209826d218fedf1a8b6 Mar 18 08:11:01 crc kubenswrapper[4917]: I0318 08:11:01.784513 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e44152-cf60-4d7b-b9dd-b781c2331928" path="/var/lib/kubelet/pods/99e44152-cf60-4d7b-b9dd-b781c2331928/volumes" Mar 18 08:11:02 crc kubenswrapper[4917]: I0318 08:11:02.442659 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"3251a7ec-9729-4af2-9d5d-28812567f353","Type":"ContainerStarted","Data":"72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c"} Mar 18 08:11:02 crc kubenswrapper[4917]: I0318 08:11:02.442709 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"3251a7ec-9729-4af2-9d5d-28812567f353","Type":"ContainerStarted","Data":"0b1471b1edc656e7bb37d9e430fed8c9356f9f8cdbf0c209826d218fedf1a8b6"} Mar 18 08:11:02 crc kubenswrapper[4917]: I0318 08:11:02.466312 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=3.284222257 podStartE2EDuration="3.46628606s" podCreationTimestamp="2026-03-18 08:10:59 +0000 UTC" firstStartedPulling="2026-03-18 08:11:01.461224665 +0000 UTC m=+5046.402379379" lastFinishedPulling="2026-03-18 
08:11:01.643288448 +0000 UTC m=+5046.584443182" observedRunningTime="2026-03-18 08:11:02.460352803 +0000 UTC m=+5047.401507567" watchObservedRunningTime="2026-03-18 08:11:02.46628606 +0000 UTC m=+5047.407440804" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.161418 4917 scope.go:117] "RemoveContainer" containerID="92158f7a67ada285ce933d134bf88e16927900ad8018321e5223ab5e83cd0547" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.722617 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.724628 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.729501 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-jxf5l" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.729959 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.730218 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.741011 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.757667 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.778678 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.778788 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mtmq\" (UniqueName: \"kubernetes.io/projected/ac882469-bc1e-4d1f-a8cb-68e67ed26912-kube-api-access-5mtmq\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.778837 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.778911 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac882469-bc1e-4d1f-a8cb-68e67ed26912-scripts\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.778996 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.779186 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac882469-bc1e-4d1f-a8cb-68e67ed26912-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.779264 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ac882469-bc1e-4d1f-a8cb-68e67ed26912-config\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.880808 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.880905 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac882469-bc1e-4d1f-a8cb-68e67ed26912-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.880945 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac882469-bc1e-4d1f-a8cb-68e67ed26912-config\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.881020 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.881052 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mtmq\" (UniqueName: \"kubernetes.io/projected/ac882469-bc1e-4d1f-a8cb-68e67ed26912-kube-api-access-5mtmq\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 
08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.881079 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.881474 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ac882469-bc1e-4d1f-a8cb-68e67ed26912-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.881511 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac882469-bc1e-4d1f-a8cb-68e67ed26912-scripts\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.882051 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac882469-bc1e-4d1f-a8cb-68e67ed26912-config\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.882276 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ac882469-bc1e-4d1f-a8cb-68e67ed26912-scripts\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.886802 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-metrics-certs-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.887271 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.894225 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac882469-bc1e-4d1f-a8cb-68e67ed26912-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:07 crc kubenswrapper[4917]: I0318 08:11:07.901532 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mtmq\" (UniqueName: \"kubernetes.io/projected/ac882469-bc1e-4d1f-a8cb-68e67ed26912-kube-api-access-5mtmq\") pod \"ovn-northd-0\" (UID: \"ac882469-bc1e-4d1f-a8cb-68e67ed26912\") " pod="openstack/ovn-northd-0" Mar 18 08:11:08 crc kubenswrapper[4917]: I0318 08:11:08.069614 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 18 08:11:08 crc kubenswrapper[4917]: I0318 08:11:08.534015 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 18 08:11:08 crc kubenswrapper[4917]: W0318 08:11:08.541762 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac882469_bc1e_4d1f_a8cb_68e67ed26912.slice/crio-09373d00f606544e2b7d83195f8e049c204b2b4c109d9be02b0a3ee3f1dcbd2f WatchSource:0}: Error finding container 09373d00f606544e2b7d83195f8e049c204b2b4c109d9be02b0a3ee3f1dcbd2f: Status 404 returned error can't find the container with id 09373d00f606544e2b7d83195f8e049c204b2b4c109d9be02b0a3ee3f1dcbd2f Mar 18 08:11:08 crc kubenswrapper[4917]: I0318 08:11:08.628819 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:11:08 crc kubenswrapper[4917]: I0318 08:11:08.701753 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74754f857-2tf2v"] Mar 18 08:11:08 crc kubenswrapper[4917]: I0318 08:11:08.702066 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74754f857-2tf2v" podUID="58a2777d-8ff4-4eb1-9d32-5410377794c7" containerName="dnsmasq-dns" containerID="cri-o://b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2" gracePeriod=10 Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.309654 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.421503 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-config\") pod \"58a2777d-8ff4-4eb1-9d32-5410377794c7\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.421837 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbsvs\" (UniqueName: \"kubernetes.io/projected/58a2777d-8ff4-4eb1-9d32-5410377794c7-kube-api-access-sbsvs\") pod \"58a2777d-8ff4-4eb1-9d32-5410377794c7\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.421954 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-dns-svc\") pod \"58a2777d-8ff4-4eb1-9d32-5410377794c7\" (UID: \"58a2777d-8ff4-4eb1-9d32-5410377794c7\") " Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.425655 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a2777d-8ff4-4eb1-9d32-5410377794c7-kube-api-access-sbsvs" (OuterVolumeSpecName: "kube-api-access-sbsvs") pod "58a2777d-8ff4-4eb1-9d32-5410377794c7" (UID: "58a2777d-8ff4-4eb1-9d32-5410377794c7"). InnerVolumeSpecName "kube-api-access-sbsvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.454214 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-config" (OuterVolumeSpecName: "config") pod "58a2777d-8ff4-4eb1-9d32-5410377794c7" (UID: "58a2777d-8ff4-4eb1-9d32-5410377794c7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.455652 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "58a2777d-8ff4-4eb1-9d32-5410377794c7" (UID: "58a2777d-8ff4-4eb1-9d32-5410377794c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.510289 4917 generic.go:334] "Generic (PLEG): container finished" podID="58a2777d-8ff4-4eb1-9d32-5410377794c7" containerID="b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2" exitCode=0 Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.510365 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74754f857-2tf2v" event={"ID":"58a2777d-8ff4-4eb1-9d32-5410377794c7","Type":"ContainerDied","Data":"b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2"} Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.510399 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74754f857-2tf2v" event={"ID":"58a2777d-8ff4-4eb1-9d32-5410377794c7","Type":"ContainerDied","Data":"d812f733e4cd6337d765b24ef725376fc2ea8334c4a75ad320e3dd0eeba0b1b3"} Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.510421 4917 scope.go:117] "RemoveContainer" containerID="b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.510552 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74754f857-2tf2v" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.517454 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac882469-bc1e-4d1f-a8cb-68e67ed26912","Type":"ContainerStarted","Data":"d32c32088d2c3979b6daa6add0dcec40c6e84ccd32281aea7a567607fec2ece2"} Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.517492 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac882469-bc1e-4d1f-a8cb-68e67ed26912","Type":"ContainerStarted","Data":"09373d00f606544e2b7d83195f8e049c204b2b4c109d9be02b0a3ee3f1dcbd2f"} Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.524447 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbsvs\" (UniqueName: \"kubernetes.io/projected/58a2777d-8ff4-4eb1-9d32-5410377794c7-kube-api-access-sbsvs\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.524495 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.524514 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/58a2777d-8ff4-4eb1-9d32-5410377794c7-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.553624 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74754f857-2tf2v"] Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.559661 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74754f857-2tf2v"] Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.560503 4917 scope.go:117] "RemoveContainer" containerID="e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1" Mar 18 08:11:09 crc 
kubenswrapper[4917]: I0318 08:11:09.608695 4917 scope.go:117] "RemoveContainer" containerID="b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2" Mar 18 08:11:09 crc kubenswrapper[4917]: E0318 08:11:09.609069 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2\": container with ID starting with b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2 not found: ID does not exist" containerID="b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.609102 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2"} err="failed to get container status \"b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2\": rpc error: code = NotFound desc = could not find container \"b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2\": container with ID starting with b348c923879d0f8989ee6749113dae87d65b2a28306e7ea24a6822bb69a5b0f2 not found: ID does not exist" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.609124 4917 scope.go:117] "RemoveContainer" containerID="e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1" Mar 18 08:11:09 crc kubenswrapper[4917]: E0318 08:11:09.609317 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1\": container with ID starting with e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1 not found: ID does not exist" containerID="e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.609333 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1"} err="failed to get container status \"e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1\": rpc error: code = NotFound desc = could not find container \"e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1\": container with ID starting with e896759a7651de02b8afc5f19aadfb7a93986595802c6b85eb2af5d210cc0fb1 not found: ID does not exist" Mar 18 08:11:09 crc kubenswrapper[4917]: I0318 08:11:09.793777 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a2777d-8ff4-4eb1-9d32-5410377794c7" path="/var/lib/kubelet/pods/58a2777d-8ff4-4eb1-9d32-5410377794c7/volumes" Mar 18 08:11:10 crc kubenswrapper[4917]: I0318 08:11:10.531791 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"ac882469-bc1e-4d1f-a8cb-68e67ed26912","Type":"ContainerStarted","Data":"5fb962bc8e45f21e39db99e4133f0c5d5516c4af31f726db4864f5650e7ddd15"} Mar 18 08:11:10 crc kubenswrapper[4917]: I0318 08:11:10.532064 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 18 08:11:10 crc kubenswrapper[4917]: I0318 08:11:10.561940 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.877620626 podStartE2EDuration="3.56191431s" podCreationTimestamp="2026-03-18 08:11:07 +0000 UTC" firstStartedPulling="2026-03-18 08:11:08.543370044 +0000 UTC m=+5053.484524768" lastFinishedPulling="2026-03-18 08:11:09.227663738 +0000 UTC m=+5054.168818452" observedRunningTime="2026-03-18 08:11:10.560350143 +0000 UTC m=+5055.501504907" watchObservedRunningTime="2026-03-18 08:11:10.56191431 +0000 UTC m=+5055.503069074" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.515164 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7strl"] Mar 18 08:11:12 crc kubenswrapper[4917]: E0318 
08:11:12.515913 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a2777d-8ff4-4eb1-9d32-5410377794c7" containerName="dnsmasq-dns" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.515934 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a2777d-8ff4-4eb1-9d32-5410377794c7" containerName="dnsmasq-dns" Mar 18 08:11:12 crc kubenswrapper[4917]: E0318 08:11:12.515954 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a2777d-8ff4-4eb1-9d32-5410377794c7" containerName="init" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.515962 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a2777d-8ff4-4eb1-9d32-5410377794c7" containerName="init" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.516169 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a2777d-8ff4-4eb1-9d32-5410377794c7" containerName="dnsmasq-dns" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.516892 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-7strl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.528616 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7strl"] Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.581381 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kldn9\" (UniqueName: \"kubernetes.io/projected/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-kube-api-access-kldn9\") pod \"keystone-db-create-7strl\" (UID: \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\") " pod="openstack/keystone-db-create-7strl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.581453 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-operator-scripts\") pod \"keystone-db-create-7strl\" (UID: \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\") " pod="openstack/keystone-db-create-7strl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.606921 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d2b6-account-create-update-8ntxl"] Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.608072 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.611064 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.617631 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d2b6-account-create-update-8ntxl"] Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.683320 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aa8dcb-e441-4437-a0ec-948efd209a2d-operator-scripts\") pod \"keystone-d2b6-account-create-update-8ntxl\" (UID: \"45aa8dcb-e441-4437-a0ec-948efd209a2d\") " pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.683408 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kldn9\" (UniqueName: \"kubernetes.io/projected/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-kube-api-access-kldn9\") pod \"keystone-db-create-7strl\" (UID: \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\") " pod="openstack/keystone-db-create-7strl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.683492 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-operator-scripts\") pod \"keystone-db-create-7strl\" (UID: \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\") " pod="openstack/keystone-db-create-7strl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.683525 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh8pm\" (UniqueName: \"kubernetes.io/projected/45aa8dcb-e441-4437-a0ec-948efd209a2d-kube-api-access-zh8pm\") pod \"keystone-d2b6-account-create-update-8ntxl\" (UID: 
\"45aa8dcb-e441-4437-a0ec-948efd209a2d\") " pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.684250 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-operator-scripts\") pod \"keystone-db-create-7strl\" (UID: \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\") " pod="openstack/keystone-db-create-7strl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.700614 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kldn9\" (UniqueName: \"kubernetes.io/projected/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-kube-api-access-kldn9\") pod \"keystone-db-create-7strl\" (UID: \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\") " pod="openstack/keystone-db-create-7strl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.785577 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aa8dcb-e441-4437-a0ec-948efd209a2d-operator-scripts\") pod \"keystone-d2b6-account-create-update-8ntxl\" (UID: \"45aa8dcb-e441-4437-a0ec-948efd209a2d\") " pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.785669 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh8pm\" (UniqueName: \"kubernetes.io/projected/45aa8dcb-e441-4437-a0ec-948efd209a2d-kube-api-access-zh8pm\") pod \"keystone-d2b6-account-create-update-8ntxl\" (UID: \"45aa8dcb-e441-4437-a0ec-948efd209a2d\") " pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.786488 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aa8dcb-e441-4437-a0ec-948efd209a2d-operator-scripts\") pod 
\"keystone-d2b6-account-create-update-8ntxl\" (UID: \"45aa8dcb-e441-4437-a0ec-948efd209a2d\") " pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.802048 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh8pm\" (UniqueName: \"kubernetes.io/projected/45aa8dcb-e441-4437-a0ec-948efd209a2d-kube-api-access-zh8pm\") pod \"keystone-d2b6-account-create-update-8ntxl\" (UID: \"45aa8dcb-e441-4437-a0ec-948efd209a2d\") " pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.846410 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7strl" Mar 18 08:11:12 crc kubenswrapper[4917]: I0318 08:11:12.923013 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:13 crc kubenswrapper[4917]: I0318 08:11:13.216140 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d2b6-account-create-update-8ntxl"] Mar 18 08:11:13 crc kubenswrapper[4917]: I0318 08:11:13.308001 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7strl"] Mar 18 08:11:13 crc kubenswrapper[4917]: W0318 08:11:13.323228 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc36f0db_bfd4_4d6a_a5b0_13adcf72e33c.slice/crio-464c2ceced232236c9bc4bbc0fbcb2dbf07e33549c7521253100bc04ef6fe0cb WatchSource:0}: Error finding container 464c2ceced232236c9bc4bbc0fbcb2dbf07e33549c7521253100bc04ef6fe0cb: Status 404 returned error can't find the container with id 464c2ceced232236c9bc4bbc0fbcb2dbf07e33549c7521253100bc04ef6fe0cb Mar 18 08:11:13 crc kubenswrapper[4917]: I0318 08:11:13.562183 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-d2b6-account-create-update-8ntxl" event={"ID":"45aa8dcb-e441-4437-a0ec-948efd209a2d","Type":"ContainerStarted","Data":"5dfaeb703555123358909bc01552eefaf89db4d72d91255704679e066111fa31"} Mar 18 08:11:13 crc kubenswrapper[4917]: I0318 08:11:13.562243 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b6-account-create-update-8ntxl" event={"ID":"45aa8dcb-e441-4437-a0ec-948efd209a2d","Type":"ContainerStarted","Data":"bfa3ad1e13e782a63d28abff7f9a921055962fd74e5bb5b324ca836c8b48e29d"} Mar 18 08:11:13 crc kubenswrapper[4917]: I0318 08:11:13.563367 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7strl" event={"ID":"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c","Type":"ContainerStarted","Data":"81add7f27779c98728d36361f3d6ec42c0115e7a6edae7ac5e88059e5e59917c"} Mar 18 08:11:13 crc kubenswrapper[4917]: I0318 08:11:13.563415 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7strl" event={"ID":"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c","Type":"ContainerStarted","Data":"464c2ceced232236c9bc4bbc0fbcb2dbf07e33549c7521253100bc04ef6fe0cb"} Mar 18 08:11:13 crc kubenswrapper[4917]: I0318 08:11:13.581625 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d2b6-account-create-update-8ntxl" podStartSLOduration=1.581603459 podStartE2EDuration="1.581603459s" podCreationTimestamp="2026-03-18 08:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:11:13.57691955 +0000 UTC m=+5058.518074294" watchObservedRunningTime="2026-03-18 08:11:13.581603459 +0000 UTC m=+5058.522758193" Mar 18 08:11:13 crc kubenswrapper[4917]: I0318 08:11:13.612666 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-7strl" podStartSLOduration=1.6126473890000002 podStartE2EDuration="1.612647389s" 
podCreationTimestamp="2026-03-18 08:11:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:11:13.597957498 +0000 UTC m=+5058.539112222" watchObservedRunningTime="2026-03-18 08:11:13.612647389 +0000 UTC m=+5058.553802113" Mar 18 08:11:14 crc kubenswrapper[4917]: I0318 08:11:14.574838 4917 generic.go:334] "Generic (PLEG): container finished" podID="45aa8dcb-e441-4437-a0ec-948efd209a2d" containerID="5dfaeb703555123358909bc01552eefaf89db4d72d91255704679e066111fa31" exitCode=0 Mar 18 08:11:14 crc kubenswrapper[4917]: I0318 08:11:14.574943 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b6-account-create-update-8ntxl" event={"ID":"45aa8dcb-e441-4437-a0ec-948efd209a2d","Type":"ContainerDied","Data":"5dfaeb703555123358909bc01552eefaf89db4d72d91255704679e066111fa31"} Mar 18 08:11:14 crc kubenswrapper[4917]: I0318 08:11:14.577166 4917 generic.go:334] "Generic (PLEG): container finished" podID="fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c" containerID="81add7f27779c98728d36361f3d6ec42c0115e7a6edae7ac5e88059e5e59917c" exitCode=0 Mar 18 08:11:14 crc kubenswrapper[4917]: I0318 08:11:14.577198 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7strl" event={"ID":"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c","Type":"ContainerDied","Data":"81add7f27779c98728d36361f3d6ec42c0115e7a6edae7ac5e88059e5e59917c"} Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.028445 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7strl" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.033399 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.145464 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aa8dcb-e441-4437-a0ec-948efd209a2d-operator-scripts\") pod \"45aa8dcb-e441-4437-a0ec-948efd209a2d\" (UID: \"45aa8dcb-e441-4437-a0ec-948efd209a2d\") " Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.145659 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-operator-scripts\") pod \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\" (UID: \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\") " Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.145685 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh8pm\" (UniqueName: \"kubernetes.io/projected/45aa8dcb-e441-4437-a0ec-948efd209a2d-kube-api-access-zh8pm\") pod \"45aa8dcb-e441-4437-a0ec-948efd209a2d\" (UID: \"45aa8dcb-e441-4437-a0ec-948efd209a2d\") " Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.145742 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kldn9\" (UniqueName: \"kubernetes.io/projected/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-kube-api-access-kldn9\") pod \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\" (UID: \"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c\") " Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.146812 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45aa8dcb-e441-4437-a0ec-948efd209a2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "45aa8dcb-e441-4437-a0ec-948efd209a2d" (UID: "45aa8dcb-e441-4437-a0ec-948efd209a2d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.146849 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c" (UID: "fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.151018 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-kube-api-access-kldn9" (OuterVolumeSpecName: "kube-api-access-kldn9") pod "fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c" (UID: "fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c"). InnerVolumeSpecName "kube-api-access-kldn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.152058 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45aa8dcb-e441-4437-a0ec-948efd209a2d-kube-api-access-zh8pm" (OuterVolumeSpecName: "kube-api-access-zh8pm") pod "45aa8dcb-e441-4437-a0ec-948efd209a2d" (UID: "45aa8dcb-e441-4437-a0ec-948efd209a2d"). InnerVolumeSpecName "kube-api-access-zh8pm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.247755 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kldn9\" (UniqueName: \"kubernetes.io/projected/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-kube-api-access-kldn9\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.247803 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/45aa8dcb-e441-4437-a0ec-948efd209a2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.247820 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.247837 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh8pm\" (UniqueName: \"kubernetes.io/projected/45aa8dcb-e441-4437-a0ec-948efd209a2d-kube-api-access-zh8pm\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.599022 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d2b6-account-create-update-8ntxl" event={"ID":"45aa8dcb-e441-4437-a0ec-948efd209a2d","Type":"ContainerDied","Data":"bfa3ad1e13e782a63d28abff7f9a921055962fd74e5bb5b324ca836c8b48e29d"} Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.599479 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa3ad1e13e782a63d28abff7f9a921055962fd74e5bb5b324ca836c8b48e29d" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.599309 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d2b6-account-create-update-8ntxl" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.601695 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7strl" event={"ID":"fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c","Type":"ContainerDied","Data":"464c2ceced232236c9bc4bbc0fbcb2dbf07e33549c7521253100bc04ef6fe0cb"} Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.601758 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="464c2ceced232236c9bc4bbc0fbcb2dbf07e33549c7521253100bc04ef6fe0cb" Mar 18 08:11:16 crc kubenswrapper[4917]: I0318 08:11:16.601847 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7strl" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.072051 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-zjqck"] Mar 18 08:11:18 crc kubenswrapper[4917]: E0318 08:11:18.073498 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45aa8dcb-e441-4437-a0ec-948efd209a2d" containerName="mariadb-account-create-update" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.073525 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="45aa8dcb-e441-4437-a0ec-948efd209a2d" containerName="mariadb-account-create-update" Mar 18 08:11:18 crc kubenswrapper[4917]: E0318 08:11:18.073558 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c" containerName="mariadb-database-create" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.073567 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c" containerName="mariadb-database-create" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.073785 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="45aa8dcb-e441-4437-a0ec-948efd209a2d" 
containerName="mariadb-account-create-update" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.073798 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c" containerName="mariadb-database-create" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.074385 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.077134 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.077185 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.078123 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-62zk4" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.078451 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.085942 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zjqck"] Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.181910 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-config-data\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.181954 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-combined-ca-bundle\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " 
pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.181985 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfb5h\" (UniqueName: \"kubernetes.io/projected/25b11861-9ee4-4b04-a9b4-8736aeac1e15-kube-api-access-mfb5h\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.283738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-config-data\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.283829 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-combined-ca-bundle\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.284014 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfb5h\" (UniqueName: \"kubernetes.io/projected/25b11861-9ee4-4b04-a9b4-8736aeac1e15-kube-api-access-mfb5h\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.289400 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-combined-ca-bundle\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc 
kubenswrapper[4917]: I0318 08:11:18.290323 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-config-data\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.310658 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfb5h\" (UniqueName: \"kubernetes.io/projected/25b11861-9ee4-4b04-a9b4-8736aeac1e15-kube-api-access-mfb5h\") pod \"keystone-db-sync-zjqck\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:18 crc kubenswrapper[4917]: I0318 08:11:18.415851 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:19 crc kubenswrapper[4917]: I0318 08:11:18.878262 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zjqck"] Mar 18 08:11:19 crc kubenswrapper[4917]: I0318 08:11:19.630444 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zjqck" event={"ID":"25b11861-9ee4-4b04-a9b4-8736aeac1e15","Type":"ContainerStarted","Data":"ec9d6e643f36eaa4df453a7bc95f6b0969fdc9631151a482f94ee13629475a8d"} Mar 18 08:11:23 crc kubenswrapper[4917]: I0318 08:11:23.666523 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zjqck" event={"ID":"25b11861-9ee4-4b04-a9b4-8736aeac1e15","Type":"ContainerStarted","Data":"9ba15e42b4f651241178cffa01129b1597ae47958d127e4e9ef3a95a61449001"} Mar 18 08:11:23 crc kubenswrapper[4917]: I0318 08:11:23.702475 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-zjqck" podStartSLOduration=1.389333835 podStartE2EDuration="5.702448739s" podCreationTimestamp="2026-03-18 08:11:18 +0000 UTC" 
firstStartedPulling="2026-03-18 08:11:18.888267732 +0000 UTC m=+5063.829422486" lastFinishedPulling="2026-03-18 08:11:23.201382636 +0000 UTC m=+5068.142537390" observedRunningTime="2026-03-18 08:11:23.689425627 +0000 UTC m=+5068.630580411" watchObservedRunningTime="2026-03-18 08:11:23.702448739 +0000 UTC m=+5068.643603483" Mar 18 08:11:25 crc kubenswrapper[4917]: I0318 08:11:25.690737 4917 generic.go:334] "Generic (PLEG): container finished" podID="25b11861-9ee4-4b04-a9b4-8736aeac1e15" containerID="9ba15e42b4f651241178cffa01129b1597ae47958d127e4e9ef3a95a61449001" exitCode=0 Mar 18 08:11:25 crc kubenswrapper[4917]: I0318 08:11:25.690843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zjqck" event={"ID":"25b11861-9ee4-4b04-a9b4-8736aeac1e15","Type":"ContainerDied","Data":"9ba15e42b4f651241178cffa01129b1597ae47958d127e4e9ef3a95a61449001"} Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.323046 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.468978 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfb5h\" (UniqueName: \"kubernetes.io/projected/25b11861-9ee4-4b04-a9b4-8736aeac1e15-kube-api-access-mfb5h\") pod \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.469130 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-combined-ca-bundle\") pod \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.469220 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-config-data\") pod \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\" (UID: \"25b11861-9ee4-4b04-a9b4-8736aeac1e15\") " Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.475313 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b11861-9ee4-4b04-a9b4-8736aeac1e15-kube-api-access-mfb5h" (OuterVolumeSpecName: "kube-api-access-mfb5h") pod "25b11861-9ee4-4b04-a9b4-8736aeac1e15" (UID: "25b11861-9ee4-4b04-a9b4-8736aeac1e15"). InnerVolumeSpecName "kube-api-access-mfb5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.511193 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25b11861-9ee4-4b04-a9b4-8736aeac1e15" (UID: "25b11861-9ee4-4b04-a9b4-8736aeac1e15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.523523 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-config-data" (OuterVolumeSpecName: "config-data") pod "25b11861-9ee4-4b04-a9b4-8736aeac1e15" (UID: "25b11861-9ee4-4b04-a9b4-8736aeac1e15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.571780 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.571816 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25b11861-9ee4-4b04-a9b4-8736aeac1e15-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.571829 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfb5h\" (UniqueName: \"kubernetes.io/projected/25b11861-9ee4-4b04-a9b4-8736aeac1e15-kube-api-access-mfb5h\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.715190 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zjqck" event={"ID":"25b11861-9ee4-4b04-a9b4-8736aeac1e15","Type":"ContainerDied","Data":"ec9d6e643f36eaa4df453a7bc95f6b0969fdc9631151a482f94ee13629475a8d"} Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.715250 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec9d6e643f36eaa4df453a7bc95f6b0969fdc9631151a482f94ee13629475a8d" Mar 18 08:11:27 crc kubenswrapper[4917]: I0318 08:11:27.715281 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zjqck" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.056769 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-vjjw6"] Mar 18 08:11:28 crc kubenswrapper[4917]: E0318 08:11:28.057169 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b11861-9ee4-4b04-a9b4-8736aeac1e15" containerName="keystone-db-sync" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.057188 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b11861-9ee4-4b04-a9b4-8736aeac1e15" containerName="keystone-db-sync" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.057382 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b11861-9ee4-4b04-a9b4-8736aeac1e15" containerName="keystone-db-sync" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.057964 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.064106 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.064326 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-62zk4" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.064428 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.064710 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.075281 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.078426 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c4cb5b59c-nxk29"] Mar 18 08:11:28 crc 
kubenswrapper[4917]: I0318 08:11:28.079953 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.092372 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vjjw6"] Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.103033 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c4cb5b59c-nxk29"] Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131442 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-scripts\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131483 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-config-data\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131508 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-credential-keys\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131532 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: 
\"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131562 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131576 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scfnm\" (UniqueName: \"kubernetes.io/projected/c5532fe0-43b7-4932-8c29-db1c6346720e-kube-api-access-scfnm\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131674 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-config\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131699 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-dns-svc\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131726 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-fernet-keys\") pod \"keystone-bootstrap-vjjw6\" 
(UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131751 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-combined-ca-bundle\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.131772 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kksf9\" (UniqueName: \"kubernetes.io/projected/ef5de187-2794-4a99-9be7-25c48156dd47-kube-api-access-kksf9\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.167659 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.232924 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-combined-ca-bundle\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233223 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kksf9\" (UniqueName: \"kubernetes.io/projected/ef5de187-2794-4a99-9be7-25c48156dd47-kube-api-access-kksf9\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233265 4917 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-scripts\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233283 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-config-data\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233308 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-credential-keys\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233330 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233357 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scfnm\" (UniqueName: \"kubernetes.io/projected/c5532fe0-43b7-4932-8c29-db1c6346720e-kube-api-access-scfnm\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233375 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233417 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-config\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233439 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-dns-svc\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.233477 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-fernet-keys\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.235421 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-dns-svc\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.235999 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-sb\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: 
\"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.236775 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-config\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.239577 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-combined-ca-bundle\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.240798 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-credential-keys\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.241840 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-scripts\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.243018 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-fernet-keys\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.243281 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-nb\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.252181 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-config-data\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.253462 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kksf9\" (UniqueName: \"kubernetes.io/projected/ef5de187-2794-4a99-9be7-25c48156dd47-kube-api-access-kksf9\") pod \"dnsmasq-dns-6c4cb5b59c-nxk29\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.255931 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scfnm\" (UniqueName: \"kubernetes.io/projected/c5532fe0-43b7-4932-8c29-db1c6346720e-kube-api-access-scfnm\") pod \"keystone-bootstrap-vjjw6\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.394439 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.424345 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.849740 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-vjjw6"] Mar 18 08:11:28 crc kubenswrapper[4917]: I0318 08:11:28.949873 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c4cb5b59c-nxk29"] Mar 18 08:11:29 crc kubenswrapper[4917]: W0318 08:11:29.036763 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5532fe0_43b7_4932_8c29_db1c6346720e.slice/crio-77b46eb241fd2d10d13b7c8f7cd061d5c0f9a1ace01b66284eb1b0d5271f9a46 WatchSource:0}: Error finding container 77b46eb241fd2d10d13b7c8f7cd061d5c0f9a1ace01b66284eb1b0d5271f9a46: Status 404 returned error can't find the container with id 77b46eb241fd2d10d13b7c8f7cd061d5c0f9a1ace01b66284eb1b0d5271f9a46 Mar 18 08:11:29 crc kubenswrapper[4917]: W0318 08:11:29.135954 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef5de187_2794_4a99_9be7_25c48156dd47.slice/crio-1103fe7f801e2df43302de771a283dd99686656b6a85e0031b283e0cd8ae74f6 WatchSource:0}: Error finding container 1103fe7f801e2df43302de771a283dd99686656b6a85e0031b283e0cd8ae74f6: Status 404 returned error can't find the container with id 1103fe7f801e2df43302de771a283dd99686656b6a85e0031b283e0cd8ae74f6 Mar 18 08:11:29 crc kubenswrapper[4917]: I0318 08:11:29.752269 4917 generic.go:334] "Generic (PLEG): container finished" podID="ef5de187-2794-4a99-9be7-25c48156dd47" containerID="17e682c7491c5a148c434f2bb460f62edc0a763324db1e909d8c5f2bc6b21a12" exitCode=0 Mar 18 08:11:29 crc kubenswrapper[4917]: I0318 08:11:29.754808 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" 
event={"ID":"ef5de187-2794-4a99-9be7-25c48156dd47","Type":"ContainerDied","Data":"17e682c7491c5a148c434f2bb460f62edc0a763324db1e909d8c5f2bc6b21a12"} Mar 18 08:11:29 crc kubenswrapper[4917]: I0318 08:11:29.754872 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" event={"ID":"ef5de187-2794-4a99-9be7-25c48156dd47","Type":"ContainerStarted","Data":"1103fe7f801e2df43302de771a283dd99686656b6a85e0031b283e0cd8ae74f6"} Mar 18 08:11:29 crc kubenswrapper[4917]: I0318 08:11:29.759821 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vjjw6" event={"ID":"c5532fe0-43b7-4932-8c29-db1c6346720e","Type":"ContainerStarted","Data":"644cdb8030a839044c71ac7afde2875311ed34a59e4fddc22e707852743fcf4a"} Mar 18 08:11:29 crc kubenswrapper[4917]: I0318 08:11:29.759867 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vjjw6" event={"ID":"c5532fe0-43b7-4932-8c29-db1c6346720e","Type":"ContainerStarted","Data":"77b46eb241fd2d10d13b7c8f7cd061d5c0f9a1ace01b66284eb1b0d5271f9a46"} Mar 18 08:11:29 crc kubenswrapper[4917]: I0318 08:11:29.806379 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-vjjw6" podStartSLOduration=1.806357687 podStartE2EDuration="1.806357687s" podCreationTimestamp="2026-03-18 08:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:11:29.803647164 +0000 UTC m=+5074.744801898" watchObservedRunningTime="2026-03-18 08:11:29.806357687 +0000 UTC m=+5074.747512401" Mar 18 08:11:30 crc kubenswrapper[4917]: I0318 08:11:30.769613 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" event={"ID":"ef5de187-2794-4a99-9be7-25c48156dd47","Type":"ContainerStarted","Data":"61b82be59dd54b705e13c2a27173582e46912084dc6e45ad1dd0473ee8ef3b31"} Mar 18 08:11:30 crc 
kubenswrapper[4917]: I0318 08:11:30.769911 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:30 crc kubenswrapper[4917]: I0318 08:11:30.815000 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" podStartSLOduration=2.814977374 podStartE2EDuration="2.814977374s" podCreationTimestamp="2026-03-18 08:11:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:11:30.800266142 +0000 UTC m=+5075.741420866" watchObservedRunningTime="2026-03-18 08:11:30.814977374 +0000 UTC m=+5075.756132108" Mar 18 08:11:32 crc kubenswrapper[4917]: I0318 08:11:32.787930 4917 generic.go:334] "Generic (PLEG): container finished" podID="c5532fe0-43b7-4932-8c29-db1c6346720e" containerID="644cdb8030a839044c71ac7afde2875311ed34a59e4fddc22e707852743fcf4a" exitCode=0 Mar 18 08:11:32 crc kubenswrapper[4917]: I0318 08:11:32.788027 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vjjw6" event={"ID":"c5532fe0-43b7-4932-8c29-db1c6346720e","Type":"ContainerDied","Data":"644cdb8030a839044c71ac7afde2875311ed34a59e4fddc22e707852743fcf4a"} Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.240229 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.316743 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-credential-keys\") pod \"c5532fe0-43b7-4932-8c29-db1c6346720e\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.316867 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-config-data\") pod \"c5532fe0-43b7-4932-8c29-db1c6346720e\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.316913 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-combined-ca-bundle\") pod \"c5532fe0-43b7-4932-8c29-db1c6346720e\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.316991 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-fernet-keys\") pod \"c5532fe0-43b7-4932-8c29-db1c6346720e\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.317052 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-scripts\") pod \"c5532fe0-43b7-4932-8c29-db1c6346720e\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.317332 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scfnm\" (UniqueName: 
\"kubernetes.io/projected/c5532fe0-43b7-4932-8c29-db1c6346720e-kube-api-access-scfnm\") pod \"c5532fe0-43b7-4932-8c29-db1c6346720e\" (UID: \"c5532fe0-43b7-4932-8c29-db1c6346720e\") " Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.327083 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c5532fe0-43b7-4932-8c29-db1c6346720e" (UID: "c5532fe0-43b7-4932-8c29-db1c6346720e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.327489 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c5532fe0-43b7-4932-8c29-db1c6346720e" (UID: "c5532fe0-43b7-4932-8c29-db1c6346720e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.329083 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5532fe0-43b7-4932-8c29-db1c6346720e-kube-api-access-scfnm" (OuterVolumeSpecName: "kube-api-access-scfnm") pod "c5532fe0-43b7-4932-8c29-db1c6346720e" (UID: "c5532fe0-43b7-4932-8c29-db1c6346720e"). InnerVolumeSpecName "kube-api-access-scfnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.335766 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-scripts" (OuterVolumeSpecName: "scripts") pod "c5532fe0-43b7-4932-8c29-db1c6346720e" (UID: "c5532fe0-43b7-4932-8c29-db1c6346720e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.356512 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5532fe0-43b7-4932-8c29-db1c6346720e" (UID: "c5532fe0-43b7-4932-8c29-db1c6346720e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.367716 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-config-data" (OuterVolumeSpecName: "config-data") pod "c5532fe0-43b7-4932-8c29-db1c6346720e" (UID: "c5532fe0-43b7-4932-8c29-db1c6346720e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.419632 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.419661 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.419670 4917 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.419678 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:34 crc 
kubenswrapper[4917]: I0318 08:11:34.419687 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scfnm\" (UniqueName: \"kubernetes.io/projected/c5532fe0-43b7-4932-8c29-db1c6346720e-kube-api-access-scfnm\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.419697 4917 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c5532fe0-43b7-4932-8c29-db1c6346720e-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.813426 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-vjjw6" event={"ID":"c5532fe0-43b7-4932-8c29-db1c6346720e","Type":"ContainerDied","Data":"77b46eb241fd2d10d13b7c8f7cd061d5c0f9a1ace01b66284eb1b0d5271f9a46"} Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.813782 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77b46eb241fd2d10d13b7c8f7cd061d5c0f9a1ace01b66284eb1b0d5271f9a46" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.813566 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-vjjw6" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.913576 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-vjjw6"] Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.927689 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-vjjw6"] Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.996681 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-5fnxk"] Mar 18 08:11:34 crc kubenswrapper[4917]: E0318 08:11:34.997186 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5532fe0-43b7-4932-8c29-db1c6346720e" containerName="keystone-bootstrap" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.997202 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5532fe0-43b7-4932-8c29-db1c6346720e" containerName="keystone-bootstrap" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.997379 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5532fe0-43b7-4932-8c29-db1c6346720e" containerName="keystone-bootstrap" Mar 18 08:11:34 crc kubenswrapper[4917]: I0318 08:11:34.997925 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.003423 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.003537 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.003570 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.003719 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.003967 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-62zk4" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.021383 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5fnxk"] Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.135152 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-config-data\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.135203 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-credential-keys\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.135243 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-scripts\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.135327 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcbsd\" (UniqueName: \"kubernetes.io/projected/f189520b-8456-4b79-ad3d-72c93b09a4d1-kube-api-access-hcbsd\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.135379 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-fernet-keys\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.135473 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-combined-ca-bundle\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.236612 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcbsd\" (UniqueName: \"kubernetes.io/projected/f189520b-8456-4b79-ad3d-72c93b09a4d1-kube-api-access-hcbsd\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.236712 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-fernet-keys\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.236756 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-combined-ca-bundle\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.236825 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-config-data\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.236848 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-credential-keys\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.236881 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-scripts\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.241373 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-scripts\") pod \"keystone-bootstrap-5fnxk\" (UID: 
\"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.241858 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-config-data\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.243340 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-credential-keys\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.245435 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-fernet-keys\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.250728 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-combined-ca-bundle\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.256753 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcbsd\" (UniqueName: \"kubernetes.io/projected/f189520b-8456-4b79-ad3d-72c93b09a4d1-kube-api-access-hcbsd\") pod \"keystone-bootstrap-5fnxk\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 
08:11:35.321309 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.789032 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5532fe0-43b7-4932-8c29-db1c6346720e" path="/var/lib/kubelet/pods/c5532fe0-43b7-4932-8c29-db1c6346720e/volumes" Mar 18 08:11:35 crc kubenswrapper[4917]: I0318 08:11:35.833041 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-5fnxk"] Mar 18 08:11:36 crc kubenswrapper[4917]: I0318 08:11:36.839419 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5fnxk" event={"ID":"f189520b-8456-4b79-ad3d-72c93b09a4d1","Type":"ContainerStarted","Data":"a5d0067c66f7e11569b5cc505da1a8d3342fc11ba0785dd4c2f68c765355accf"} Mar 18 08:11:36 crc kubenswrapper[4917]: I0318 08:11:36.839831 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5fnxk" event={"ID":"f189520b-8456-4b79-ad3d-72c93b09a4d1","Type":"ContainerStarted","Data":"b0c751dbbb6e151a7869db254c43d256b331018cce5653e38a9de84ad68bb5ca"} Mar 18 08:11:36 crc kubenswrapper[4917]: I0318 08:11:36.867649 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-5fnxk" podStartSLOduration=2.867620122 podStartE2EDuration="2.867620122s" podCreationTimestamp="2026-03-18 08:11:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:11:36.859406301 +0000 UTC m=+5081.800561035" watchObservedRunningTime="2026-03-18 08:11:36.867620122 +0000 UTC m=+5081.808774846" Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.426803 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.477190 4917 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58858ff77c-7cz7s"] Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.477811 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" podUID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerName="dnsmasq-dns" containerID="cri-o://208c281c357b755f3b243e20e5fc11a793fd5b9b136f7e64f4de69a2dba8fe1a" gracePeriod=10 Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.627705 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" podUID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.45:5353: connect: connection refused" Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.867517 4917 generic.go:334] "Generic (PLEG): container finished" podID="f189520b-8456-4b79-ad3d-72c93b09a4d1" containerID="a5d0067c66f7e11569b5cc505da1a8d3342fc11ba0785dd4c2f68c765355accf" exitCode=0 Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.867643 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5fnxk" event={"ID":"f189520b-8456-4b79-ad3d-72c93b09a4d1","Type":"ContainerDied","Data":"a5d0067c66f7e11569b5cc505da1a8d3342fc11ba0785dd4c2f68c765355accf"} Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.870788 4917 generic.go:334] "Generic (PLEG): container finished" podID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerID="208c281c357b755f3b243e20e5fc11a793fd5b9b136f7e64f4de69a2dba8fe1a" exitCode=0 Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.870840 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" event={"ID":"07f76cb1-2b75-4de2-aee9-538ac9cf035b","Type":"ContainerDied","Data":"208c281c357b755f3b243e20e5fc11a793fd5b9b136f7e64f4de69a2dba8fe1a"} Mar 18 08:11:38 crc kubenswrapper[4917]: I0318 08:11:38.933005 4917 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.022399 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-dns-svc\") pod \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.022450 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-nb\") pod \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.022496 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgvnw\" (UniqueName: \"kubernetes.io/projected/07f76cb1-2b75-4de2-aee9-538ac9cf035b-kube-api-access-tgvnw\") pod \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.022572 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-config\") pod \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.022645 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-sb\") pod \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\" (UID: \"07f76cb1-2b75-4de2-aee9-538ac9cf035b\") " Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.028522 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/07f76cb1-2b75-4de2-aee9-538ac9cf035b-kube-api-access-tgvnw" (OuterVolumeSpecName: "kube-api-access-tgvnw") pod "07f76cb1-2b75-4de2-aee9-538ac9cf035b" (UID: "07f76cb1-2b75-4de2-aee9-538ac9cf035b"). InnerVolumeSpecName "kube-api-access-tgvnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.063153 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07f76cb1-2b75-4de2-aee9-538ac9cf035b" (UID: "07f76cb1-2b75-4de2-aee9-538ac9cf035b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.063879 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07f76cb1-2b75-4de2-aee9-538ac9cf035b" (UID: "07f76cb1-2b75-4de2-aee9-538ac9cf035b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.065289 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07f76cb1-2b75-4de2-aee9-538ac9cf035b" (UID: "07f76cb1-2b75-4de2-aee9-538ac9cf035b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.066421 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-config" (OuterVolumeSpecName: "config") pod "07f76cb1-2b75-4de2-aee9-538ac9cf035b" (UID: "07f76cb1-2b75-4de2-aee9-538ac9cf035b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.125111 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.125158 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.125170 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.125181 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgvnw\" (UniqueName: \"kubernetes.io/projected/07f76cb1-2b75-4de2-aee9-538ac9cf035b-kube-api-access-tgvnw\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.125192 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07f76cb1-2b75-4de2-aee9-538ac9cf035b-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.901765 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" event={"ID":"07f76cb1-2b75-4de2-aee9-538ac9cf035b","Type":"ContainerDied","Data":"fcd86ed358c68aad60489b7a86930ab6d0f225764ca638ec883b512baf722089"} Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.901933 4917 scope.go:117] "RemoveContainer" containerID="208c281c357b755f3b243e20e5fc11a793fd5b9b136f7e64f4de69a2dba8fe1a" Mar 18 08:11:39 crc kubenswrapper[4917]: I0318 08:11:39.901952 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58858ff77c-7cz7s" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.123001 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58858ff77c-7cz7s"] Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.128677 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58858ff77c-7cz7s"] Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.132105 4917 scope.go:117] "RemoveContainer" containerID="6cba96f313f4036bdd9e25ffd922a51ef0937deede9c4288eb2cc90ca6a16cce" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.346394 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.448502 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-combined-ca-bundle\") pod \"f189520b-8456-4b79-ad3d-72c93b09a4d1\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.448545 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-credential-keys\") pod \"f189520b-8456-4b79-ad3d-72c93b09a4d1\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.448674 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-scripts\") pod \"f189520b-8456-4b79-ad3d-72c93b09a4d1\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.448729 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-config-data\") pod \"f189520b-8456-4b79-ad3d-72c93b09a4d1\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.448745 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-fernet-keys\") pod \"f189520b-8456-4b79-ad3d-72c93b09a4d1\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.448778 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcbsd\" (UniqueName: \"kubernetes.io/projected/f189520b-8456-4b79-ad3d-72c93b09a4d1-kube-api-access-hcbsd\") pod \"f189520b-8456-4b79-ad3d-72c93b09a4d1\" (UID: \"f189520b-8456-4b79-ad3d-72c93b09a4d1\") " Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.452929 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-scripts" (OuterVolumeSpecName: "scripts") pod "f189520b-8456-4b79-ad3d-72c93b09a4d1" (UID: "f189520b-8456-4b79-ad3d-72c93b09a4d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.453531 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f189520b-8456-4b79-ad3d-72c93b09a4d1" (UID: "f189520b-8456-4b79-ad3d-72c93b09a4d1"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.454113 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f189520b-8456-4b79-ad3d-72c93b09a4d1" (UID: "f189520b-8456-4b79-ad3d-72c93b09a4d1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.454439 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f189520b-8456-4b79-ad3d-72c93b09a4d1-kube-api-access-hcbsd" (OuterVolumeSpecName: "kube-api-access-hcbsd") pod "f189520b-8456-4b79-ad3d-72c93b09a4d1" (UID: "f189520b-8456-4b79-ad3d-72c93b09a4d1"). InnerVolumeSpecName "kube-api-access-hcbsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.473365 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f189520b-8456-4b79-ad3d-72c93b09a4d1" (UID: "f189520b-8456-4b79-ad3d-72c93b09a4d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.475441 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-config-data" (OuterVolumeSpecName: "config-data") pod "f189520b-8456-4b79-ad3d-72c93b09a4d1" (UID: "f189520b-8456-4b79-ad3d-72c93b09a4d1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.550415 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.550453 4917 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.550465 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.550475 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.550486 4917 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f189520b-8456-4b79-ad3d-72c93b09a4d1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.550496 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcbsd\" (UniqueName: \"kubernetes.io/projected/f189520b-8456-4b79-ad3d-72c93b09a4d1-kube-api-access-hcbsd\") on node \"crc\" DevicePath \"\"" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.916164 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-5fnxk" Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.916087 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-5fnxk" event={"ID":"f189520b-8456-4b79-ad3d-72c93b09a4d1","Type":"ContainerDied","Data":"b0c751dbbb6e151a7869db254c43d256b331018cce5653e38a9de84ad68bb5ca"} Mar 18 08:11:40 crc kubenswrapper[4917]: I0318 08:11:40.916500 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0c751dbbb6e151a7869db254c43d256b331018cce5653e38a9de84ad68bb5ca" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.099647 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7cd97484f6-89g6g"] Mar 18 08:11:41 crc kubenswrapper[4917]: E0318 08:11:41.100429 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerName="init" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.100452 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerName="init" Mar 18 08:11:41 crc kubenswrapper[4917]: E0318 08:11:41.100471 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerName="dnsmasq-dns" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.100480 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerName="dnsmasq-dns" Mar 18 08:11:41 crc kubenswrapper[4917]: E0318 08:11:41.100503 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f189520b-8456-4b79-ad3d-72c93b09a4d1" containerName="keystone-bootstrap" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.100511 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f189520b-8456-4b79-ad3d-72c93b09a4d1" containerName="keystone-bootstrap" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.100731 4917 
memory_manager.go:354] "RemoveStaleState removing state" podUID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" containerName="dnsmasq-dns" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.100749 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f189520b-8456-4b79-ad3d-72c93b09a4d1" containerName="keystone-bootstrap" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.101370 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.108260 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.108407 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-62zk4" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.108413 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.108425 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.108633 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.114199 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cd97484f6-89g6g"] Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.119333 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.162039 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-scripts\") pod \"keystone-7cd97484f6-89g6g\" (UID: 
\"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.162103 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-internal-tls-certs\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.162173 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-config-data\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.162199 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-fernet-keys\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.162234 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-public-tls-certs\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.162264 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-combined-ca-bundle\") pod \"keystone-7cd97484f6-89g6g\" 
(UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.162288 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr6n4\" (UniqueName: \"kubernetes.io/projected/be44e918-695b-4764-b584-9698eeabb807-kube-api-access-vr6n4\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.162343 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-credential-keys\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.264045 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-credential-keys\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.264644 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-scripts\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.264789 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-internal-tls-certs\") pod \"keystone-7cd97484f6-89g6g\" (UID: 
\"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.264918 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-config-data\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.265046 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-fernet-keys\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.265186 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-public-tls-certs\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.265300 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-combined-ca-bundle\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.265382 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr6n4\" (UniqueName: \"kubernetes.io/projected/be44e918-695b-4764-b584-9698eeabb807-kube-api-access-vr6n4\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " 
pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.270083 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-public-tls-certs\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.271313 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-scripts\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.272214 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-config-data\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.279204 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-combined-ca-bundle\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.279661 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-fernet-keys\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.280422 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-internal-tls-certs\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.282234 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/be44e918-695b-4764-b584-9698eeabb807-credential-keys\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.286332 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr6n4\" (UniqueName: \"kubernetes.io/projected/be44e918-695b-4764-b584-9698eeabb807-kube-api-access-vr6n4\") pod \"keystone-7cd97484f6-89g6g\" (UID: \"be44e918-695b-4764-b584-9698eeabb807\") " pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.466818 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:41 crc kubenswrapper[4917]: I0318 08:11:41.788656 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07f76cb1-2b75-4de2-aee9-538ac9cf035b" path="/var/lib/kubelet/pods/07f76cb1-2b75-4de2-aee9-538ac9cf035b/volumes" Mar 18 08:11:42 crc kubenswrapper[4917]: I0318 08:11:42.289471 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cd97484f6-89g6g"] Mar 18 08:11:42 crc kubenswrapper[4917]: I0318 08:11:42.945551 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cd97484f6-89g6g" event={"ID":"be44e918-695b-4764-b584-9698eeabb807","Type":"ContainerStarted","Data":"9b750e4ddc32758c8201680e106e7943bb35cb2c940e69eb2d97fdd150623023"} Mar 18 08:11:42 crc kubenswrapper[4917]: I0318 08:11:42.945915 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cd97484f6-89g6g" event={"ID":"be44e918-695b-4764-b584-9698eeabb807","Type":"ContainerStarted","Data":"4630170dd9e5b732f9628572d303e742d0cd57889f34cd42f86e12338752cd1f"} Mar 18 08:11:42 crc kubenswrapper[4917]: I0318 08:11:42.945937 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:11:42 crc kubenswrapper[4917]: I0318 08:11:42.972136 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7cd97484f6-89g6g" podStartSLOduration=1.972121511 podStartE2EDuration="1.972121511s" podCreationTimestamp="2026-03-18 08:11:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:11:42.971269201 +0000 UTC m=+5087.912423935" watchObservedRunningTime="2026-03-18 08:11:42.972121511 +0000 UTC m=+5087.913276225" Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.146849 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563692-k6hj5"] Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.151541 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563692-k6hj5" Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.155073 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.155706 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.157680 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.157807 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563692-k6hj5"] Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.328702 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5k9s\" (UniqueName: \"kubernetes.io/projected/abfd30f8-cac5-4990-8237-9a0381bd36a0-kube-api-access-d5k9s\") pod \"auto-csr-approver-29563692-k6hj5\" (UID: \"abfd30f8-cac5-4990-8237-9a0381bd36a0\") " pod="openshift-infra/auto-csr-approver-29563692-k6hj5" Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.430743 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5k9s\" (UniqueName: \"kubernetes.io/projected/abfd30f8-cac5-4990-8237-9a0381bd36a0-kube-api-access-d5k9s\") pod \"auto-csr-approver-29563692-k6hj5\" (UID: \"abfd30f8-cac5-4990-8237-9a0381bd36a0\") " pod="openshift-infra/auto-csr-approver-29563692-k6hj5" Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.467042 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5k9s\" (UniqueName: 
\"kubernetes.io/projected/abfd30f8-cac5-4990-8237-9a0381bd36a0-kube-api-access-d5k9s\") pod \"auto-csr-approver-29563692-k6hj5\" (UID: \"abfd30f8-cac5-4990-8237-9a0381bd36a0\") " pod="openshift-infra/auto-csr-approver-29563692-k6hj5" Mar 18 08:12:00 crc kubenswrapper[4917]: I0318 08:12:00.477488 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563692-k6hj5" Mar 18 08:12:01 crc kubenswrapper[4917]: I0318 08:12:01.051820 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563692-k6hj5"] Mar 18 08:12:01 crc kubenswrapper[4917]: I0318 08:12:01.139219 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563692-k6hj5" event={"ID":"abfd30f8-cac5-4990-8237-9a0381bd36a0","Type":"ContainerStarted","Data":"50d433c2eada6490fa52e4bc8661b484f64e4b484535b9df4e28c30f4f7ea127"} Mar 18 08:12:02 crc kubenswrapper[4917]: I0318 08:12:02.929041 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:12:02 crc kubenswrapper[4917]: I0318 08:12:02.929738 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:12:03 crc kubenswrapper[4917]: I0318 08:12:03.164212 4917 generic.go:334] "Generic (PLEG): container finished" podID="abfd30f8-cac5-4990-8237-9a0381bd36a0" containerID="b845de4e4b0fddb0e1f6ec2c1cb7676a693c25921a229bae7da663095b22b2d7" exitCode=0 Mar 18 08:12:03 crc kubenswrapper[4917]: I0318 08:12:03.164299 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563692-k6hj5" event={"ID":"abfd30f8-cac5-4990-8237-9a0381bd36a0","Type":"ContainerDied","Data":"b845de4e4b0fddb0e1f6ec2c1cb7676a693c25921a229bae7da663095b22b2d7"} Mar 18 08:12:04 crc kubenswrapper[4917]: I0318 08:12:04.586505 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563692-k6hj5" Mar 18 08:12:04 crc kubenswrapper[4917]: I0318 08:12:04.710555 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5k9s\" (UniqueName: \"kubernetes.io/projected/abfd30f8-cac5-4990-8237-9a0381bd36a0-kube-api-access-d5k9s\") pod \"abfd30f8-cac5-4990-8237-9a0381bd36a0\" (UID: \"abfd30f8-cac5-4990-8237-9a0381bd36a0\") " Mar 18 08:12:04 crc kubenswrapper[4917]: I0318 08:12:04.717020 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abfd30f8-cac5-4990-8237-9a0381bd36a0-kube-api-access-d5k9s" (OuterVolumeSpecName: "kube-api-access-d5k9s") pod "abfd30f8-cac5-4990-8237-9a0381bd36a0" (UID: "abfd30f8-cac5-4990-8237-9a0381bd36a0"). InnerVolumeSpecName "kube-api-access-d5k9s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:12:04 crc kubenswrapper[4917]: I0318 08:12:04.815506 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5k9s\" (UniqueName: \"kubernetes.io/projected/abfd30f8-cac5-4990-8237-9a0381bd36a0-kube-api-access-d5k9s\") on node \"crc\" DevicePath \"\"" Mar 18 08:12:05 crc kubenswrapper[4917]: I0318 08:12:05.187471 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563692-k6hj5" event={"ID":"abfd30f8-cac5-4990-8237-9a0381bd36a0","Type":"ContainerDied","Data":"50d433c2eada6490fa52e4bc8661b484f64e4b484535b9df4e28c30f4f7ea127"} Mar 18 08:12:05 crc kubenswrapper[4917]: I0318 08:12:05.187512 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d433c2eada6490fa52e4bc8661b484f64e4b484535b9df4e28c30f4f7ea127" Mar 18 08:12:05 crc kubenswrapper[4917]: I0318 08:12:05.188043 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563692-k6hj5" Mar 18 08:12:05 crc kubenswrapper[4917]: I0318 08:12:05.668861 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563686-s8hgz"] Mar 18 08:12:05 crc kubenswrapper[4917]: I0318 08:12:05.679270 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563686-s8hgz"] Mar 18 08:12:05 crc kubenswrapper[4917]: I0318 08:12:05.796976 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7061c831-cc15-4ad1-8748-2327a5bea84e" path="/var/lib/kubelet/pods/7061c831-cc15-4ad1-8748-2327a5bea84e/volumes" Mar 18 08:12:07 crc kubenswrapper[4917]: I0318 08:12:07.275767 4917 scope.go:117] "RemoveContainer" containerID="fe111208ba0414c8f247d915a0bc05e53a8a3c35d4fda2007b4cb160eb94872f" Mar 18 08:12:07 crc kubenswrapper[4917]: I0318 08:12:07.324083 4917 scope.go:117] "RemoveContainer" 
containerID="4cc0be2c51a22908fbfc14e2bcd4cb64a04baa8286228c019cb76b36c3428d40" Mar 18 08:12:10 crc kubenswrapper[4917]: I0318 08:12:10.097041 4917 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod07f76cb1-2b75-4de2-aee9-538ac9cf035b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod07f76cb1-2b75-4de2-aee9-538ac9cf035b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod07f76cb1_2b75_4de2_aee9_538ac9cf035b.slice" Mar 18 08:12:12 crc kubenswrapper[4917]: I0318 08:12:12.977046 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7cd97484f6-89g6g" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.667936 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 08:12:17 crc kubenswrapper[4917]: E0318 08:12:17.669802 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abfd30f8-cac5-4990-8237-9a0381bd36a0" containerName="oc" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.669883 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="abfd30f8-cac5-4990-8237-9a0381bd36a0" containerName="oc" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.670092 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="abfd30f8-cac5-4990-8237-9a0381bd36a0" containerName="oc" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.670740 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.672960 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.674724 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.678253 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-4kh6m" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.701190 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.703981 4917 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e4f2077-f8ee-4a71-81a2-b68b775e30c6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T08:12:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T08:12:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T08:12:17Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-18T08:12:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.rdoproject.org/podified-antelope-centos9/openstack-openstackclient:059169826d1e668c44c01b5bb9959b22\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem\\\",\\\"name\\\":\\\"combined-ca-bundle\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l5n85\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-18T08:12:17Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.711851 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 08:12:17 crc kubenswrapper[4917]: E0318 08:12:17.712547 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle kube-api-access-l5n85 openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[combined-ca-bundle kube-api-access-l5n85 openstack-config openstack-config-secret]: context canceled" 
pod="openstack/openstackclient" podUID="6e4f2077-f8ee-4a71-81a2-b68b775e30c6" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.720832 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.740656 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.744315 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.752036 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.775217 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6e4f2077-f8ee-4a71-81a2-b68b775e30c6" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.785199 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.785257 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.785344 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.785410 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44tl8\" (UniqueName: \"kubernetes.io/projected/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-kube-api-access-44tl8\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.787101 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e4f2077-f8ee-4a71-81a2-b68b775e30c6" path="/var/lib/kubelet/pods/6e4f2077-f8ee-4a71-81a2-b68b775e30c6/volumes" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.887178 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.887246 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.887391 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 
08:12:17.887445 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44tl8\" (UniqueName: \"kubernetes.io/projected/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-kube-api-access-44tl8\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.888678 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.896653 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config-secret\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.904052 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44tl8\" (UniqueName: \"kubernetes.io/projected/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-kube-api-access-44tl8\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:17 crc kubenswrapper[4917]: I0318 08:12:17.904643 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " pod="openstack/openstackclient" Mar 18 08:12:18 crc kubenswrapper[4917]: I0318 08:12:18.092554 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 08:12:18 crc kubenswrapper[4917]: I0318 08:12:18.308192 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 08:12:18 crc kubenswrapper[4917]: I0318 08:12:18.312056 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6e4f2077-f8ee-4a71-81a2-b68b775e30c6" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" Mar 18 08:12:18 crc kubenswrapper[4917]: I0318 08:12:18.318448 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 08:12:18 crc kubenswrapper[4917]: I0318 08:12:18.321330 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6e4f2077-f8ee-4a71-81a2-b68b775e30c6" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" Mar 18 08:12:18 crc kubenswrapper[4917]: I0318 08:12:18.856724 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 08:12:18 crc kubenswrapper[4917]: W0318 08:12:18.862238 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda64a3f74_fdb0_45c0_970b_0fc6220fb8aa.slice/crio-96dd43340541ef22aa249174fd61eb329a322b654716e74dccd405706cf082f0 WatchSource:0}: Error finding container 96dd43340541ef22aa249174fd61eb329a322b654716e74dccd405706cf082f0: Status 404 returned error can't find the container with id 96dd43340541ef22aa249174fd61eb329a322b654716e74dccd405706cf082f0 Mar 18 08:12:19 crc kubenswrapper[4917]: I0318 08:12:19.320524 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 08:12:19 crc kubenswrapper[4917]: I0318 08:12:19.320526 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa","Type":"ContainerStarted","Data":"96dd43340541ef22aa249174fd61eb329a322b654716e74dccd405706cf082f0"} Mar 18 08:12:19 crc kubenswrapper[4917]: I0318 08:12:19.323966 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6e4f2077-f8ee-4a71-81a2-b68b775e30c6" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" Mar 18 08:12:19 crc kubenswrapper[4917]: I0318 08:12:19.344358 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6e4f2077-f8ee-4a71-81a2-b68b775e30c6" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" Mar 18 08:12:30 crc kubenswrapper[4917]: I0318 08:12:30.431155 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa","Type":"ContainerStarted","Data":"8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96"} Mar 18 08:12:30 crc kubenswrapper[4917]: I0318 08:12:30.453672 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.047167015 podStartE2EDuration="13.453628451s" podCreationTimestamp="2026-03-18 08:12:17 +0000 UTC" firstStartedPulling="2026-03-18 08:12:18.865372811 +0000 UTC m=+5123.806527525" lastFinishedPulling="2026-03-18 08:12:29.271834237 +0000 UTC m=+5134.212988961" observedRunningTime="2026-03-18 08:12:30.446567327 +0000 UTC m=+5135.387722111" watchObservedRunningTime="2026-03-18 08:12:30.453628451 +0000 UTC m=+5135.394783195" Mar 18 08:12:32 crc kubenswrapper[4917]: I0318 08:12:32.928923 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:12:32 crc kubenswrapper[4917]: I0318 08:12:32.929237 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:13:02 crc kubenswrapper[4917]: I0318 08:13:02.929807 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:13:02 crc kubenswrapper[4917]: I0318 08:13:02.930460 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:13:02 crc kubenswrapper[4917]: I0318 08:13:02.930514 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:13:02 crc kubenswrapper[4917]: I0318 08:13:02.931493 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4857c0f586b258498a2f24a9ee2cbd7165480a218adc673893556e562f68bc3b"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:13:02 crc 
kubenswrapper[4917]: I0318 08:13:02.931557 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://4857c0f586b258498a2f24a9ee2cbd7165480a218adc673893556e562f68bc3b" gracePeriod=600 Mar 18 08:13:03 crc kubenswrapper[4917]: I0318 08:13:03.786190 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="4857c0f586b258498a2f24a9ee2cbd7165480a218adc673893556e562f68bc3b" exitCode=0 Mar 18 08:13:03 crc kubenswrapper[4917]: I0318 08:13:03.803219 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"4857c0f586b258498a2f24a9ee2cbd7165480a218adc673893556e562f68bc3b"} Mar 18 08:13:03 crc kubenswrapper[4917]: I0318 08:13:03.808036 4917 scope.go:117] "RemoveContainer" containerID="86429f861c4eb475a742d32fd3693f71c0a34cee137cca9ef1b34c9f4b247c1b" Mar 18 08:13:04 crc kubenswrapper[4917]: I0318 08:13:04.802089 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1"} Mar 18 08:13:07 crc kubenswrapper[4917]: I0318 08:13:07.414211 4917 scope.go:117] "RemoveContainer" containerID="24fd371ad43a46eac5352ccc44df51e7a629b836a1acc620d39235ea332bb7e7" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.099001 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-d5rpk"] Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.100517 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.117560 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-d5rpk"] Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.205347 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2baa-account-create-update-ctf24"] Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.206999 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.208800 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.218260 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2baa-account-create-update-ctf24"] Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.280142 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgqxx\" (UniqueName: \"kubernetes.io/projected/550d2002-958b-45b5-8618-ab8065124e2f-kube-api-access-bgqxx\") pod \"barbican-db-create-d5rpk\" (UID: \"550d2002-958b-45b5-8618-ab8065124e2f\") " pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.280198 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/550d2002-958b-45b5-8618-ab8065124e2f-operator-scripts\") pod \"barbican-db-create-d5rpk\" (UID: \"550d2002-958b-45b5-8618-ab8065124e2f\") " pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.381752 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5wh6\" (UniqueName: 
\"kubernetes.io/projected/41d247d8-d97f-44ba-b769-5f13a5973355-kube-api-access-k5wh6\") pod \"barbican-2baa-account-create-update-ctf24\" (UID: \"41d247d8-d97f-44ba-b769-5f13a5973355\") " pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.381907 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgqxx\" (UniqueName: \"kubernetes.io/projected/550d2002-958b-45b5-8618-ab8065124e2f-kube-api-access-bgqxx\") pod \"barbican-db-create-d5rpk\" (UID: \"550d2002-958b-45b5-8618-ab8065124e2f\") " pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.382948 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/550d2002-958b-45b5-8618-ab8065124e2f-operator-scripts\") pod \"barbican-db-create-d5rpk\" (UID: \"550d2002-958b-45b5-8618-ab8065124e2f\") " pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.383046 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d247d8-d97f-44ba-b769-5f13a5973355-operator-scripts\") pod \"barbican-2baa-account-create-update-ctf24\" (UID: \"41d247d8-d97f-44ba-b769-5f13a5973355\") " pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.383810 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/550d2002-958b-45b5-8618-ab8065124e2f-operator-scripts\") pod \"barbican-db-create-d5rpk\" (UID: \"550d2002-958b-45b5-8618-ab8065124e2f\") " pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.420689 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bgqxx\" (UniqueName: \"kubernetes.io/projected/550d2002-958b-45b5-8618-ab8065124e2f-kube-api-access-bgqxx\") pod \"barbican-db-create-d5rpk\" (UID: \"550d2002-958b-45b5-8618-ab8065124e2f\") " pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.485286 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d247d8-d97f-44ba-b769-5f13a5973355-operator-scripts\") pod \"barbican-2baa-account-create-update-ctf24\" (UID: \"41d247d8-d97f-44ba-b769-5f13a5973355\") " pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.485383 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5wh6\" (UniqueName: \"kubernetes.io/projected/41d247d8-d97f-44ba-b769-5f13a5973355-kube-api-access-k5wh6\") pod \"barbican-2baa-account-create-update-ctf24\" (UID: \"41d247d8-d97f-44ba-b769-5f13a5973355\") " pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.486277 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d247d8-d97f-44ba-b769-5f13a5973355-operator-scripts\") pod \"barbican-2baa-account-create-update-ctf24\" (UID: \"41d247d8-d97f-44ba-b769-5f13a5973355\") " pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.504306 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5wh6\" (UniqueName: \"kubernetes.io/projected/41d247d8-d97f-44ba-b769-5f13a5973355-kube-api-access-k5wh6\") pod \"barbican-2baa-account-create-update-ctf24\" (UID: \"41d247d8-d97f-44ba-b769-5f13a5973355\") " pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.530449 4917 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.718923 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.955112 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-d5rpk"] Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.971176 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2baa-account-create-update-ctf24"] Mar 18 08:13:55 crc kubenswrapper[4917]: W0318 08:13:55.978494 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41d247d8_d97f_44ba_b769_5f13a5973355.slice/crio-c9cfdfcd8557e223024af8e1f61252f6d03cff9107e59e8ebd4cedd9beb24bf4 WatchSource:0}: Error finding container c9cfdfcd8557e223024af8e1f61252f6d03cff9107e59e8ebd4cedd9beb24bf4: Status 404 returned error can't find the container with id c9cfdfcd8557e223024af8e1f61252f6d03cff9107e59e8ebd4cedd9beb24bf4 Mar 18 08:13:55 crc kubenswrapper[4917]: I0318 08:13:55.982446 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 18 08:13:56 crc kubenswrapper[4917]: I0318 08:13:56.309260 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2baa-account-create-update-ctf24" event={"ID":"41d247d8-d97f-44ba-b769-5f13a5973355","Type":"ContainerStarted","Data":"e3fd0f4211d8d8b2af47925b0efb65b133f4de0ea28bc94e5240b1d179bfe9f4"} Mar 18 08:13:56 crc kubenswrapper[4917]: I0318 08:13:56.309332 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2baa-account-create-update-ctf24" 
event={"ID":"41d247d8-d97f-44ba-b769-5f13a5973355","Type":"ContainerStarted","Data":"c9cfdfcd8557e223024af8e1f61252f6d03cff9107e59e8ebd4cedd9beb24bf4"} Mar 18 08:13:56 crc kubenswrapper[4917]: I0318 08:13:56.311426 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-d5rpk" event={"ID":"550d2002-958b-45b5-8618-ab8065124e2f","Type":"ContainerStarted","Data":"622916c3b4ad66a4938df690f52d88a166aa619fa4c03fe7bef870621bbcb30d"} Mar 18 08:13:56 crc kubenswrapper[4917]: I0318 08:13:56.311476 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-d5rpk" event={"ID":"550d2002-958b-45b5-8618-ab8065124e2f","Type":"ContainerStarted","Data":"b9262af747ca6a881f59291330a3164647eba2823770006addf2e8ec53094d21"} Mar 18 08:13:56 crc kubenswrapper[4917]: I0318 08:13:56.333285 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-2baa-account-create-update-ctf24" podStartSLOduration=1.333265655 podStartE2EDuration="1.333265655s" podCreationTimestamp="2026-03-18 08:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:13:56.328562931 +0000 UTC m=+5221.269717665" watchObservedRunningTime="2026-03-18 08:13:56.333265655 +0000 UTC m=+5221.274420379" Mar 18 08:13:56 crc kubenswrapper[4917]: I0318 08:13:56.348837 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-d5rpk" podStartSLOduration=1.348817962 podStartE2EDuration="1.348817962s" podCreationTimestamp="2026-03-18 08:13:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:13:56.347299965 +0000 UTC m=+5221.288454689" watchObservedRunningTime="2026-03-18 08:13:56.348817962 +0000 UTC m=+5221.289972686" Mar 18 08:13:57 crc kubenswrapper[4917]: I0318 08:13:57.324823 4917 
generic.go:334] "Generic (PLEG): container finished" podID="41d247d8-d97f-44ba-b769-5f13a5973355" containerID="e3fd0f4211d8d8b2af47925b0efb65b133f4de0ea28bc94e5240b1d179bfe9f4" exitCode=0 Mar 18 08:13:57 crc kubenswrapper[4917]: I0318 08:13:57.324904 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2baa-account-create-update-ctf24" event={"ID":"41d247d8-d97f-44ba-b769-5f13a5973355","Type":"ContainerDied","Data":"e3fd0f4211d8d8b2af47925b0efb65b133f4de0ea28bc94e5240b1d179bfe9f4"} Mar 18 08:13:57 crc kubenswrapper[4917]: I0318 08:13:57.327641 4917 generic.go:334] "Generic (PLEG): container finished" podID="550d2002-958b-45b5-8618-ab8065124e2f" containerID="622916c3b4ad66a4938df690f52d88a166aa619fa4c03fe7bef870621bbcb30d" exitCode=0 Mar 18 08:13:57 crc kubenswrapper[4917]: I0318 08:13:57.327698 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-d5rpk" event={"ID":"550d2002-958b-45b5-8618-ab8065124e2f","Type":"ContainerDied","Data":"622916c3b4ad66a4938df690f52d88a166aa619fa4c03fe7bef870621bbcb30d"} Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.771224 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-d5rpk" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.779599 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-2baa-account-create-update-ctf24" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.858954 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgqxx\" (UniqueName: \"kubernetes.io/projected/550d2002-958b-45b5-8618-ab8065124e2f-kube-api-access-bgqxx\") pod \"550d2002-958b-45b5-8618-ab8065124e2f\" (UID: \"550d2002-958b-45b5-8618-ab8065124e2f\") " Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.859212 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5wh6\" (UniqueName: \"kubernetes.io/projected/41d247d8-d97f-44ba-b769-5f13a5973355-kube-api-access-k5wh6\") pod \"41d247d8-d97f-44ba-b769-5f13a5973355\" (UID: \"41d247d8-d97f-44ba-b769-5f13a5973355\") " Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.859296 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d247d8-d97f-44ba-b769-5f13a5973355-operator-scripts\") pod \"41d247d8-d97f-44ba-b769-5f13a5973355\" (UID: \"41d247d8-d97f-44ba-b769-5f13a5973355\") " Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.859346 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/550d2002-958b-45b5-8618-ab8065124e2f-operator-scripts\") pod \"550d2002-958b-45b5-8618-ab8065124e2f\" (UID: \"550d2002-958b-45b5-8618-ab8065124e2f\") " Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.860034 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d247d8-d97f-44ba-b769-5f13a5973355-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "41d247d8-d97f-44ba-b769-5f13a5973355" (UID: "41d247d8-d97f-44ba-b769-5f13a5973355"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.860110 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/550d2002-958b-45b5-8618-ab8065124e2f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "550d2002-958b-45b5-8618-ab8065124e2f" (UID: "550d2002-958b-45b5-8618-ab8065124e2f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.864893 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550d2002-958b-45b5-8618-ab8065124e2f-kube-api-access-bgqxx" (OuterVolumeSpecName: "kube-api-access-bgqxx") pod "550d2002-958b-45b5-8618-ab8065124e2f" (UID: "550d2002-958b-45b5-8618-ab8065124e2f"). InnerVolumeSpecName "kube-api-access-bgqxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.865044 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d247d8-d97f-44ba-b769-5f13a5973355-kube-api-access-k5wh6" (OuterVolumeSpecName: "kube-api-access-k5wh6") pod "41d247d8-d97f-44ba-b769-5f13a5973355" (UID: "41d247d8-d97f-44ba-b769-5f13a5973355"). InnerVolumeSpecName "kube-api-access-k5wh6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.960383 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5wh6\" (UniqueName: \"kubernetes.io/projected/41d247d8-d97f-44ba-b769-5f13a5973355-kube-api-access-k5wh6\") on node \"crc\" DevicePath \"\"" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.960415 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/41d247d8-d97f-44ba-b769-5f13a5973355-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.960428 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/550d2002-958b-45b5-8618-ab8065124e2f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:13:58 crc kubenswrapper[4917]: I0318 08:13:58.960440 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgqxx\" (UniqueName: \"kubernetes.io/projected/550d2002-958b-45b5-8618-ab8065124e2f-kube-api-access-bgqxx\") on node \"crc\" DevicePath \"\"" Mar 18 08:13:59 crc kubenswrapper[4917]: I0318 08:13:59.356958 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-d5rpk" event={"ID":"550d2002-958b-45b5-8618-ab8065124e2f","Type":"ContainerDied","Data":"b9262af747ca6a881f59291330a3164647eba2823770006addf2e8ec53094d21"} Mar 18 08:13:59 crc kubenswrapper[4917]: I0318 08:13:59.356992 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-d5rpk"
Mar 18 08:13:59 crc kubenswrapper[4917]: I0318 08:13:59.357003 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9262af747ca6a881f59291330a3164647eba2823770006addf2e8ec53094d21"
Mar 18 08:13:59 crc kubenswrapper[4917]: I0318 08:13:59.365325 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2baa-account-create-update-ctf24" event={"ID":"41d247d8-d97f-44ba-b769-5f13a5973355","Type":"ContainerDied","Data":"c9cfdfcd8557e223024af8e1f61252f6d03cff9107e59e8ebd4cedd9beb24bf4"}
Mar 18 08:13:59 crc kubenswrapper[4917]: I0318 08:13:59.365398 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9cfdfcd8557e223024af8e1f61252f6d03cff9107e59e8ebd4cedd9beb24bf4"
Mar 18 08:13:59 crc kubenswrapper[4917]: I0318 08:13:59.365493 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2baa-account-create-update-ctf24"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.145459 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563694-dc4zx"]
Mar 18 08:14:00 crc kubenswrapper[4917]: E0318 08:14:00.146426 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550d2002-958b-45b5-8618-ab8065124e2f" containerName="mariadb-database-create"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.146449 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="550d2002-958b-45b5-8618-ab8065124e2f" containerName="mariadb-database-create"
Mar 18 08:14:00 crc kubenswrapper[4917]: E0318 08:14:00.146474 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41d247d8-d97f-44ba-b769-5f13a5973355" containerName="mariadb-account-create-update"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.146487 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d247d8-d97f-44ba-b769-5f13a5973355" containerName="mariadb-account-create-update"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.146815 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="41d247d8-d97f-44ba-b769-5f13a5973355" containerName="mariadb-account-create-update"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.146839 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="550d2002-958b-45b5-8618-ab8065124e2f" containerName="mariadb-database-create"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.147727 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563694-dc4zx"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.151974 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.152039 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.152940 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.158181 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563694-dc4zx"]
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.290891 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tmp5\" (UniqueName: \"kubernetes.io/projected/5f759add-12d3-4da3-928e-f0effea94b71-kube-api-access-5tmp5\") pod \"auto-csr-approver-29563694-dc4zx\" (UID: \"5f759add-12d3-4da3-928e-f0effea94b71\") " pod="openshift-infra/auto-csr-approver-29563694-dc4zx"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.393065 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tmp5\" (UniqueName: \"kubernetes.io/projected/5f759add-12d3-4da3-928e-f0effea94b71-kube-api-access-5tmp5\") pod \"auto-csr-approver-29563694-dc4zx\" (UID: \"5f759add-12d3-4da3-928e-f0effea94b71\") " pod="openshift-infra/auto-csr-approver-29563694-dc4zx"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.416970 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tmp5\" (UniqueName: \"kubernetes.io/projected/5f759add-12d3-4da3-928e-f0effea94b71-kube-api-access-5tmp5\") pod \"auto-csr-approver-29563694-dc4zx\" (UID: \"5f759add-12d3-4da3-928e-f0effea94b71\") " pod="openshift-infra/auto-csr-approver-29563694-dc4zx"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.475433 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lxc4g"]
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.476420 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.479040 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jswlf"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.479619 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.485864 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563694-dc4zx"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.489920 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lxc4g"]
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.595252 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-db-sync-config-data\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.595348 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zl6\" (UniqueName: \"kubernetes.io/projected/107f1f76-b32c-4371-aca8-e5253102d6cb-kube-api-access-s5zl6\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.595480 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-combined-ca-bundle\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.697492 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-db-sync-config-data\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.698407 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zl6\" (UniqueName: \"kubernetes.io/projected/107f1f76-b32c-4371-aca8-e5253102d6cb-kube-api-access-s5zl6\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.698621 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-combined-ca-bundle\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.706076 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-db-sync-config-data\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.706963 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-combined-ca-bundle\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.716797 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zl6\" (UniqueName: \"kubernetes.io/projected/107f1f76-b32c-4371-aca8-e5253102d6cb-kube-api-access-s5zl6\") pod \"barbican-db-sync-lxc4g\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") " pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.909818 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:00 crc kubenswrapper[4917]: I0318 08:14:00.970728 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563694-dc4zx"]
Mar 18 08:14:01 crc kubenswrapper[4917]: I0318 08:14:01.386731 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563694-dc4zx" event={"ID":"5f759add-12d3-4da3-928e-f0effea94b71","Type":"ContainerStarted","Data":"778a595e54b1c6c0a6bc65f4e25c4b4a63b8fde55ee803dd1793eaaaa2de9793"}
Mar 18 08:14:01 crc kubenswrapper[4917]: I0318 08:14:01.429784 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lxc4g"]
Mar 18 08:14:01 crc kubenswrapper[4917]: W0318 08:14:01.439873 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod107f1f76_b32c_4371_aca8_e5253102d6cb.slice/crio-8b70795feaf160d8eab7b26fc8e45f1c14ecf6ef43fc3962ffe991ecba788b0c WatchSource:0}: Error finding container 8b70795feaf160d8eab7b26fc8e45f1c14ecf6ef43fc3962ffe991ecba788b0c: Status 404 returned error can't find the container with id 8b70795feaf160d8eab7b26fc8e45f1c14ecf6ef43fc3962ffe991ecba788b0c
Mar 18 08:14:02 crc kubenswrapper[4917]: I0318 08:14:02.400705 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lxc4g" event={"ID":"107f1f76-b32c-4371-aca8-e5253102d6cb","Type":"ContainerStarted","Data":"8b70795feaf160d8eab7b26fc8e45f1c14ecf6ef43fc3962ffe991ecba788b0c"}
Mar 18 08:14:02 crc kubenswrapper[4917]: I0318 08:14:02.402749 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563694-dc4zx" event={"ID":"5f759add-12d3-4da3-928e-f0effea94b71","Type":"ContainerStarted","Data":"54ef452b1e86be2b2a8d3dadd0902c59e0ec75b1480d7193a1ff2bd6338ebf08"}
Mar 18 08:14:02 crc kubenswrapper[4917]: I0318 08:14:02.419940 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563694-dc4zx" podStartSLOduration=1.3611070889999999 podStartE2EDuration="2.419923116s" podCreationTimestamp="2026-03-18 08:14:00 +0000 UTC" firstStartedPulling="2026-03-18 08:14:00.988102927 +0000 UTC m=+5225.929257651" lastFinishedPulling="2026-03-18 08:14:02.046918944 +0000 UTC m=+5226.988073678" observedRunningTime="2026-03-18 08:14:02.418102402 +0000 UTC m=+5227.359257116" watchObservedRunningTime="2026-03-18 08:14:02.419923116 +0000 UTC m=+5227.361077830"
Mar 18 08:14:03 crc kubenswrapper[4917]: I0318 08:14:03.412461 4917 generic.go:334] "Generic (PLEG): container finished" podID="5f759add-12d3-4da3-928e-f0effea94b71" containerID="54ef452b1e86be2b2a8d3dadd0902c59e0ec75b1480d7193a1ff2bd6338ebf08" exitCode=0
Mar 18 08:14:03 crc kubenswrapper[4917]: I0318 08:14:03.412549 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563694-dc4zx" event={"ID":"5f759add-12d3-4da3-928e-f0effea94b71","Type":"ContainerDied","Data":"54ef452b1e86be2b2a8d3dadd0902c59e0ec75b1480d7193a1ff2bd6338ebf08"}
Mar 18 08:14:05 crc kubenswrapper[4917]: I0318 08:14:05.970291 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563694-dc4zx"
Mar 18 08:14:06 crc kubenswrapper[4917]: I0318 08:14:06.123495 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tmp5\" (UniqueName: \"kubernetes.io/projected/5f759add-12d3-4da3-928e-f0effea94b71-kube-api-access-5tmp5\") pod \"5f759add-12d3-4da3-928e-f0effea94b71\" (UID: \"5f759add-12d3-4da3-928e-f0effea94b71\") "
Mar 18 08:14:06 crc kubenswrapper[4917]: I0318 08:14:06.135771 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f759add-12d3-4da3-928e-f0effea94b71-kube-api-access-5tmp5" (OuterVolumeSpecName: "kube-api-access-5tmp5") pod "5f759add-12d3-4da3-928e-f0effea94b71" (UID: "5f759add-12d3-4da3-928e-f0effea94b71"). InnerVolumeSpecName "kube-api-access-5tmp5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:14:06 crc kubenswrapper[4917]: I0318 08:14:06.235480 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tmp5\" (UniqueName: \"kubernetes.io/projected/5f759add-12d3-4da3-928e-f0effea94b71-kube-api-access-5tmp5\") on node \"crc\" DevicePath \"\""
Mar 18 08:14:06 crc kubenswrapper[4917]: I0318 08:14:06.449561 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563694-dc4zx" event={"ID":"5f759add-12d3-4da3-928e-f0effea94b71","Type":"ContainerDied","Data":"778a595e54b1c6c0a6bc65f4e25c4b4a63b8fde55ee803dd1793eaaaa2de9793"}
Mar 18 08:14:06 crc kubenswrapper[4917]: I0318 08:14:06.449620 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="778a595e54b1c6c0a6bc65f4e25c4b4a63b8fde55ee803dd1793eaaaa2de9793"
Mar 18 08:14:06 crc kubenswrapper[4917]: I0318 08:14:06.449626 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563694-dc4zx"
Mar 18 08:14:07 crc kubenswrapper[4917]: I0318 08:14:07.045685 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563688-526t7"]
Mar 18 08:14:07 crc kubenswrapper[4917]: I0318 08:14:07.072978 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563688-526t7"]
Mar 18 08:14:07 crc kubenswrapper[4917]: I0318 08:14:07.465843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lxc4g" event={"ID":"107f1f76-b32c-4371-aca8-e5253102d6cb","Type":"ContainerStarted","Data":"e217587353e71ace2a478044a1fb68bc906b885892207047a0d099d4896d2582"}
Mar 18 08:14:07 crc kubenswrapper[4917]: I0318 08:14:07.503633 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lxc4g" podStartSLOduration=2.389012057 podStartE2EDuration="7.503570453s" podCreationTimestamp="2026-03-18 08:14:00 +0000 UTC" firstStartedPulling="2026-03-18 08:14:01.443210669 +0000 UTC m=+5226.384365403" lastFinishedPulling="2026-03-18 08:14:06.557769075 +0000 UTC m=+5231.498923799" observedRunningTime="2026-03-18 08:14:07.490540467 +0000 UTC m=+5232.431695221" watchObservedRunningTime="2026-03-18 08:14:07.503570453 +0000 UTC m=+5232.444725207"
Mar 18 08:14:07 crc kubenswrapper[4917]: I0318 08:14:07.794754 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="730d4203-a2f5-4f9d-82aa-7d06e92c6520" path="/var/lib/kubelet/pods/730d4203-a2f5-4f9d-82aa-7d06e92c6520/volumes"
Mar 18 08:14:08 crc kubenswrapper[4917]: I0318 08:14:08.477403 4917 generic.go:334] "Generic (PLEG): container finished" podID="107f1f76-b32c-4371-aca8-e5253102d6cb" containerID="e217587353e71ace2a478044a1fb68bc906b885892207047a0d099d4896d2582" exitCode=0
Mar 18 08:14:08 crc kubenswrapper[4917]: I0318 08:14:08.477855 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lxc4g" event={"ID":"107f1f76-b32c-4371-aca8-e5253102d6cb","Type":"ContainerDied","Data":"e217587353e71ace2a478044a1fb68bc906b885892207047a0d099d4896d2582"}
Mar 18 08:14:09 crc kubenswrapper[4917]: I0318 08:14:09.872540 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.007324 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-db-sync-config-data\") pod \"107f1f76-b32c-4371-aca8-e5253102d6cb\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") "
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.007391 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5zl6\" (UniqueName: \"kubernetes.io/projected/107f1f76-b32c-4371-aca8-e5253102d6cb-kube-api-access-s5zl6\") pod \"107f1f76-b32c-4371-aca8-e5253102d6cb\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") "
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.007502 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-combined-ca-bundle\") pod \"107f1f76-b32c-4371-aca8-e5253102d6cb\" (UID: \"107f1f76-b32c-4371-aca8-e5253102d6cb\") "
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.013945 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107f1f76-b32c-4371-aca8-e5253102d6cb-kube-api-access-s5zl6" (OuterVolumeSpecName: "kube-api-access-s5zl6") pod "107f1f76-b32c-4371-aca8-e5253102d6cb" (UID: "107f1f76-b32c-4371-aca8-e5253102d6cb"). InnerVolumeSpecName "kube-api-access-s5zl6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.014975 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "107f1f76-b32c-4371-aca8-e5253102d6cb" (UID: "107f1f76-b32c-4371-aca8-e5253102d6cb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.030773 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "107f1f76-b32c-4371-aca8-e5253102d6cb" (UID: "107f1f76-b32c-4371-aca8-e5253102d6cb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.109722 4917 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.109758 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5zl6\" (UniqueName: \"kubernetes.io/projected/107f1f76-b32c-4371-aca8-e5253102d6cb-kube-api-access-s5zl6\") on node \"crc\" DevicePath \"\""
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.109772 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107f1f76-b32c-4371-aca8-e5253102d6cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.500489 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lxc4g" event={"ID":"107f1f76-b32c-4371-aca8-e5253102d6cb","Type":"ContainerDied","Data":"8b70795feaf160d8eab7b26fc8e45f1c14ecf6ef43fc3962ffe991ecba788b0c"}
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.500537 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b70795feaf160d8eab7b26fc8e45f1c14ecf6ef43fc3962ffe991ecba788b0c"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.500550 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lxc4g"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.756239 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-d7b5bfd65-9fl8g"]
Mar 18 08:14:10 crc kubenswrapper[4917]: E0318 08:14:10.759636 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f759add-12d3-4da3-928e-f0effea94b71" containerName="oc"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.759676 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f759add-12d3-4da3-928e-f0effea94b71" containerName="oc"
Mar 18 08:14:10 crc kubenswrapper[4917]: E0318 08:14:10.759728 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107f1f76-b32c-4371-aca8-e5253102d6cb" containerName="barbican-db-sync"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.759738 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="107f1f76-b32c-4371-aca8-e5253102d6cb" containerName="barbican-db-sync"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.760050 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f759add-12d3-4da3-928e-f0effea94b71" containerName="oc"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.760087 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="107f1f76-b32c-4371-aca8-e5253102d6cb" containerName="barbican-db-sync"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.761101 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.768131 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d7b5bfd65-9fl8g"]
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.782430 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-jswlf"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.782468 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.782996 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.863394 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-547d8cd7db-x2gql"]
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.865018 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.876225 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.879253 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-547d8cd7db-x2gql"]
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.897616 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-798fdcd749-v7b8s"]
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.899541 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.923815 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-logs\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.923926 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-config-data-custom\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.923949 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-config-data\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.924016 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-combined-ca-bundle\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.924058 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fnc\" (UniqueName: \"kubernetes.io/projected/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-kube-api-access-27fnc\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.924324 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-798fdcd749-v7b8s"]
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.980930 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-98b7c8d86-766p9"]
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.988700 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-98b7c8d86-766p9"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.990820 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Mar 18 08:14:10 crc kubenswrapper[4917]: I0318 08:14:10.994304 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-98b7c8d86-766p9"]
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026552 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-dns-svc\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026619 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-config-data-custom\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026655 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-config-data-custom\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026677 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9xrm\" (UniqueName: \"kubernetes.io/projected/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-kube-api-access-j9xrm\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026705 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-config-data\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026737 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-config\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026758 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-sb\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026804 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-logs\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026837 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-combined-ca-bundle\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026851 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-combined-ca-bundle\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026874 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbccv\" (UniqueName: \"kubernetes.io/projected/12e567d2-c790-47f7-86b6-11c50caedc2a-kube-api-access-dbccv\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026895 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27fnc\" (UniqueName: \"kubernetes.io/projected/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-kube-api-access-27fnc\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026911 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-nb\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026941 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-logs\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.026961 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-config-data\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.029852 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-logs\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.035008 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-config-data-custom\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.035468 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-combined-ca-bundle\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.037483 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-config-data\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.056105 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27fnc\" (UniqueName: \"kubernetes.io/projected/c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917-kube-api-access-27fnc\") pod \"barbican-worker-d7b5bfd65-9fl8g\" (UID: \"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917\") " pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.122511 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-d7b5bfd65-9fl8g"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128484 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9xrm\" (UniqueName: \"kubernetes.io/projected/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-kube-api-access-j9xrm\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128522 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128555 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-config\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128573 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-sb\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128623 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b599f978-cd6b-402a-836c-35e95943a9bb-logs\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128644 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kg8m\" (UniqueName: \"kubernetes.io/projected/b599f978-cd6b-402a-836c-35e95943a9bb-kube-api-access-8kg8m\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128664 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-logs\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128692 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-combined-ca-bundle\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128726 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data-custom\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128747 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbccv\" (UniqueName: \"kubernetes.io/projected/12e567d2-c790-47f7-86b6-11c50caedc2a-kube-api-access-dbccv\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128768 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-nb\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128806 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-config-data\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128820 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-combined-ca-bundle\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128858 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-dns-svc\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s"
Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.128894 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName:
\"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-config-data-custom\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.129944 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-sb\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.130370 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-nb\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.130476 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-config\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.130780 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-logs\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.131869 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-dns-svc\") 
pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.133008 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-config-data-custom\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.138393 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-combined-ca-bundle\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.139627 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-config-data\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.155319 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9xrm\" (UniqueName: \"kubernetes.io/projected/2b9513fe-7b49-4571-b3fa-71c00adf8dd6-kube-api-access-j9xrm\") pod \"barbican-keystone-listener-547d8cd7db-x2gql\" (UID: \"2b9513fe-7b49-4571-b3fa-71c00adf8dd6\") " pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.156632 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbccv\" (UniqueName: 
\"kubernetes.io/projected/12e567d2-c790-47f7-86b6-11c50caedc2a-kube-api-access-dbccv\") pod \"dnsmasq-dns-798fdcd749-v7b8s\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.193094 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.229993 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.230075 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b599f978-cd6b-402a-836c-35e95943a9bb-logs\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.230104 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kg8m\" (UniqueName: \"kubernetes.io/projected/b599f978-cd6b-402a-836c-35e95943a9bb-kube-api-access-8kg8m\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.230140 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data-custom\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc 
kubenswrapper[4917]: I0318 08:14:11.230194 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-combined-ca-bundle\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.231331 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b599f978-cd6b-402a-836c-35e95943a9bb-logs\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.232921 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.240104 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.241297 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-combined-ca-bundle\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.243357 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data-custom\") pod \"barbican-api-98b7c8d86-766p9\" (UID: 
\"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.308272 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kg8m\" (UniqueName: \"kubernetes.io/projected/b599f978-cd6b-402a-836c-35e95943a9bb-kube-api-access-8kg8m\") pod \"barbican-api-98b7c8d86-766p9\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.606500 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.681918 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-d7b5bfd65-9fl8g"] Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.832887 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-798fdcd749-v7b8s"] Mar 18 08:14:11 crc kubenswrapper[4917]: I0318 08:14:11.865554 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-547d8cd7db-x2gql"] Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.044641 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-98b7c8d86-766p9"] Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.531378 4917 generic.go:334] "Generic (PLEG): container finished" podID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerID="be99f3cef75880fa8331d548ecb4a127ac6706aa57058a7720f3b031bd93c8dc" exitCode=0 Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.531703 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" event={"ID":"12e567d2-c790-47f7-86b6-11c50caedc2a","Type":"ContainerDied","Data":"be99f3cef75880fa8331d548ecb4a127ac6706aa57058a7720f3b031bd93c8dc"} Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.531737 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" event={"ID":"12e567d2-c790-47f7-86b6-11c50caedc2a","Type":"ContainerStarted","Data":"d70204ce8a412e968dc33ceda96ae6350dc88e811f914ae232c9a8e5353527a0"} Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.537546 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-98b7c8d86-766p9" event={"ID":"b599f978-cd6b-402a-836c-35e95943a9bb","Type":"ContainerStarted","Data":"d6fc648b719c44270340d862d106304600e9ef83b0576fd23f8e71aacb6afd37"} Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.537624 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-98b7c8d86-766p9" event={"ID":"b599f978-cd6b-402a-836c-35e95943a9bb","Type":"ContainerStarted","Data":"5f4d1c36b8c91e533cc35bedd2f8e1cc4f051a1e4521676dbb6e060b27e83841"} Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.537646 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-98b7c8d86-766p9" event={"ID":"b599f978-cd6b-402a-836c-35e95943a9bb","Type":"ContainerStarted","Data":"165c74a8f7a968c155119e7902a7eaef372fc90bcbcc1778286ba417ce2f7dd9"} Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.537685 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.537746 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.539198 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d7b5bfd65-9fl8g" event={"ID":"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917","Type":"ContainerStarted","Data":"7c3673bb7f38369172fb18864063275e2d81251ef85e7728e10c70e4adc6b5e4"} Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.540714 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" event={"ID":"2b9513fe-7b49-4571-b3fa-71c00adf8dd6","Type":"ContainerStarted","Data":"0fda8224f75f2af7bb3771d31cb5cafa3754ba399dc3a7f24b6c620ba58313e8"} Mar 18 08:14:12 crc kubenswrapper[4917]: I0318 08:14:12.567881 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-98b7c8d86-766p9" podStartSLOduration=2.5678649 podStartE2EDuration="2.5678649s" podCreationTimestamp="2026-03-18 08:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:14:12.566992149 +0000 UTC m=+5237.508146873" watchObservedRunningTime="2026-03-18 08:14:12.5678649 +0000 UTC m=+5237.509019614" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.450346 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-84574ccfdb-jwjkm"] Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.452099 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.456191 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.456334 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.473190 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84574ccfdb-jwjkm"] Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.574043 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-internal-tls-certs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.574112 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-config-data-custom\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.574171 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-public-tls-certs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.574199 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dtkr9\" (UniqueName: \"kubernetes.io/projected/b26bf0a9-c189-4146-aee9-2d2513251872-kube-api-access-dtkr9\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.574225 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26bf0a9-c189-4146-aee9-2d2513251872-logs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.574281 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-combined-ca-bundle\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.574303 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-config-data\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.676226 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-public-tls-certs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.676286 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-dtkr9\" (UniqueName: \"kubernetes.io/projected/b26bf0a9-c189-4146-aee9-2d2513251872-kube-api-access-dtkr9\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.676323 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26bf0a9-c189-4146-aee9-2d2513251872-logs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.676361 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-combined-ca-bundle\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.676396 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-config-data\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.676545 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-internal-tls-certs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.676617 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-config-data-custom\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.677936 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b26bf0a9-c189-4146-aee9-2d2513251872-logs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.682351 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-combined-ca-bundle\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.682607 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-internal-tls-certs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.683041 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-public-tls-certs\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.683439 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-config-data\") pod 
\"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.685150 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b26bf0a9-c189-4146-aee9-2d2513251872-config-data-custom\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.696338 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtkr9\" (UniqueName: \"kubernetes.io/projected/b26bf0a9-c189-4146-aee9-2d2513251872-kube-api-access-dtkr9\") pod \"barbican-api-84574ccfdb-jwjkm\" (UID: \"b26bf0a9-c189-4146-aee9-2d2513251872\") " pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:13 crc kubenswrapper[4917]: I0318 08:14:13.769768 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.214561 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-84574ccfdb-jwjkm"] Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.559197 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d7b5bfd65-9fl8g" event={"ID":"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917","Type":"ContainerStarted","Data":"668f5d4b65f237e91fac0366b1952919c75c7b9c9a20b16e0d78e865b200082c"} Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.559513 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-d7b5bfd65-9fl8g" event={"ID":"c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917","Type":"ContainerStarted","Data":"3a9193f6ef737731ea322541864336edf20cbae2c4b9823064a4d4aa380ec0d0"} Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.561784 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84574ccfdb-jwjkm" event={"ID":"b26bf0a9-c189-4146-aee9-2d2513251872","Type":"ContainerStarted","Data":"9a811d774ea38f5331af6c4e95c6b2e6deb852323a729d7db9b82d5262bd1e10"} Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.561894 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84574ccfdb-jwjkm" event={"ID":"b26bf0a9-c189-4146-aee9-2d2513251872","Type":"ContainerStarted","Data":"13a8518d68b48d40af396b4cc039874685b18c95f59ab5bf957d75a5f372442d"} Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.570090 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" event={"ID":"2b9513fe-7b49-4571-b3fa-71c00adf8dd6","Type":"ContainerStarted","Data":"f44d52a15e8fed9e060f04d1b3de2a5ef8f7546f8fc47bd2c8d1b41ab6bff1d2"} Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.570135 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" event={"ID":"2b9513fe-7b49-4571-b3fa-71c00adf8dd6","Type":"ContainerStarted","Data":"b8769a8ad1536b3d936679405a4adaefa3c07d57d2ef9223b6b63b749b89eb5a"} Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.573111 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" event={"ID":"12e567d2-c790-47f7-86b6-11c50caedc2a","Type":"ContainerStarted","Data":"22025d5050cdbe90cde38bef0fd2d94f4b8de52c90b73f135a59486261cd389a"} Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.573305 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.582563 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-d7b5bfd65-9fl8g" podStartSLOduration=2.663754783 podStartE2EDuration="4.582546009s" podCreationTimestamp="2026-03-18 08:14:10 +0000 UTC" firstStartedPulling="2026-03-18 08:14:11.69171644 +0000 UTC m=+5236.632871154" lastFinishedPulling="2026-03-18 08:14:13.610507656 +0000 UTC m=+5238.551662380" observedRunningTime="2026-03-18 08:14:14.579956776 +0000 UTC m=+5239.521111510" watchObservedRunningTime="2026-03-18 08:14:14.582546009 +0000 UTC m=+5239.523700733" Mar 18 08:14:14 crc kubenswrapper[4917]: I0318 08:14:14.614694 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-547d8cd7db-x2gql" podStartSLOduration=2.916299776 podStartE2EDuration="4.614673388s" podCreationTimestamp="2026-03-18 08:14:10 +0000 UTC" firstStartedPulling="2026-03-18 08:14:11.91319847 +0000 UTC m=+5236.854353184" lastFinishedPulling="2026-03-18 08:14:13.611572072 +0000 UTC m=+5238.552726796" observedRunningTime="2026-03-18 08:14:14.612906156 +0000 UTC m=+5239.554060890" watchObservedRunningTime="2026-03-18 08:14:14.614673388 +0000 UTC m=+5239.555828112" Mar 18 08:14:14 crc 
kubenswrapper[4917]: I0318 08:14:14.641460 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" podStartSLOduration=4.641436617 podStartE2EDuration="4.641436617s" podCreationTimestamp="2026-03-18 08:14:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:14:14.633005823 +0000 UTC m=+5239.574160557" watchObservedRunningTime="2026-03-18 08:14:14.641436617 +0000 UTC m=+5239.582591351" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.298986 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wxwpz"] Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.300818 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.319517 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxwpz"] Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.404981 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-utilities\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.405033 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2blp\" (UniqueName: \"kubernetes.io/projected/b6295b3b-3f74-448f-86de-ef6ffee5e111-kube-api-access-l2blp\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.405375 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-catalog-content\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.506520 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-catalog-content\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.506618 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-utilities\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.506641 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2blp\" (UniqueName: \"kubernetes.io/projected/b6295b3b-3f74-448f-86de-ef6ffee5e111-kube-api-access-l2blp\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.507477 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-catalog-content\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.507725 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-utilities\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.527419 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2blp\" (UniqueName: \"kubernetes.io/projected/b6295b3b-3f74-448f-86de-ef6ffee5e111-kube-api-access-l2blp\") pod \"community-operators-wxwpz\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.582026 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-84574ccfdb-jwjkm" event={"ID":"b26bf0a9-c189-4146-aee9-2d2513251872","Type":"ContainerStarted","Data":"2b35130984a5dfd2beb1f567c63858683764bc2cb2396a72a7a4f62dd094a2a1"} Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.582970 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.583006 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:15 crc kubenswrapper[4917]: I0318 08:14:15.607650 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-84574ccfdb-jwjkm" podStartSLOduration=2.6076339490000002 podStartE2EDuration="2.607633949s" podCreationTimestamp="2026-03-18 08:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:14:15.602418012 +0000 UTC m=+5240.543572736" watchObservedRunningTime="2026-03-18 08:14:15.607633949 +0000 UTC m=+5240.548788663" Mar 18 08:14:15 crc 
kubenswrapper[4917]: I0318 08:14:15.631042 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:16 crc kubenswrapper[4917]: I0318 08:14:16.191075 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wxwpz"] Mar 18 08:14:16 crc kubenswrapper[4917]: I0318 08:14:16.594184 4917 generic.go:334] "Generic (PLEG): container finished" podID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerID="d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01" exitCode=0 Mar 18 08:14:16 crc kubenswrapper[4917]: I0318 08:14:16.594263 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxwpz" event={"ID":"b6295b3b-3f74-448f-86de-ef6ffee5e111","Type":"ContainerDied","Data":"d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01"} Mar 18 08:14:16 crc kubenswrapper[4917]: I0318 08:14:16.594571 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxwpz" event={"ID":"b6295b3b-3f74-448f-86de-ef6ffee5e111","Type":"ContainerStarted","Data":"d1a2b16ff7cdd33b8b2589aa63fcf00e353b63ef21b15352d67ff7ff6575dc86"} Mar 18 08:14:17 crc kubenswrapper[4917]: I0318 08:14:17.608452 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxwpz" event={"ID":"b6295b3b-3f74-448f-86de-ef6ffee5e111","Type":"ContainerStarted","Data":"545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95"} Mar 18 08:14:18 crc kubenswrapper[4917]: I0318 08:14:18.618269 4917 generic.go:334] "Generic (PLEG): container finished" podID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerID="545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95" exitCode=0 Mar 18 08:14:18 crc kubenswrapper[4917]: I0318 08:14:18.618624 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxwpz" 
event={"ID":"b6295b3b-3f74-448f-86de-ef6ffee5e111","Type":"ContainerDied","Data":"545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95"} Mar 18 08:14:19 crc kubenswrapper[4917]: I0318 08:14:19.630197 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxwpz" event={"ID":"b6295b3b-3f74-448f-86de-ef6ffee5e111","Type":"ContainerStarted","Data":"014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34"} Mar 18 08:14:19 crc kubenswrapper[4917]: I0318 08:14:19.655538 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wxwpz" podStartSLOduration=2.198110205 podStartE2EDuration="4.655516957s" podCreationTimestamp="2026-03-18 08:14:15 +0000 UTC" firstStartedPulling="2026-03-18 08:14:16.595576149 +0000 UTC m=+5241.536730863" lastFinishedPulling="2026-03-18 08:14:19.052982901 +0000 UTC m=+5243.994137615" observedRunningTime="2026-03-18 08:14:19.651130741 +0000 UTC m=+5244.592285515" watchObservedRunningTime="2026-03-18 08:14:19.655516957 +0000 UTC m=+5244.596671671" Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.235913 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.295385 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c4cb5b59c-nxk29"] Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.295691 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" podUID="ef5de187-2794-4a99-9be7-25c48156dd47" containerName="dnsmasq-dns" containerID="cri-o://61b82be59dd54b705e13c2a27173582e46912084dc6e45ad1dd0473ee8ef3b31" gracePeriod=10 Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.657136 4917 generic.go:334] "Generic (PLEG): container finished" podID="ef5de187-2794-4a99-9be7-25c48156dd47" 
containerID="61b82be59dd54b705e13c2a27173582e46912084dc6e45ad1dd0473ee8ef3b31" exitCode=0 Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.657222 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" event={"ID":"ef5de187-2794-4a99-9be7-25c48156dd47","Type":"ContainerDied","Data":"61b82be59dd54b705e13c2a27173582e46912084dc6e45ad1dd0473ee8ef3b31"} Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.797420 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.917766 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-config\") pod \"ef5de187-2794-4a99-9be7-25c48156dd47\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.917858 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-sb\") pod \"ef5de187-2794-4a99-9be7-25c48156dd47\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.917915 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-dns-svc\") pod \"ef5de187-2794-4a99-9be7-25c48156dd47\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.917956 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kksf9\" (UniqueName: \"kubernetes.io/projected/ef5de187-2794-4a99-9be7-25c48156dd47-kube-api-access-kksf9\") pod \"ef5de187-2794-4a99-9be7-25c48156dd47\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " Mar 18 
08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.917990 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-nb\") pod \"ef5de187-2794-4a99-9be7-25c48156dd47\" (UID: \"ef5de187-2794-4a99-9be7-25c48156dd47\") " Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.923770 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef5de187-2794-4a99-9be7-25c48156dd47-kube-api-access-kksf9" (OuterVolumeSpecName: "kube-api-access-kksf9") pod "ef5de187-2794-4a99-9be7-25c48156dd47" (UID: "ef5de187-2794-4a99-9be7-25c48156dd47"). InnerVolumeSpecName "kube-api-access-kksf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:14:21 crc kubenswrapper[4917]: I0318 08:14:21.989137 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef5de187-2794-4a99-9be7-25c48156dd47" (UID: "ef5de187-2794-4a99-9be7-25c48156dd47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.007263 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-config" (OuterVolumeSpecName: "config") pod "ef5de187-2794-4a99-9be7-25c48156dd47" (UID: "ef5de187-2794-4a99-9be7-25c48156dd47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.019397 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef5de187-2794-4a99-9be7-25c48156dd47" (UID: "ef5de187-2794-4a99-9be7-25c48156dd47"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.019713 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.019744 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.019757 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.019765 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kksf9\" (UniqueName: \"kubernetes.io/projected/ef5de187-2794-4a99-9be7-25c48156dd47-kube-api-access-kksf9\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.020987 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ef5de187-2794-4a99-9be7-25c48156dd47" (UID: "ef5de187-2794-4a99-9be7-25c48156dd47"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.121101 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ef5de187-2794-4a99-9be7-25c48156dd47-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.665958 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" event={"ID":"ef5de187-2794-4a99-9be7-25c48156dd47","Type":"ContainerDied","Data":"1103fe7f801e2df43302de771a283dd99686656b6a85e0031b283e0cd8ae74f6"} Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.666055 4917 scope.go:117] "RemoveContainer" containerID="61b82be59dd54b705e13c2a27173582e46912084dc6e45ad1dd0473ee8ef3b31" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.666327 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c4cb5b59c-nxk29" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.718035 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c4cb5b59c-nxk29"] Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.721137 4917 scope.go:117] "RemoveContainer" containerID="17e682c7491c5a148c434f2bb460f62edc0a763324db1e909d8c5f2bc6b21a12" Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.730393 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c4cb5b59c-nxk29"] Mar 18 08:14:22 crc kubenswrapper[4917]: I0318 08:14:22.897090 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:23 crc kubenswrapper[4917]: I0318 08:14:23.096776 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:23 crc kubenswrapper[4917]: I0318 08:14:23.794415 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="ef5de187-2794-4a99-9be7-25c48156dd47" path="/var/lib/kubelet/pods/ef5de187-2794-4a99-9be7-25c48156dd47/volumes" Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.143574 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.181066 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-84574ccfdb-jwjkm" Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.260049 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-98b7c8d86-766p9"] Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.260263 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-98b7c8d86-766p9" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api-log" containerID="cri-o://5f4d1c36b8c91e533cc35bedd2f8e1cc4f051a1e4521676dbb6e060b27e83841" gracePeriod=30 Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.260624 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-98b7c8d86-766p9" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api" containerID="cri-o://d6fc648b719c44270340d862d106304600e9ef83b0576fd23f8e71aacb6afd37" gracePeriod=30 Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.631134 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.632988 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.696749 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:25 crc kubenswrapper[4917]: 
I0318 08:14:25.703413 4917 generic.go:334] "Generic (PLEG): container finished" podID="b599f978-cd6b-402a-836c-35e95943a9bb" containerID="5f4d1c36b8c91e533cc35bedd2f8e1cc4f051a1e4521676dbb6e060b27e83841" exitCode=143 Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.704142 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-98b7c8d86-766p9" event={"ID":"b599f978-cd6b-402a-836c-35e95943a9bb","Type":"ContainerDied","Data":"5f4d1c36b8c91e533cc35bedd2f8e1cc4f051a1e4521676dbb6e060b27e83841"} Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.766218 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:25 crc kubenswrapper[4917]: I0318 08:14:25.940235 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wxwpz"] Mar 18 08:14:27 crc kubenswrapper[4917]: I0318 08:14:27.722968 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wxwpz" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerName="registry-server" containerID="cri-o://014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34" gracePeriod=2 Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.247145 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.358617 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-catalog-content\") pod \"b6295b3b-3f74-448f-86de-ef6ffee5e111\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.358762 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2blp\" (UniqueName: \"kubernetes.io/projected/b6295b3b-3f74-448f-86de-ef6ffee5e111-kube-api-access-l2blp\") pod \"b6295b3b-3f74-448f-86de-ef6ffee5e111\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.358815 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-utilities\") pod \"b6295b3b-3f74-448f-86de-ef6ffee5e111\" (UID: \"b6295b3b-3f74-448f-86de-ef6ffee5e111\") " Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.360133 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-utilities" (OuterVolumeSpecName: "utilities") pod "b6295b3b-3f74-448f-86de-ef6ffee5e111" (UID: "b6295b3b-3f74-448f-86de-ef6ffee5e111"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.393754 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6295b3b-3f74-448f-86de-ef6ffee5e111-kube-api-access-l2blp" (OuterVolumeSpecName: "kube-api-access-l2blp") pod "b6295b3b-3f74-448f-86de-ef6ffee5e111" (UID: "b6295b3b-3f74-448f-86de-ef6ffee5e111"). InnerVolumeSpecName "kube-api-access-l2blp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.451200 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6295b3b-3f74-448f-86de-ef6ffee5e111" (UID: "b6295b3b-3f74-448f-86de-ef6ffee5e111"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.460744 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.460785 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2blp\" (UniqueName: \"kubernetes.io/projected/b6295b3b-3f74-448f-86de-ef6ffee5e111-kube-api-access-l2blp\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.460796 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6295b3b-3f74-448f-86de-ef6ffee5e111-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.477641 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-98b7c8d86-766p9" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.1.65:9311/healthcheck\": read tcp 10.217.0.2:34090->10.217.1.65:9311: read: connection reset by peer" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.477659 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-98b7c8d86-766p9" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api" probeResult="failure" output="Get 
\"http://10.217.1.65:9311/healthcheck\": read tcp 10.217.0.2:34098->10.217.1.65:9311: read: connection reset by peer" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.739702 4917 generic.go:334] "Generic (PLEG): container finished" podID="b599f978-cd6b-402a-836c-35e95943a9bb" containerID="d6fc648b719c44270340d862d106304600e9ef83b0576fd23f8e71aacb6afd37" exitCode=0 Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.739797 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-98b7c8d86-766p9" event={"ID":"b599f978-cd6b-402a-836c-35e95943a9bb","Type":"ContainerDied","Data":"d6fc648b719c44270340d862d106304600e9ef83b0576fd23f8e71aacb6afd37"} Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.799863 4917 generic.go:334] "Generic (PLEG): container finished" podID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerID="014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34" exitCode=0 Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.799909 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxwpz" event={"ID":"b6295b3b-3f74-448f-86de-ef6ffee5e111","Type":"ContainerDied","Data":"014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34"} Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.799948 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wxwpz" event={"ID":"b6295b3b-3f74-448f-86de-ef6ffee5e111","Type":"ContainerDied","Data":"d1a2b16ff7cdd33b8b2589aa63fcf00e353b63ef21b15352d67ff7ff6575dc86"} Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.799969 4917 scope.go:117] "RemoveContainer" containerID="014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.800174 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wxwpz" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.834889 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wxwpz"] Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.844315 4917 scope.go:117] "RemoveContainer" containerID="545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.853806 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wxwpz"] Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.868053 4917 scope.go:117] "RemoveContainer" containerID="d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.932101 4917 scope.go:117] "RemoveContainer" containerID="014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34" Mar 18 08:14:28 crc kubenswrapper[4917]: E0318 08:14:28.932478 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34\": container with ID starting with 014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34 not found: ID does not exist" containerID="014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.932528 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34"} err="failed to get container status \"014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34\": rpc error: code = NotFound desc = could not find container \"014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34\": container with ID starting with 014bdadc5567e86a7c16c64593844266abb83fa870cca748bb49b581e1e16f34 not 
found: ID does not exist" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.932560 4917 scope.go:117] "RemoveContainer" containerID="545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95" Mar 18 08:14:28 crc kubenswrapper[4917]: E0318 08:14:28.933092 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95\": container with ID starting with 545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95 not found: ID does not exist" containerID="545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.933134 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95"} err="failed to get container status \"545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95\": rpc error: code = NotFound desc = could not find container \"545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95\": container with ID starting with 545eb728a60edf1938e0430c13d8421390f25703db9c71d2cbe3c2be6ab08b95 not found: ID does not exist" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.933159 4917 scope.go:117] "RemoveContainer" containerID="d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01" Mar 18 08:14:28 crc kubenswrapper[4917]: E0318 08:14:28.933659 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01\": container with ID starting with d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01 not found: ID does not exist" containerID="d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.933696 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01"} err="failed to get container status \"d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01\": rpc error: code = NotFound desc = could not find container \"d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01\": container with ID starting with d2e0906e1dc3b79be11a91e50441e4f1aff91c40eaf7cde4291b06d08c7d2a01 not found: ID does not exist" Mar 18 08:14:28 crc kubenswrapper[4917]: I0318 08:14:28.958991 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.083403 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b599f978-cd6b-402a-836c-35e95943a9bb-logs\") pod \"b599f978-cd6b-402a-836c-35e95943a9bb\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.083541 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data\") pod \"b599f978-cd6b-402a-836c-35e95943a9bb\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.083709 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data-custom\") pod \"b599f978-cd6b-402a-836c-35e95943a9bb\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.083741 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kg8m\" (UniqueName: 
\"kubernetes.io/projected/b599f978-cd6b-402a-836c-35e95943a9bb-kube-api-access-8kg8m\") pod \"b599f978-cd6b-402a-836c-35e95943a9bb\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.083789 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-combined-ca-bundle\") pod \"b599f978-cd6b-402a-836c-35e95943a9bb\" (UID: \"b599f978-cd6b-402a-836c-35e95943a9bb\") " Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.083963 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b599f978-cd6b-402a-836c-35e95943a9bb-logs" (OuterVolumeSpecName: "logs") pod "b599f978-cd6b-402a-836c-35e95943a9bb" (UID: "b599f978-cd6b-402a-836c-35e95943a9bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.088192 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b599f978-cd6b-402a-836c-35e95943a9bb-kube-api-access-8kg8m" (OuterVolumeSpecName: "kube-api-access-8kg8m") pod "b599f978-cd6b-402a-836c-35e95943a9bb" (UID: "b599f978-cd6b-402a-836c-35e95943a9bb"). InnerVolumeSpecName "kube-api-access-8kg8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.090108 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b599f978-cd6b-402a-836c-35e95943a9bb" (UID: "b599f978-cd6b-402a-836c-35e95943a9bb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.109276 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b599f978-cd6b-402a-836c-35e95943a9bb" (UID: "b599f978-cd6b-402a-836c-35e95943a9bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.130711 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data" (OuterVolumeSpecName: "config-data") pod "b599f978-cd6b-402a-836c-35e95943a9bb" (UID: "b599f978-cd6b-402a-836c-35e95943a9bb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.187324 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b599f978-cd6b-402a-836c-35e95943a9bb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.187362 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.187374 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.187388 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kg8m\" (UniqueName: \"kubernetes.io/projected/b599f978-cd6b-402a-836c-35e95943a9bb-kube-api-access-8kg8m\") on node \"crc\" DevicePath \"\"" Mar 18 
08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.187398 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b599f978-cd6b-402a-836c-35e95943a9bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.786772 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" path="/var/lib/kubelet/pods/b6295b3b-3f74-448f-86de-ef6ffee5e111/volumes" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.810716 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-98b7c8d86-766p9" event={"ID":"b599f978-cd6b-402a-836c-35e95943a9bb","Type":"ContainerDied","Data":"165c74a8f7a968c155119e7902a7eaef372fc90bcbcc1778286ba417ce2f7dd9"} Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.810761 4917 scope.go:117] "RemoveContainer" containerID="d6fc648b719c44270340d862d106304600e9ef83b0576fd23f8e71aacb6afd37" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.810851 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-98b7c8d86-766p9" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.836741 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-98b7c8d86-766p9"] Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.839607 4917 scope.go:117] "RemoveContainer" containerID="5f4d1c36b8c91e533cc35bedd2f8e1cc4f051a1e4521676dbb6e060b27e83841" Mar 18 08:14:29 crc kubenswrapper[4917]: I0318 08:14:29.842295 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-98b7c8d86-766p9"] Mar 18 08:14:31 crc kubenswrapper[4917]: I0318 08:14:31.786342 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" path="/var/lib/kubelet/pods/b599f978-cd6b-402a-836c-35e95943a9bb/volumes" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.604207 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hzxqs"] Mar 18 08:14:52 crc kubenswrapper[4917]: E0318 08:14:52.605143 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5de187-2794-4a99-9be7-25c48156dd47" containerName="init" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605164 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5de187-2794-4a99-9be7-25c48156dd47" containerName="init" Mar 18 08:14:52 crc kubenswrapper[4917]: E0318 08:14:52.605181 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerName="extract-content" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605189 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerName="extract-content" Mar 18 08:14:52 crc kubenswrapper[4917]: E0318 08:14:52.605203 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef5de187-2794-4a99-9be7-25c48156dd47" containerName="dnsmasq-dns" Mar 18 08:14:52 crc kubenswrapper[4917]: 
I0318 08:14:52.605211 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef5de187-2794-4a99-9be7-25c48156dd47" containerName="dnsmasq-dns" Mar 18 08:14:52 crc kubenswrapper[4917]: E0318 08:14:52.605226 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api-log" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605231 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api-log" Mar 18 08:14:52 crc kubenswrapper[4917]: E0318 08:14:52.605243 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerName="registry-server" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605250 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerName="registry-server" Mar 18 08:14:52 crc kubenswrapper[4917]: E0318 08:14:52.605270 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerName="extract-utilities" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605279 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerName="extract-utilities" Mar 18 08:14:52 crc kubenswrapper[4917]: E0318 08:14:52.605294 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605301 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605492 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605510 
4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef5de187-2794-4a99-9be7-25c48156dd47" containerName="dnsmasq-dns" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605532 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6295b3b-3f74-448f-86de-ef6ffee5e111" containerName="registry-server" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.605544 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b599f978-cd6b-402a-836c-35e95943a9bb" containerName="barbican-api-log" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.606257 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.615824 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hzxqs"] Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.639727 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz444\" (UniqueName: \"kubernetes.io/projected/53d77b9f-d04c-485f-ab9d-465b393ba56f-kube-api-access-hz444\") pod \"neutron-db-create-hzxqs\" (UID: \"53d77b9f-d04c-485f-ab9d-465b393ba56f\") " pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.639888 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53d77b9f-d04c-485f-ab9d-465b393ba56f-operator-scripts\") pod \"neutron-db-create-hzxqs\" (UID: \"53d77b9f-d04c-485f-ab9d-465b393ba56f\") " pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.707632 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-a039-account-create-update-9hhlb"] Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.709058 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.712604 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.718521 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a039-account-create-update-9hhlb"] Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.744390 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8602779-52ca-408c-974d-bd5723b5cb8f-operator-scripts\") pod \"neutron-a039-account-create-update-9hhlb\" (UID: \"c8602779-52ca-408c-974d-bd5723b5cb8f\") " pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.744510 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz444\" (UniqueName: \"kubernetes.io/projected/53d77b9f-d04c-485f-ab9d-465b393ba56f-kube-api-access-hz444\") pod \"neutron-db-create-hzxqs\" (UID: \"53d77b9f-d04c-485f-ab9d-465b393ba56f\") " pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.744559 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szkvs\" (UniqueName: \"kubernetes.io/projected/c8602779-52ca-408c-974d-bd5723b5cb8f-kube-api-access-szkvs\") pod \"neutron-a039-account-create-update-9hhlb\" (UID: \"c8602779-52ca-408c-974d-bd5723b5cb8f\") " pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.744617 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53d77b9f-d04c-485f-ab9d-465b393ba56f-operator-scripts\") pod \"neutron-db-create-hzxqs\" (UID: 
\"53d77b9f-d04c-485f-ab9d-465b393ba56f\") " pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.748480 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53d77b9f-d04c-485f-ab9d-465b393ba56f-operator-scripts\") pod \"neutron-db-create-hzxqs\" (UID: \"53d77b9f-d04c-485f-ab9d-465b393ba56f\") " pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.766097 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz444\" (UniqueName: \"kubernetes.io/projected/53d77b9f-d04c-485f-ab9d-465b393ba56f-kube-api-access-hz444\") pod \"neutron-db-create-hzxqs\" (UID: \"53d77b9f-d04c-485f-ab9d-465b393ba56f\") " pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.845825 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szkvs\" (UniqueName: \"kubernetes.io/projected/c8602779-52ca-408c-974d-bd5723b5cb8f-kube-api-access-szkvs\") pod \"neutron-a039-account-create-update-9hhlb\" (UID: \"c8602779-52ca-408c-974d-bd5723b5cb8f\") " pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.846223 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8602779-52ca-408c-974d-bd5723b5cb8f-operator-scripts\") pod \"neutron-a039-account-create-update-9hhlb\" (UID: \"c8602779-52ca-408c-974d-bd5723b5cb8f\") " pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.847536 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8602779-52ca-408c-974d-bd5723b5cb8f-operator-scripts\") pod \"neutron-a039-account-create-update-9hhlb\" (UID: 
\"c8602779-52ca-408c-974d-bd5723b5cb8f\") " pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.861687 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szkvs\" (UniqueName: \"kubernetes.io/projected/c8602779-52ca-408c-974d-bd5723b5cb8f-kube-api-access-szkvs\") pod \"neutron-a039-account-create-update-9hhlb\" (UID: \"c8602779-52ca-408c-974d-bd5723b5cb8f\") " pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:52 crc kubenswrapper[4917]: I0318 08:14:52.931958 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:53 crc kubenswrapper[4917]: I0318 08:14:53.030157 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:53 crc kubenswrapper[4917]: I0318 08:14:53.395528 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hzxqs"] Mar 18 08:14:53 crc kubenswrapper[4917]: W0318 08:14:53.541797 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8602779_52ca_408c_974d_bd5723b5cb8f.slice/crio-a7674c576ceff076851f62a04a0d8a7f136742f08d14ec2cf2009191da6d30d3 WatchSource:0}: Error finding container a7674c576ceff076851f62a04a0d8a7f136742f08d14ec2cf2009191da6d30d3: Status 404 returned error can't find the container with id a7674c576ceff076851f62a04a0d8a7f136742f08d14ec2cf2009191da6d30d3 Mar 18 08:14:53 crc kubenswrapper[4917]: I0318 08:14:53.542741 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-a039-account-create-update-9hhlb"] Mar 18 08:14:54 crc kubenswrapper[4917]: I0318 08:14:54.092823 4917 generic.go:334] "Generic (PLEG): container finished" podID="53d77b9f-d04c-485f-ab9d-465b393ba56f" 
containerID="9de26a0491cdd67b4195032535fa3b2c14410e53fc91c6fd765377abab3bc834" exitCode=0 Mar 18 08:14:54 crc kubenswrapper[4917]: I0318 08:14:54.092896 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hzxqs" event={"ID":"53d77b9f-d04c-485f-ab9d-465b393ba56f","Type":"ContainerDied","Data":"9de26a0491cdd67b4195032535fa3b2c14410e53fc91c6fd765377abab3bc834"} Mar 18 08:14:54 crc kubenswrapper[4917]: I0318 08:14:54.092928 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hzxqs" event={"ID":"53d77b9f-d04c-485f-ab9d-465b393ba56f","Type":"ContainerStarted","Data":"64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c"} Mar 18 08:14:54 crc kubenswrapper[4917]: I0318 08:14:54.095798 4917 generic.go:334] "Generic (PLEG): container finished" podID="c8602779-52ca-408c-974d-bd5723b5cb8f" containerID="f085efb893f4bbac0631daa8589a6a7312a0c398f52b528f7af6b93c30d0f7c3" exitCode=0 Mar 18 08:14:54 crc kubenswrapper[4917]: I0318 08:14:54.095828 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a039-account-create-update-9hhlb" event={"ID":"c8602779-52ca-408c-974d-bd5723b5cb8f","Type":"ContainerDied","Data":"f085efb893f4bbac0631daa8589a6a7312a0c398f52b528f7af6b93c30d0f7c3"} Mar 18 08:14:54 crc kubenswrapper[4917]: I0318 08:14:54.095847 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a039-account-create-update-9hhlb" event={"ID":"c8602779-52ca-408c-974d-bd5723b5cb8f","Type":"ContainerStarted","Data":"a7674c576ceff076851f62a04a0d8a7f136742f08d14ec2cf2009191da6d30d3"} Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.462183 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.467927 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.500354 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz444\" (UniqueName: \"kubernetes.io/projected/53d77b9f-d04c-485f-ab9d-465b393ba56f-kube-api-access-hz444\") pod \"53d77b9f-d04c-485f-ab9d-465b393ba56f\" (UID: \"53d77b9f-d04c-485f-ab9d-465b393ba56f\") " Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.500409 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53d77b9f-d04c-485f-ab9d-465b393ba56f-operator-scripts\") pod \"53d77b9f-d04c-485f-ab9d-465b393ba56f\" (UID: \"53d77b9f-d04c-485f-ab9d-465b393ba56f\") " Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.501364 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d77b9f-d04c-485f-ab9d-465b393ba56f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53d77b9f-d04c-485f-ab9d-465b393ba56f" (UID: "53d77b9f-d04c-485f-ab9d-465b393ba56f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.507151 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d77b9f-d04c-485f-ab9d-465b393ba56f-kube-api-access-hz444" (OuterVolumeSpecName: "kube-api-access-hz444") pod "53d77b9f-d04c-485f-ab9d-465b393ba56f" (UID: "53d77b9f-d04c-485f-ab9d-465b393ba56f"). InnerVolumeSpecName "kube-api-access-hz444". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.602088 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szkvs\" (UniqueName: \"kubernetes.io/projected/c8602779-52ca-408c-974d-bd5723b5cb8f-kube-api-access-szkvs\") pod \"c8602779-52ca-408c-974d-bd5723b5cb8f\" (UID: \"c8602779-52ca-408c-974d-bd5723b5cb8f\") " Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.602197 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8602779-52ca-408c-974d-bd5723b5cb8f-operator-scripts\") pod \"c8602779-52ca-408c-974d-bd5723b5cb8f\" (UID: \"c8602779-52ca-408c-974d-bd5723b5cb8f\") " Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.602571 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8602779-52ca-408c-974d-bd5723b5cb8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c8602779-52ca-408c-974d-bd5723b5cb8f" (UID: "c8602779-52ca-408c-974d-bd5723b5cb8f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.602850 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz444\" (UniqueName: \"kubernetes.io/projected/53d77b9f-d04c-485f-ab9d-465b393ba56f-kube-api-access-hz444\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.602870 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53d77b9f-d04c-485f-ab9d-465b393ba56f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.602878 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c8602779-52ca-408c-974d-bd5723b5cb8f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.605574 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8602779-52ca-408c-974d-bd5723b5cb8f-kube-api-access-szkvs" (OuterVolumeSpecName: "kube-api-access-szkvs") pod "c8602779-52ca-408c-974d-bd5723b5cb8f" (UID: "c8602779-52ca-408c-974d-bd5723b5cb8f"). InnerVolumeSpecName "kube-api-access-szkvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:14:55 crc kubenswrapper[4917]: I0318 08:14:55.704418 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szkvs\" (UniqueName: \"kubernetes.io/projected/c8602779-52ca-408c-974d-bd5723b5cb8f-kube-api-access-szkvs\") on node \"crc\" DevicePath \"\"" Mar 18 08:14:56 crc kubenswrapper[4917]: I0318 08:14:56.122228 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hzxqs" event={"ID":"53d77b9f-d04c-485f-ab9d-465b393ba56f","Type":"ContainerDied","Data":"64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c"} Mar 18 08:14:56 crc kubenswrapper[4917]: I0318 08:14:56.122277 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hzxqs" Mar 18 08:14:56 crc kubenswrapper[4917]: I0318 08:14:56.122291 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c" Mar 18 08:14:56 crc kubenswrapper[4917]: I0318 08:14:56.124755 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-a039-account-create-update-9hhlb" event={"ID":"c8602779-52ca-408c-974d-bd5723b5cb8f","Type":"ContainerDied","Data":"a7674c576ceff076851f62a04a0d8a7f136742f08d14ec2cf2009191da6d30d3"} Mar 18 08:14:56 crc kubenswrapper[4917]: I0318 08:14:56.124915 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7674c576ceff076851f62a04a0d8a7f136742f08d14ec2cf2009191da6d30d3" Mar 18 08:14:56 crc kubenswrapper[4917]: I0318 08:14:56.124780 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-a039-account-create-update-9hhlb" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.933491 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dxm9h"] Mar 18 08:14:57 crc kubenswrapper[4917]: E0318 08:14:57.934332 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8602779-52ca-408c-974d-bd5723b5cb8f" containerName="mariadb-account-create-update" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.934349 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8602779-52ca-408c-974d-bd5723b5cb8f" containerName="mariadb-account-create-update" Mar 18 08:14:57 crc kubenswrapper[4917]: E0318 08:14:57.934397 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d77b9f-d04c-485f-ab9d-465b393ba56f" containerName="mariadb-database-create" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.934404 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d77b9f-d04c-485f-ab9d-465b393ba56f" containerName="mariadb-database-create" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.934632 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="53d77b9f-d04c-485f-ab9d-465b393ba56f" containerName="mariadb-database-create" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.934649 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8602779-52ca-408c-974d-bd5723b5cb8f" containerName="mariadb-account-create-update" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.935341 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.941341 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.941607 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s25j6" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.941756 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 08:14:57 crc kubenswrapper[4917]: I0318 08:14:57.943484 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dxm9h"] Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.069337 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lpg\" (UniqueName: \"kubernetes.io/projected/34878b32-0d4d-4d01-8898-90e7393b3b49-kube-api-access-x4lpg\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.069435 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-config\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.069501 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-combined-ca-bundle\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.170605 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x4lpg\" (UniqueName: \"kubernetes.io/projected/34878b32-0d4d-4d01-8898-90e7393b3b49-kube-api-access-x4lpg\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.170752 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-config\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.171855 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-combined-ca-bundle\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.179752 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-combined-ca-bundle\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.187223 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-config\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.191832 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lpg\" (UniqueName: 
\"kubernetes.io/projected/34878b32-0d4d-4d01-8898-90e7393b3b49-kube-api-access-x4lpg\") pod \"neutron-db-sync-dxm9h\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.258669 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:14:58 crc kubenswrapper[4917]: I0318 08:14:58.717743 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dxm9h"] Mar 18 08:14:58 crc kubenswrapper[4917]: W0318 08:14:58.721257 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34878b32_0d4d_4d01_8898_90e7393b3b49.slice/crio-235479b311223d0cbb36fbf215e2910d56d8c8c0d939007ea5ff7d68c20d3f92 WatchSource:0}: Error finding container 235479b311223d0cbb36fbf215e2910d56d8c8c0d939007ea5ff7d68c20d3f92: Status 404 returned error can't find the container with id 235479b311223d0cbb36fbf215e2910d56d8c8c0d939007ea5ff7d68c20d3f92 Mar 18 08:14:59 crc kubenswrapper[4917]: I0318 08:14:59.151958 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dxm9h" event={"ID":"34878b32-0d4d-4d01-8898-90e7393b3b49","Type":"ContainerStarted","Data":"32e1c3096ccac1a4fe42d7ddce9ac61a328a0bfd323c79202551863fe40b82bd"} Mar 18 08:14:59 crc kubenswrapper[4917]: I0318 08:14:59.152320 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dxm9h" event={"ID":"34878b32-0d4d-4d01-8898-90e7393b3b49","Type":"ContainerStarted","Data":"235479b311223d0cbb36fbf215e2910d56d8c8c0d939007ea5ff7d68c20d3f92"} Mar 18 08:14:59 crc kubenswrapper[4917]: I0318 08:14:59.178440 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dxm9h" podStartSLOduration=2.178417843 podStartE2EDuration="2.178417843s" podCreationTimestamp="2026-03-18 08:14:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:14:59.17828088 +0000 UTC m=+5284.119435594" watchObservedRunningTime="2026-03-18 08:14:59.178417843 +0000 UTC m=+5284.119572557" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.132494 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k"] Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.134567 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.137273 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.139888 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.143448 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k"] Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.212691 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a612ff9-5b00-44d7-9966-5fba72c2963b-config-volume\") pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.212923 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkcmh\" (UniqueName: \"kubernetes.io/projected/2a612ff9-5b00-44d7-9966-5fba72c2963b-kube-api-access-zkcmh\") 
pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.213004 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a612ff9-5b00-44d7-9966-5fba72c2963b-secret-volume\") pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.315134 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a612ff9-5b00-44d7-9966-5fba72c2963b-config-volume\") pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.315335 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkcmh\" (UniqueName: \"kubernetes.io/projected/2a612ff9-5b00-44d7-9966-5fba72c2963b-kube-api-access-zkcmh\") pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.315395 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a612ff9-5b00-44d7-9966-5fba72c2963b-secret-volume\") pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.316452 4917 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a612ff9-5b00-44d7-9966-5fba72c2963b-config-volume\") pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.333527 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a612ff9-5b00-44d7-9966-5fba72c2963b-secret-volume\") pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.341295 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkcmh\" (UniqueName: \"kubernetes.io/projected/2a612ff9-5b00-44d7-9966-5fba72c2963b-kube-api-access-zkcmh\") pod \"collect-profiles-29563695-b954k\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.462413 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:00 crc kubenswrapper[4917]: E0318 08:15:00.557460 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice/crio-64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8602779_52ca_408c_974d_bd5723b5cb8f.slice\": RecentStats: unable to find data in memory cache]" Mar 18 08:15:00 crc kubenswrapper[4917]: I0318 08:15:00.953004 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k"] Mar 18 08:15:01 crc kubenswrapper[4917]: I0318 08:15:01.172086 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" event={"ID":"2a612ff9-5b00-44d7-9966-5fba72c2963b","Type":"ContainerStarted","Data":"4d0274c6dd4cbda94d8d055eca4e35e8ea9d6f36db2ba77958742562e530e48d"} Mar 18 08:15:01 crc kubenswrapper[4917]: I0318 08:15:01.172458 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" event={"ID":"2a612ff9-5b00-44d7-9966-5fba72c2963b","Type":"ContainerStarted","Data":"1c920b2069066efb2ecd86d50a53bb10438d1dddbd84736d5097505c627360c7"} Mar 18 08:15:01 crc kubenswrapper[4917]: I0318 08:15:01.196951 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" podStartSLOduration=1.196934896 
podStartE2EDuration="1.196934896s" podCreationTimestamp="2026-03-18 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:15:01.192478418 +0000 UTC m=+5286.133633132" watchObservedRunningTime="2026-03-18 08:15:01.196934896 +0000 UTC m=+5286.138089610" Mar 18 08:15:02 crc kubenswrapper[4917]: I0318 08:15:02.186843 4917 generic.go:334] "Generic (PLEG): container finished" podID="2a612ff9-5b00-44d7-9966-5fba72c2963b" containerID="4d0274c6dd4cbda94d8d055eca4e35e8ea9d6f36db2ba77958742562e530e48d" exitCode=0 Mar 18 08:15:02 crc kubenswrapper[4917]: I0318 08:15:02.186924 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" event={"ID":"2a612ff9-5b00-44d7-9966-5fba72c2963b","Type":"ContainerDied","Data":"4d0274c6dd4cbda94d8d055eca4e35e8ea9d6f36db2ba77958742562e530e48d"} Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.198730 4917 generic.go:334] "Generic (PLEG): container finished" podID="34878b32-0d4d-4d01-8898-90e7393b3b49" containerID="32e1c3096ccac1a4fe42d7ddce9ac61a328a0bfd323c79202551863fe40b82bd" exitCode=0 Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.198930 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dxm9h" event={"ID":"34878b32-0d4d-4d01-8898-90e7393b3b49","Type":"ContainerDied","Data":"32e1c3096ccac1a4fe42d7ddce9ac61a328a0bfd323c79202551863fe40b82bd"} Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.620442 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.676822 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a612ff9-5b00-44d7-9966-5fba72c2963b-config-volume\") pod \"2a612ff9-5b00-44d7-9966-5fba72c2963b\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.676947 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkcmh\" (UniqueName: \"kubernetes.io/projected/2a612ff9-5b00-44d7-9966-5fba72c2963b-kube-api-access-zkcmh\") pod \"2a612ff9-5b00-44d7-9966-5fba72c2963b\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.677026 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a612ff9-5b00-44d7-9966-5fba72c2963b-secret-volume\") pod \"2a612ff9-5b00-44d7-9966-5fba72c2963b\" (UID: \"2a612ff9-5b00-44d7-9966-5fba72c2963b\") " Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.678354 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a612ff9-5b00-44d7-9966-5fba72c2963b-config-volume" (OuterVolumeSpecName: "config-volume") pod "2a612ff9-5b00-44d7-9966-5fba72c2963b" (UID: "2a612ff9-5b00-44d7-9966-5fba72c2963b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.684995 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a612ff9-5b00-44d7-9966-5fba72c2963b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2a612ff9-5b00-44d7-9966-5fba72c2963b" (UID: "2a612ff9-5b00-44d7-9966-5fba72c2963b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.686251 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a612ff9-5b00-44d7-9966-5fba72c2963b-kube-api-access-zkcmh" (OuterVolumeSpecName: "kube-api-access-zkcmh") pod "2a612ff9-5b00-44d7-9966-5fba72c2963b" (UID: "2a612ff9-5b00-44d7-9966-5fba72c2963b"). InnerVolumeSpecName "kube-api-access-zkcmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.780679 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2a612ff9-5b00-44d7-9966-5fba72c2963b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.780709 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkcmh\" (UniqueName: \"kubernetes.io/projected/2a612ff9-5b00-44d7-9966-5fba72c2963b-kube-api-access-zkcmh\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:03 crc kubenswrapper[4917]: I0318 08:15:03.780725 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2a612ff9-5b00-44d7-9966-5fba72c2963b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.217270 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.218381 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k" event={"ID":"2a612ff9-5b00-44d7-9966-5fba72c2963b","Type":"ContainerDied","Data":"1c920b2069066efb2ecd86d50a53bb10438d1dddbd84736d5097505c627360c7"} Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.218434 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c920b2069066efb2ecd86d50a53bb10438d1dddbd84736d5097505c627360c7" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.310851 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5"] Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.324797 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563650-dpfc5"] Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.614571 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.699382 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4lpg\" (UniqueName: \"kubernetes.io/projected/34878b32-0d4d-4d01-8898-90e7393b3b49-kube-api-access-x4lpg\") pod \"34878b32-0d4d-4d01-8898-90e7393b3b49\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.699806 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-config\") pod \"34878b32-0d4d-4d01-8898-90e7393b3b49\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.699876 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-combined-ca-bundle\") pod \"34878b32-0d4d-4d01-8898-90e7393b3b49\" (UID: \"34878b32-0d4d-4d01-8898-90e7393b3b49\") " Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.707003 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34878b32-0d4d-4d01-8898-90e7393b3b49-kube-api-access-x4lpg" (OuterVolumeSpecName: "kube-api-access-x4lpg") pod "34878b32-0d4d-4d01-8898-90e7393b3b49" (UID: "34878b32-0d4d-4d01-8898-90e7393b3b49"). InnerVolumeSpecName "kube-api-access-x4lpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.720227 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34878b32-0d4d-4d01-8898-90e7393b3b49" (UID: "34878b32-0d4d-4d01-8898-90e7393b3b49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.733044 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-config" (OuterVolumeSpecName: "config") pod "34878b32-0d4d-4d01-8898-90e7393b3b49" (UID: "34878b32-0d4d-4d01-8898-90e7393b3b49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.803755 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.803865 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34878b32-0d4d-4d01-8898-90e7393b3b49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:04 crc kubenswrapper[4917]: I0318 08:15:04.803937 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4lpg\" (UniqueName: \"kubernetes.io/projected/34878b32-0d4d-4d01-8898-90e7393b3b49-kube-api-access-x4lpg\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.229956 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dxm9h" event={"ID":"34878b32-0d4d-4d01-8898-90e7393b3b49","Type":"ContainerDied","Data":"235479b311223d0cbb36fbf215e2910d56d8c8c0d939007ea5ff7d68c20d3f92"} Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.230015 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="235479b311223d0cbb36fbf215e2910d56d8c8c0d939007ea5ff7d68c20d3f92" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.230033 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dxm9h" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.525641 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cd658d79-dpjcx"] Mar 18 08:15:05 crc kubenswrapper[4917]: E0318 08:15:05.525984 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a612ff9-5b00-44d7-9966-5fba72c2963b" containerName="collect-profiles" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.525998 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a612ff9-5b00-44d7-9966-5fba72c2963b" containerName="collect-profiles" Mar 18 08:15:05 crc kubenswrapper[4917]: E0318 08:15:05.526009 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34878b32-0d4d-4d01-8898-90e7393b3b49" containerName="neutron-db-sync" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.526015 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="34878b32-0d4d-4d01-8898-90e7393b3b49" containerName="neutron-db-sync" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.528705 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a612ff9-5b00-44d7-9966-5fba72c2963b" containerName="collect-profiles" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.528747 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="34878b32-0d4d-4d01-8898-90e7393b3b49" containerName="neutron-db-sync" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.529826 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.557749 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd658d79-dpjcx"] Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.617377 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.617422 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-config\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.617452 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.617651 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-dns-svc\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.617697 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-57h6l\" (UniqueName: \"kubernetes.io/projected/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-kube-api-access-57h6l\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.719186 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f5dd69f86-hl7mf"] Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.719723 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57h6l\" (UniqueName: \"kubernetes.io/projected/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-kube-api-access-57h6l\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.719872 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.719971 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-config\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.720077 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc 
kubenswrapper[4917]: I0318 08:15:05.720260 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-dns-svc\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.721634 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-dns-svc\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.721750 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-config\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.721775 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-sb\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.721809 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-nb\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.722403 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.729839 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.730141 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.733636 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.733764 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-s25j6" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.765176 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f5dd69f86-hl7mf"] Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.770344 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57h6l\" (UniqueName: \"kubernetes.io/projected/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-kube-api-access-57h6l\") pod \"dnsmasq-dns-6cd658d79-dpjcx\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.787574 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18d5cfc7-f7f0-488a-86bc-f75136f39ec3" path="/var/lib/kubelet/pods/18d5cfc7-f7f0-488a-86bc-f75136f39ec3/volumes" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.822117 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85shq\" (UniqueName: \"kubernetes.io/projected/03cdb896-dde0-4c67-ac8c-96635e52e8f0-kube-api-access-85shq\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 
08:15:05.822199 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-config\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.822232 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-ovndb-tls-certs\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.822254 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-httpd-config\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.822278 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-combined-ca-bundle\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.850703 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.923988 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-ovndb-tls-certs\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.924308 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-httpd-config\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.924339 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-combined-ca-bundle\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.924416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85shq\" (UniqueName: \"kubernetes.io/projected/03cdb896-dde0-4c67-ac8c-96635e52e8f0-kube-api-access-85shq\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.924472 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-config\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc 
kubenswrapper[4917]: I0318 08:15:05.929247 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-combined-ca-bundle\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.929788 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-ovndb-tls-certs\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.930637 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-config\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.938083 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-httpd-config\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:05 crc kubenswrapper[4917]: I0318 08:15:05.946339 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85shq\" (UniqueName: \"kubernetes.io/projected/03cdb896-dde0-4c67-ac8c-96635e52e8f0-kube-api-access-85shq\") pod \"neutron-7f5dd69f86-hl7mf\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:06 crc kubenswrapper[4917]: I0318 08:15:06.037378 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:06 crc kubenswrapper[4917]: I0318 08:15:06.311651 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cd658d79-dpjcx"] Mar 18 08:15:06 crc kubenswrapper[4917]: W0318 08:15:06.317777 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f20036_5cf5_4dc4_92bc_e819bebcd8c7.slice/crio-f0a3c92dc1a10430d4737530a6eb0b2edf198cb05eb7694e8a7f6a748a4dc50b WatchSource:0}: Error finding container f0a3c92dc1a10430d4737530a6eb0b2edf198cb05eb7694e8a7f6a748a4dc50b: Status 404 returned error can't find the container with id f0a3c92dc1a10430d4737530a6eb0b2edf198cb05eb7694e8a7f6a748a4dc50b Mar 18 08:15:06 crc kubenswrapper[4917]: I0318 08:15:06.567389 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f5dd69f86-hl7mf"] Mar 18 08:15:06 crc kubenswrapper[4917]: W0318 08:15:06.575476 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03cdb896_dde0_4c67_ac8c_96635e52e8f0.slice/crio-6de97fe6ea0e20b5f0ca72993c2949be53f4de19145d34c4e73bd97b940e9adf WatchSource:0}: Error finding container 6de97fe6ea0e20b5f0ca72993c2949be53f4de19145d34c4e73bd97b940e9adf: Status 404 returned error can't find the container with id 6de97fe6ea0e20b5f0ca72993c2949be53f4de19145d34c4e73bd97b940e9adf Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.246782 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5dd69f86-hl7mf" event={"ID":"03cdb896-dde0-4c67-ac8c-96635e52e8f0","Type":"ContainerStarted","Data":"ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58"} Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.247080 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5dd69f86-hl7mf" 
event={"ID":"03cdb896-dde0-4c67-ac8c-96635e52e8f0","Type":"ContainerStarted","Data":"ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c"} Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.247091 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5dd69f86-hl7mf" event={"ID":"03cdb896-dde0-4c67-ac8c-96635e52e8f0","Type":"ContainerStarted","Data":"6de97fe6ea0e20b5f0ca72993c2949be53f4de19145d34c4e73bd97b940e9adf"} Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.247106 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.248764 4917 generic.go:334] "Generic (PLEG): container finished" podID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" containerID="9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93" exitCode=0 Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.248808 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" event={"ID":"69f20036-5cf5-4dc4-92bc-e819bebcd8c7","Type":"ContainerDied","Data":"9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93"} Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.248829 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" event={"ID":"69f20036-5cf5-4dc4-92bc-e819bebcd8c7","Type":"ContainerStarted","Data":"f0a3c92dc1a10430d4737530a6eb0b2edf198cb05eb7694e8a7f6a748a4dc50b"} Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.272485 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f5dd69f86-hl7mf" podStartSLOduration=2.272466847 podStartE2EDuration="2.272466847s" podCreationTimestamp="2026-03-18 08:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:15:07.268573943 +0000 UTC m=+5292.209728657" 
watchObservedRunningTime="2026-03-18 08:15:07.272466847 +0000 UTC m=+5292.213621561" Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.522154 4917 scope.go:117] "RemoveContainer" containerID="9d74cb4063b6f51d9055cc46838ad8e4c5077a5581a908195d13add58f6d0b14" Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.542385 4917 scope.go:117] "RemoveContainer" containerID="2d8c9f61990d5db44b16d3ca04a1ab5996fae555fa549183e3785c24ea2e8fb0" Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.945967 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6d49c699bc-wmgbl"] Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.947177 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.948991 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 18 08:15:07 crc kubenswrapper[4917]: I0318 08:15:07.969327 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.008239 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d49c699bc-wmgbl"] Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.084628 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-public-tls-certs\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.084688 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fwfv\" (UniqueName: \"kubernetes.io/projected/776717e0-ca51-4856-abc7-f1b03d0d312b-kube-api-access-9fwfv\") pod 
\"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.084717 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-config\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.084784 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-combined-ca-bundle\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.084812 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-internal-tls-certs\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.084833 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-httpd-config\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.084884 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-ovndb-tls-certs\") pod 
\"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.186677 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-config\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.186784 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-combined-ca-bundle\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.186811 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-internal-tls-certs\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.186832 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-httpd-config\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.186885 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-ovndb-tls-certs\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " 
pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.186918 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-public-tls-certs\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.186940 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fwfv\" (UniqueName: \"kubernetes.io/projected/776717e0-ca51-4856-abc7-f1b03d0d312b-kube-api-access-9fwfv\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.192543 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-combined-ca-bundle\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.193977 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-public-tls-certs\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.194297 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-internal-tls-certs\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 
08:15:08.195486 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-config\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.196725 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-ovndb-tls-certs\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.203827 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fwfv\" (UniqueName: \"kubernetes.io/projected/776717e0-ca51-4856-abc7-f1b03d0d312b-kube-api-access-9fwfv\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.215228 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/776717e0-ca51-4856-abc7-f1b03d0d312b-httpd-config\") pod \"neutron-6d49c699bc-wmgbl\" (UID: \"776717e0-ca51-4856-abc7-f1b03d0d312b\") " pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.258036 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" event={"ID":"69f20036-5cf5-4dc4-92bc-e819bebcd8c7","Type":"ContainerStarted","Data":"1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de"} Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.258121 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.260902 4917 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.282303 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" podStartSLOduration=3.282285667 podStartE2EDuration="3.282285667s" podCreationTimestamp="2026-03-18 08:15:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:15:08.276561798 +0000 UTC m=+5293.217716512" watchObservedRunningTime="2026-03-18 08:15:08.282285667 +0000 UTC m=+5293.223440371" Mar 18 08:15:08 crc kubenswrapper[4917]: I0318 08:15:08.853110 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6d49c699bc-wmgbl"] Mar 18 08:15:08 crc kubenswrapper[4917]: W0318 08:15:08.857291 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod776717e0_ca51_4856_abc7_f1b03d0d312b.slice/crio-1bb595c317d54686decde3a75dc42900306ad7357d712ceabb3d95b0e624dba5 WatchSource:0}: Error finding container 1bb595c317d54686decde3a75dc42900306ad7357d712ceabb3d95b0e624dba5: Status 404 returned error can't find the container with id 1bb595c317d54686decde3a75dc42900306ad7357d712ceabb3d95b0e624dba5 Mar 18 08:15:09 crc kubenswrapper[4917]: I0318 08:15:09.267981 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d49c699bc-wmgbl" event={"ID":"776717e0-ca51-4856-abc7-f1b03d0d312b","Type":"ContainerStarted","Data":"949112ef98c8f296c3e896fd22fe47903eac61f6ea6b62671586a20470a401a8"} Mar 18 08:15:09 crc kubenswrapper[4917]: I0318 08:15:09.268382 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d49c699bc-wmgbl" 
event={"ID":"776717e0-ca51-4856-abc7-f1b03d0d312b","Type":"ContainerStarted","Data":"20c50f37a012c3f20736194e85bf2d751fffc2f4fc3c982e2927262f96356991"} Mar 18 08:15:09 crc kubenswrapper[4917]: I0318 08:15:09.268405 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6d49c699bc-wmgbl" event={"ID":"776717e0-ca51-4856-abc7-f1b03d0d312b","Type":"ContainerStarted","Data":"1bb595c317d54686decde3a75dc42900306ad7357d712ceabb3d95b0e624dba5"} Mar 18 08:15:10 crc kubenswrapper[4917]: I0318 08:15:10.276704 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:10 crc kubenswrapper[4917]: E0318 08:15:10.756859 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice/crio-64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8602779_52ca_408c_974d_bd5723b5cb8f.slice\": RecentStats: unable to find data in memory cache]" Mar 18 08:15:15 crc kubenswrapper[4917]: I0318 08:15:15.852424 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:15:15 crc kubenswrapper[4917]: I0318 08:15:15.881895 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6d49c699bc-wmgbl" podStartSLOduration=8.881879685 podStartE2EDuration="8.881879685s" podCreationTimestamp="2026-03-18 08:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
08:15:09.311066507 +0000 UTC m=+5294.252221231" watchObservedRunningTime="2026-03-18 08:15:15.881879685 +0000 UTC m=+5300.823034399" Mar 18 08:15:15 crc kubenswrapper[4917]: I0318 08:15:15.937790 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798fdcd749-v7b8s"] Mar 18 08:15:15 crc kubenswrapper[4917]: I0318 08:15:15.938128 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" podUID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerName="dnsmasq-dns" containerID="cri-o://22025d5050cdbe90cde38bef0fd2d94f4b8de52c90b73f135a59486261cd389a" gracePeriod=10 Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.342731 4917 generic.go:334] "Generic (PLEG): container finished" podID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerID="22025d5050cdbe90cde38bef0fd2d94f4b8de52c90b73f135a59486261cd389a" exitCode=0 Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.342765 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" event={"ID":"12e567d2-c790-47f7-86b6-11c50caedc2a","Type":"ContainerDied","Data":"22025d5050cdbe90cde38bef0fd2d94f4b8de52c90b73f135a59486261cd389a"} Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.609220 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.757088 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-sb\") pod \"12e567d2-c790-47f7-86b6-11c50caedc2a\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.757146 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-config\") pod \"12e567d2-c790-47f7-86b6-11c50caedc2a\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.757187 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbccv\" (UniqueName: \"kubernetes.io/projected/12e567d2-c790-47f7-86b6-11c50caedc2a-kube-api-access-dbccv\") pod \"12e567d2-c790-47f7-86b6-11c50caedc2a\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.757362 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-nb\") pod \"12e567d2-c790-47f7-86b6-11c50caedc2a\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.757406 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-dns-svc\") pod \"12e567d2-c790-47f7-86b6-11c50caedc2a\" (UID: \"12e567d2-c790-47f7-86b6-11c50caedc2a\") " Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.771787 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/12e567d2-c790-47f7-86b6-11c50caedc2a-kube-api-access-dbccv" (OuterVolumeSpecName: "kube-api-access-dbccv") pod "12e567d2-c790-47f7-86b6-11c50caedc2a" (UID: "12e567d2-c790-47f7-86b6-11c50caedc2a"). InnerVolumeSpecName "kube-api-access-dbccv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.799437 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12e567d2-c790-47f7-86b6-11c50caedc2a" (UID: "12e567d2-c790-47f7-86b6-11c50caedc2a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.812319 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-config" (OuterVolumeSpecName: "config") pod "12e567d2-c790-47f7-86b6-11c50caedc2a" (UID: "12e567d2-c790-47f7-86b6-11c50caedc2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.829167 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12e567d2-c790-47f7-86b6-11c50caedc2a" (UID: "12e567d2-c790-47f7-86b6-11c50caedc2a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.851310 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12e567d2-c790-47f7-86b6-11c50caedc2a" (UID: "12e567d2-c790-47f7-86b6-11c50caedc2a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.859430 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.859478 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.859495 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.859507 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12e567d2-c790-47f7-86b6-11c50caedc2a-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:16 crc kubenswrapper[4917]: I0318 08:15:16.859519 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbccv\" (UniqueName: \"kubernetes.io/projected/12e567d2-c790-47f7-86b6-11c50caedc2a-kube-api-access-dbccv\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:17 crc kubenswrapper[4917]: I0318 08:15:17.351677 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" event={"ID":"12e567d2-c790-47f7-86b6-11c50caedc2a","Type":"ContainerDied","Data":"d70204ce8a412e968dc33ceda96ae6350dc88e811f914ae232c9a8e5353527a0"} Mar 18 08:15:17 crc kubenswrapper[4917]: I0318 08:15:17.351724 4917 scope.go:117] "RemoveContainer" containerID="22025d5050cdbe90cde38bef0fd2d94f4b8de52c90b73f135a59486261cd389a" Mar 18 08:15:17 crc kubenswrapper[4917]: I0318 08:15:17.351960 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" Mar 18 08:15:17 crc kubenswrapper[4917]: I0318 08:15:17.377286 4917 scope.go:117] "RemoveContainer" containerID="be99f3cef75880fa8331d548ecb4a127ac6706aa57058a7720f3b031bd93c8dc" Mar 18 08:15:17 crc kubenswrapper[4917]: I0318 08:15:17.391468 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-798fdcd749-v7b8s"] Mar 18 08:15:17 crc kubenswrapper[4917]: I0318 08:15:17.399397 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-798fdcd749-v7b8s"] Mar 18 08:15:17 crc kubenswrapper[4917]: I0318 08:15:17.790003 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e567d2-c790-47f7-86b6-11c50caedc2a" path="/var/lib/kubelet/pods/12e567d2-c790-47f7-86b6-11c50caedc2a/volumes" Mar 18 08:15:20 crc kubenswrapper[4917]: E0318 08:15:20.934949 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice/crio-64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8602779_52ca_408c_974d_bd5723b5cb8f.slice\": RecentStats: unable to find data in memory cache]" Mar 18 08:15:21 crc kubenswrapper[4917]: I0318 08:15:21.234112 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-798fdcd749-v7b8s" podUID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.1.64:5353: i/o timeout" Mar 18 08:15:30 crc kubenswrapper[4917]: I0318 08:15:30.066216 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/root-account-create-update-jnk6v"] Mar 18 08:15:30 crc kubenswrapper[4917]: I0318 08:15:30.074762 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jnk6v"] Mar 18 08:15:31 crc kubenswrapper[4917]: E0318 08:15:31.142399 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice/crio-64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8602779_52ca_408c_974d_bd5723b5cb8f.slice\": RecentStats: unable to find data in memory cache]" Mar 18 08:15:31 crc kubenswrapper[4917]: I0318 08:15:31.786392 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd2030a7-9b6f-4481-8f9f-ac6bb054e97b" path="/var/lib/kubelet/pods/bd2030a7-9b6f-4481-8f9f-ac6bb054e97b/volumes" Mar 18 08:15:32 crc kubenswrapper[4917]: I0318 08:15:32.929092 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:15:32 crc kubenswrapper[4917]: I0318 08:15:32.929169 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:15:36 crc kubenswrapper[4917]: 
I0318 08:15:36.050681 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:38 crc kubenswrapper[4917]: I0318 08:15:38.286851 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6d49c699bc-wmgbl" Mar 18 08:15:38 crc kubenswrapper[4917]: I0318 08:15:38.374050 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f5dd69f86-hl7mf"] Mar 18 08:15:38 crc kubenswrapper[4917]: I0318 08:15:38.374396 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f5dd69f86-hl7mf" podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerName="neutron-api" containerID="cri-o://ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c" gracePeriod=30 Mar 18 08:15:38 crc kubenswrapper[4917]: I0318 08:15:38.374848 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f5dd69f86-hl7mf" podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerName="neutron-httpd" containerID="cri-o://ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58" gracePeriod=30 Mar 18 08:15:38 crc kubenswrapper[4917]: I0318 08:15:38.563408 4917 generic.go:334] "Generic (PLEG): container finished" podID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerID="ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58" exitCode=0 Mar 18 08:15:38 crc kubenswrapper[4917]: I0318 08:15:38.563638 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5dd69f86-hl7mf" event={"ID":"03cdb896-dde0-4c67-ac8c-96635e52e8f0","Type":"ContainerDied","Data":"ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58"} Mar 18 08:15:41 crc kubenswrapper[4917]: E0318 08:15:41.344889 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice/crio-64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8602779_52ca_408c_974d_bd5723b5cb8f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice\": RecentStats: unable to find data in memory cache]" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.566538 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.607405 4917 generic.go:334] "Generic (PLEG): container finished" podID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerID="ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c" exitCode=0 Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.607457 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5dd69f86-hl7mf" event={"ID":"03cdb896-dde0-4c67-ac8c-96635e52e8f0","Type":"ContainerDied","Data":"ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c"} Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.607532 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f5dd69f86-hl7mf" event={"ID":"03cdb896-dde0-4c67-ac8c-96635e52e8f0","Type":"ContainerDied","Data":"6de97fe6ea0e20b5f0ca72993c2949be53f4de19145d34c4e73bd97b940e9adf"} Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.607490 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f5dd69f86-hl7mf" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.607666 4917 scope.go:117] "RemoveContainer" containerID="ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.636529 4917 scope.go:117] "RemoveContainer" containerID="ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.653264 4917 scope.go:117] "RemoveContainer" containerID="ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58" Mar 18 08:15:41 crc kubenswrapper[4917]: E0318 08:15:41.653740 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58\": container with ID starting with ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58 not found: ID does not exist" containerID="ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.653801 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58"} err="failed to get container status \"ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58\": rpc error: code = NotFound desc = could not find container \"ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58\": container with ID starting with ba185c0e4fadc00809ddb922a8f1e7679aacc79d91d6320848f5edf3d4039d58 not found: ID does not exist" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.653916 4917 scope.go:117] "RemoveContainer" containerID="ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c" Mar 18 08:15:41 crc kubenswrapper[4917]: E0318 08:15:41.654318 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c\": container with ID starting with ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c not found: ID does not exist" containerID="ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.654386 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c"} err="failed to get container status \"ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c\": rpc error: code = NotFound desc = could not find container \"ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c\": container with ID starting with ae5cc7b811013c5225887449b58d5456a506fb4047da2f3c2aeee428bf53d53c not found: ID does not exist" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.728223 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-config\") pod \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.729884 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-httpd-config\") pod \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.729973 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-combined-ca-bundle\") pod \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 
08:15:41.730024 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-ovndb-tls-certs\") pod \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.730160 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85shq\" (UniqueName: \"kubernetes.io/projected/03cdb896-dde0-4c67-ac8c-96635e52e8f0-kube-api-access-85shq\") pod \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\" (UID: \"03cdb896-dde0-4c67-ac8c-96635e52e8f0\") " Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.740319 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "03cdb896-dde0-4c67-ac8c-96635e52e8f0" (UID: "03cdb896-dde0-4c67-ac8c-96635e52e8f0"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.746919 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03cdb896-dde0-4c67-ac8c-96635e52e8f0-kube-api-access-85shq" (OuterVolumeSpecName: "kube-api-access-85shq") pod "03cdb896-dde0-4c67-ac8c-96635e52e8f0" (UID: "03cdb896-dde0-4c67-ac8c-96635e52e8f0"). InnerVolumeSpecName "kube-api-access-85shq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.814163 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03cdb896-dde0-4c67-ac8c-96635e52e8f0" (UID: "03cdb896-dde0-4c67-ac8c-96635e52e8f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.816346 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-config" (OuterVolumeSpecName: "config") pod "03cdb896-dde0-4c67-ac8c-96635e52e8f0" (UID: "03cdb896-dde0-4c67-ac8c-96635e52e8f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.834981 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.835016 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.835030 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.835044 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85shq\" (UniqueName: \"kubernetes.io/projected/03cdb896-dde0-4c67-ac8c-96635e52e8f0-kube-api-access-85shq\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.848454 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "03cdb896-dde0-4c67-ac8c-96635e52e8f0" (UID: "03cdb896-dde0-4c67-ac8c-96635e52e8f0"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.938363 4917 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/03cdb896-dde0-4c67-ac8c-96635e52e8f0-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.956446 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f5dd69f86-hl7mf"] Mar 18 08:15:41 crc kubenswrapper[4917]: I0318 08:15:41.962490 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f5dd69f86-hl7mf"] Mar 18 08:15:43 crc kubenswrapper[4917]: I0318 08:15:43.789965 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" path="/var/lib/kubelet/pods/03cdb896-dde0-4c67-ac8c-96635e52e8f0/volumes" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.638430 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9bq"] Mar 18 08:15:46 crc kubenswrapper[4917]: E0318 08:15:46.639244 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerName="dnsmasq-dns" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.639262 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerName="dnsmasq-dns" Mar 18 08:15:46 crc kubenswrapper[4917]: E0318 08:15:46.639292 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerName="neutron-api" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.639300 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerName="neutron-api" Mar 18 08:15:46 crc kubenswrapper[4917]: E0318 08:15:46.639318 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerName="neutron-httpd" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.639327 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerName="neutron-httpd" Mar 18 08:15:46 crc kubenswrapper[4917]: E0318 08:15:46.639348 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerName="init" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.639356 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerName="init" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.639574 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerName="neutron-httpd" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.639622 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e567d2-c790-47f7-86b6-11c50caedc2a" containerName="dnsmasq-dns" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.639650 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="03cdb896-dde0-4c67-ac8c-96635e52e8f0" containerName="neutron-api" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.641388 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.669123 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9bq"] Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.734124 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqqv\" (UniqueName: \"kubernetes.io/projected/9f6c7c6e-e800-42d3-a665-0f75556fef49-kube-api-access-5wqqv\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.734550 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-utilities\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.734804 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-catalog-content\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.836755 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqqv\" (UniqueName: \"kubernetes.io/projected/9f6c7c6e-e800-42d3-a665-0f75556fef49-kube-api-access-5wqqv\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.836818 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-utilities\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.836882 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-catalog-content\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.837467 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-catalog-content\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.837704 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-utilities\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.858455 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqqv\" (UniqueName: \"kubernetes.io/projected/9f6c7c6e-e800-42d3-a665-0f75556fef49-kube-api-access-5wqqv\") pod \"redhat-marketplace-sw9bq\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:46 crc kubenswrapper[4917]: I0318 08:15:46.975553 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:47 crc kubenswrapper[4917]: I0318 08:15:47.470253 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9bq"] Mar 18 08:15:47 crc kubenswrapper[4917]: I0318 08:15:47.673338 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9bq" event={"ID":"9f6c7c6e-e800-42d3-a665-0f75556fef49","Type":"ContainerStarted","Data":"5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122"} Mar 18 08:15:47 crc kubenswrapper[4917]: I0318 08:15:47.673391 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9bq" event={"ID":"9f6c7c6e-e800-42d3-a665-0f75556fef49","Type":"ContainerStarted","Data":"a542d75ebf054d51a2e85a06759736b786081149889f6bf1165923c6b008a0b1"} Mar 18 08:15:48 crc kubenswrapper[4917]: I0318 08:15:48.689091 4917 generic.go:334] "Generic (PLEG): container finished" podID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerID="5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122" exitCode=0 Mar 18 08:15:48 crc kubenswrapper[4917]: I0318 08:15:48.689213 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9bq" event={"ID":"9f6c7c6e-e800-42d3-a665-0f75556fef49","Type":"ContainerDied","Data":"5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122"} Mar 18 08:15:48 crc kubenswrapper[4917]: I0318 08:15:48.693722 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:15:50 crc kubenswrapper[4917]: I0318 08:15:50.713968 4917 generic.go:334] "Generic (PLEG): container finished" podID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerID="45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb" exitCode=0 Mar 18 08:15:50 crc kubenswrapper[4917]: I0318 08:15:50.714619 4917 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-sw9bq" event={"ID":"9f6c7c6e-e800-42d3-a665-0f75556fef49","Type":"ContainerDied","Data":"45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb"} Mar 18 08:15:51 crc kubenswrapper[4917]: E0318 08:15:51.584468 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8602779_52ca_408c_974d_bd5723b5cb8f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53d77b9f_d04c_485f_ab9d_465b393ba56f.slice/crio-64188aac24ffcfcd6a0655bf12a824d7f7fbcaf5fa601b32bdd4719e5093f61c\": RecentStats: unable to find data in memory cache]" Mar 18 08:15:51 crc kubenswrapper[4917]: I0318 08:15:51.727935 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9bq" event={"ID":"9f6c7c6e-e800-42d3-a665-0f75556fef49","Type":"ContainerStarted","Data":"ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a"} Mar 18 08:15:51 crc kubenswrapper[4917]: I0318 08:15:51.761695 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sw9bq" podStartSLOduration=3.300373831 podStartE2EDuration="5.761668207s" podCreationTimestamp="2026-03-18 08:15:46 +0000 UTC" firstStartedPulling="2026-03-18 08:15:48.693241462 +0000 UTC m=+5333.634396216" lastFinishedPulling="2026-03-18 08:15:51.154535878 +0000 UTC m=+5336.095690592" observedRunningTime="2026-03-18 08:15:51.751176162 +0000 UTC m=+5336.692330876" watchObservedRunningTime="2026-03-18 08:15:51.761668207 +0000 UTC m=+5336.702822951" Mar 18 08:15:56 crc kubenswrapper[4917]: I0318 08:15:56.976120 4917 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:56 crc kubenswrapper[4917]: I0318 08:15:56.976960 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:57 crc kubenswrapper[4917]: I0318 08:15:57.059973 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:57 crc kubenswrapper[4917]: I0318 08:15:57.873695 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:15:57 crc kubenswrapper[4917]: I0318 08:15:57.935260 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9bq"] Mar 18 08:15:59 crc kubenswrapper[4917]: I0318 08:15:59.833258 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sw9bq" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerName="registry-server" containerID="cri-o://ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a" gracePeriod=2 Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.158384 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563696-xqzn4"] Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.160949 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563696-xqzn4" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.165845 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.165926 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.166905 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.171308 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563696-xqzn4"] Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.291773 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvl7\" (UniqueName: \"kubernetes.io/projected/e2133003-1d0c-4244-9516-e3e55cf9491f-kube-api-access-spvl7\") pod \"auto-csr-approver-29563696-xqzn4\" (UID: \"e2133003-1d0c-4244-9516-e3e55cf9491f\") " pod="openshift-infra/auto-csr-approver-29563696-xqzn4" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.315440 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.394265 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spvl7\" (UniqueName: \"kubernetes.io/projected/e2133003-1d0c-4244-9516-e3e55cf9491f-kube-api-access-spvl7\") pod \"auto-csr-approver-29563696-xqzn4\" (UID: \"e2133003-1d0c-4244-9516-e3e55cf9491f\") " pod="openshift-infra/auto-csr-approver-29563696-xqzn4" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.417805 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spvl7\" (UniqueName: \"kubernetes.io/projected/e2133003-1d0c-4244-9516-e3e55cf9491f-kube-api-access-spvl7\") pod \"auto-csr-approver-29563696-xqzn4\" (UID: \"e2133003-1d0c-4244-9516-e3e55cf9491f\") " pod="openshift-infra/auto-csr-approver-29563696-xqzn4" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.481405 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563696-xqzn4" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.495447 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-catalog-content\") pod \"9f6c7c6e-e800-42d3-a665-0f75556fef49\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.495605 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wqqv\" (UniqueName: \"kubernetes.io/projected/9f6c7c6e-e800-42d3-a665-0f75556fef49-kube-api-access-5wqqv\") pod \"9f6c7c6e-e800-42d3-a665-0f75556fef49\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.495635 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-utilities\") pod \"9f6c7c6e-e800-42d3-a665-0f75556fef49\" (UID: \"9f6c7c6e-e800-42d3-a665-0f75556fef49\") " Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.496739 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-utilities" (OuterVolumeSpecName: "utilities") pod "9f6c7c6e-e800-42d3-a665-0f75556fef49" (UID: "9f6c7c6e-e800-42d3-a665-0f75556fef49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.499722 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f6c7c6e-e800-42d3-a665-0f75556fef49-kube-api-access-5wqqv" (OuterVolumeSpecName: "kube-api-access-5wqqv") pod "9f6c7c6e-e800-42d3-a665-0f75556fef49" (UID: "9f6c7c6e-e800-42d3-a665-0f75556fef49"). InnerVolumeSpecName "kube-api-access-5wqqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.523254 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f6c7c6e-e800-42d3-a665-0f75556fef49" (UID: "9f6c7c6e-e800-42d3-a665-0f75556fef49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.598135 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.598174 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wqqv\" (UniqueName: \"kubernetes.io/projected/9f6c7c6e-e800-42d3-a665-0f75556fef49-kube-api-access-5wqqv\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.598186 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f6c7c6e-e800-42d3-a665-0f75556fef49-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.844018 4917 generic.go:334] "Generic (PLEG): container finished" podID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerID="ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a" exitCode=0 Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.844059 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw9bq" event={"ID":"9f6c7c6e-e800-42d3-a665-0f75556fef49","Type":"ContainerDied","Data":"ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a"} Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.844084 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-sw9bq" event={"ID":"9f6c7c6e-e800-42d3-a665-0f75556fef49","Type":"ContainerDied","Data":"a542d75ebf054d51a2e85a06759736b786081149889f6bf1165923c6b008a0b1"} Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.844100 4917 scope.go:117] "RemoveContainer" containerID="ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.844218 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw9bq" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.910216 4917 scope.go:117] "RemoveContainer" containerID="45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.928911 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9bq"] Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.938796 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw9bq"] Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.945981 4917 scope.go:117] "RemoveContainer" containerID="5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.950570 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563696-xqzn4"] Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.971384 4917 scope.go:117] "RemoveContainer" containerID="ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a" Mar 18 08:16:00 crc kubenswrapper[4917]: E0318 08:16:00.971875 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a\": container with ID starting with ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a not 
found: ID does not exist" containerID="ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.971919 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a"} err="failed to get container status \"ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a\": rpc error: code = NotFound desc = could not find container \"ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a\": container with ID starting with ca2a2fe9791c6752d6e2d3ca8a8135e79f036b2f9f97283e0389d6249406f46a not found: ID does not exist" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.971947 4917 scope.go:117] "RemoveContainer" containerID="45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb" Mar 18 08:16:00 crc kubenswrapper[4917]: E0318 08:16:00.972275 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb\": container with ID starting with 45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb not found: ID does not exist" containerID="45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.972309 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb"} err="failed to get container status \"45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb\": rpc error: code = NotFound desc = could not find container \"45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb\": container with ID starting with 45e395767197778f80f3d2294f1952829a127b62351b50d362a389adc67b85eb not found: ID does not exist" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.972335 
4917 scope.go:117] "RemoveContainer" containerID="5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122" Mar 18 08:16:00 crc kubenswrapper[4917]: E0318 08:16:00.972686 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122\": container with ID starting with 5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122 not found: ID does not exist" containerID="5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122" Mar 18 08:16:00 crc kubenswrapper[4917]: I0318 08:16:00.972714 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122"} err="failed to get container status \"5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122\": rpc error: code = NotFound desc = could not find container \"5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122\": container with ID starting with 5b73775154c49c54b07c04fd8b4b755594d4714aee92e0be1eda5fc4469ca122 not found: ID does not exist" Mar 18 08:16:01 crc kubenswrapper[4917]: I0318 08:16:01.787471 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" path="/var/lib/kubelet/pods/9f6c7c6e-e800-42d3-a665-0f75556fef49/volumes" Mar 18 08:16:01 crc kubenswrapper[4917]: I0318 08:16:01.859018 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563696-xqzn4" event={"ID":"e2133003-1d0c-4244-9516-e3e55cf9491f","Type":"ContainerStarted","Data":"39edf87097078a6c14be2c2c2f00ebcf16ddcebac517e73bbcaa0ea53f368eb2"} Mar 18 08:16:02 crc kubenswrapper[4917]: I0318 08:16:02.872311 4917 generic.go:334] "Generic (PLEG): container finished" podID="e2133003-1d0c-4244-9516-e3e55cf9491f" containerID="8c428e6693deab537b168147994caa690dd7a782e495fa5a41b0ab97962aea4b" 
exitCode=0 Mar 18 08:16:02 crc kubenswrapper[4917]: I0318 08:16:02.872429 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563696-xqzn4" event={"ID":"e2133003-1d0c-4244-9516-e3e55cf9491f","Type":"ContainerDied","Data":"8c428e6693deab537b168147994caa690dd7a782e495fa5a41b0ab97962aea4b"} Mar 18 08:16:02 crc kubenswrapper[4917]: I0318 08:16:02.929096 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:16:02 crc kubenswrapper[4917]: I0318 08:16:02.929180 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:16:04 crc kubenswrapper[4917]: I0318 08:16:04.281056 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563696-xqzn4" Mar 18 08:16:04 crc kubenswrapper[4917]: I0318 08:16:04.393492 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spvl7\" (UniqueName: \"kubernetes.io/projected/e2133003-1d0c-4244-9516-e3e55cf9491f-kube-api-access-spvl7\") pod \"e2133003-1d0c-4244-9516-e3e55cf9491f\" (UID: \"e2133003-1d0c-4244-9516-e3e55cf9491f\") " Mar 18 08:16:04 crc kubenswrapper[4917]: I0318 08:16:04.400774 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2133003-1d0c-4244-9516-e3e55cf9491f-kube-api-access-spvl7" (OuterVolumeSpecName: "kube-api-access-spvl7") pod "e2133003-1d0c-4244-9516-e3e55cf9491f" (UID: "e2133003-1d0c-4244-9516-e3e55cf9491f"). InnerVolumeSpecName "kube-api-access-spvl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:04 crc kubenswrapper[4917]: I0318 08:16:04.496366 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spvl7\" (UniqueName: \"kubernetes.io/projected/e2133003-1d0c-4244-9516-e3e55cf9491f-kube-api-access-spvl7\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:04 crc kubenswrapper[4917]: I0318 08:16:04.891984 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563696-xqzn4" event={"ID":"e2133003-1d0c-4244-9516-e3e55cf9491f","Type":"ContainerDied","Data":"39edf87097078a6c14be2c2c2f00ebcf16ddcebac517e73bbcaa0ea53f368eb2"} Mar 18 08:16:04 crc kubenswrapper[4917]: I0318 08:16:04.892380 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39edf87097078a6c14be2c2c2f00ebcf16ddcebac517e73bbcaa0ea53f368eb2" Mar 18 08:16:04 crc kubenswrapper[4917]: I0318 08:16:04.892106 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563696-xqzn4" Mar 18 08:16:05 crc kubenswrapper[4917]: I0318 08:16:05.379400 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563690-lfwvt"] Mar 18 08:16:05 crc kubenswrapper[4917]: I0318 08:16:05.388790 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563690-lfwvt"] Mar 18 08:16:05 crc kubenswrapper[4917]: I0318 08:16:05.789678 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f213ddc0-297e-4e41-98bb-92ce80a5f87e" path="/var/lib/kubelet/pods/f213ddc0-297e-4e41-98bb-92ce80a5f87e/volumes" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.891269 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-65679"] Mar 18 08:16:06 crc kubenswrapper[4917]: E0318 08:16:06.892032 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2133003-1d0c-4244-9516-e3e55cf9491f" containerName="oc" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.892050 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2133003-1d0c-4244-9516-e3e55cf9491f" containerName="oc" Mar 18 08:16:06 crc kubenswrapper[4917]: E0318 08:16:06.892073 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerName="extract-utilities" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.892089 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerName="extract-utilities" Mar 18 08:16:06 crc kubenswrapper[4917]: E0318 08:16:06.892106 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerName="registry-server" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.892112 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" 
containerName="registry-server" Mar 18 08:16:06 crc kubenswrapper[4917]: E0318 08:16:06.892154 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerName="extract-content" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.892160 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerName="extract-content" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.892487 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2133003-1d0c-4244-9516-e3e55cf9491f" containerName="oc" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.892518 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f6c7c6e-e800-42d3-a665-0f75556fef49" containerName="registry-server" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.895981 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.906248 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.906338 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-9gdqj" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.906576 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.906695 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.906870 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 18 08:16:06 crc kubenswrapper[4917]: I0318 08:16:06.919844 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-65679"] 
Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.045719 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-668f47984f-844s4"] Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.046569 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7067b7e-c682-4627-be7d-f86139040b78-etc-swift\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.046683 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-ring-data-devices\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.046704 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-combined-ca-bundle\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.046792 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-scripts\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.046808 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-swiftconf\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.046832 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t7z5\" (UniqueName: \"kubernetes.io/projected/d7067b7e-c682-4627-be7d-f86139040b78-kube-api-access-8t7z5\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.046860 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-dispersionconf\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.047375 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.054290 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668f47984f-844s4"] Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148199 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-sb\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148271 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-swiftconf\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148295 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-scripts\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148312 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-nb\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148340 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t7z5\" (UniqueName: 
\"kubernetes.io/projected/d7067b7e-c682-4627-be7d-f86139040b78-kube-api-access-8t7z5\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148365 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-config\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148387 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-dispersionconf\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148404 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-dns-svc\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148442 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7067b7e-c682-4627-be7d-f86139040b78-etc-swift\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148461 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d46mr\" (UniqueName: 
\"kubernetes.io/projected/681c0c83-35a8-495b-ab0e-b87d4543bb1c-kube-api-access-d46mr\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148480 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-ring-data-devices\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.148497 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-combined-ca-bundle\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.149375 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7067b7e-c682-4627-be7d-f86139040b78-etc-swift\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.149797 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-scripts\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.150093 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-ring-data-devices\") pod 
\"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.153412 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-swiftconf\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.160001 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-dispersionconf\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.162676 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-combined-ca-bundle\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.169076 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t7z5\" (UniqueName: \"kubernetes.io/projected/d7067b7e-c682-4627-be7d-f86139040b78-kube-api-access-8t7z5\") pod \"swift-ring-rebalance-65679\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.222228 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.252346 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-nb\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.252413 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-config\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.252449 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-dns-svc\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.252513 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d46mr\" (UniqueName: \"kubernetes.io/projected/681c0c83-35a8-495b-ab0e-b87d4543bb1c-kube-api-access-d46mr\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.252609 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-sb\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" 
Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.253341 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-nb\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.253383 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-config\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.253567 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-sb\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.253719 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-dns-svc\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.271193 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d46mr\" (UniqueName: \"kubernetes.io/projected/681c0c83-35a8-495b-ab0e-b87d4543bb1c-kube-api-access-d46mr\") pod \"dnsmasq-dns-668f47984f-844s4\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.364082 4917 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.699411 4917 scope.go:117] "RemoveContainer" containerID="b2b9ee93147aaef7294948f17ddf6b8ca090121f8e263aa9ced6ebaf2b0c2fa6" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.735927 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-65679"] Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.746288 4917 scope.go:117] "RemoveContainer" containerID="4784dd8b1b5bae5ddbbb1c54f5f48045e21a8b542035602e8b0fa07f0ff00001" Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.848331 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668f47984f-844s4"] Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.926806 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-65679" event={"ID":"d7067b7e-c682-4627-be7d-f86139040b78","Type":"ContainerStarted","Data":"37b55904473c83ff958415880499639a0a4464600fd0b8e1a8450699dee60072"} Mar 18 08:16:07 crc kubenswrapper[4917]: I0318 08:16:07.928280 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668f47984f-844s4" event={"ID":"681c0c83-35a8-495b-ab0e-b87d4543bb1c","Type":"ContainerStarted","Data":"06fd1032b730e34189d51ce4d2e6e5adbe2644d50a747d8524af92cf31329175"} Mar 18 08:16:08 crc kubenswrapper[4917]: I0318 08:16:08.944138 4917 generic.go:334] "Generic (PLEG): container finished" podID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" containerID="c5f570a90c3510c8cf370963eb8cd8506aa23020395c6d09363cb7c164daa56b" exitCode=0 Mar 18 08:16:08 crc kubenswrapper[4917]: I0318 08:16:08.944235 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668f47984f-844s4" event={"ID":"681c0c83-35a8-495b-ab0e-b87d4543bb1c","Type":"ContainerDied","Data":"c5f570a90c3510c8cf370963eb8cd8506aa23020395c6d09363cb7c164daa56b"} Mar 18 08:16:09 crc kubenswrapper[4917]: 
I0318 08:16:09.104870 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r6s44"] Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.115150 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.120020 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6s44"] Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.190513 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-utilities\") pod \"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.192680 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhpp2\" (UniqueName: \"kubernetes.io/projected/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-kube-api-access-bhpp2\") pod \"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.192732 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-catalog-content\") pod \"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.294936 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhpp2\" (UniqueName: \"kubernetes.io/projected/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-kube-api-access-bhpp2\") pod 
\"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.295039 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-catalog-content\") pod \"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.295189 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-utilities\") pod \"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.295449 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-catalog-content\") pod \"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.295540 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-utilities\") pod \"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.316719 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhpp2\" (UniqueName: \"kubernetes.io/projected/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-kube-api-access-bhpp2\") pod \"redhat-operators-r6s44\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " 
pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.487776 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.948505 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6f897f46d4-xp9h9"] Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.951259 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.954324 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.971857 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f897f46d4-xp9h9"] Mar 18 08:16:09 crc kubenswrapper[4917]: I0318 08:16:09.995905 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r6s44"] Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.016288 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgt6n\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-kube-api-access-mgt6n\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.016396 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-combined-ca-bundle\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.016434 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-config-data\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.016483 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-run-httpd\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.016505 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-etc-swift\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.016526 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-log-httpd\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.017144 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668f47984f-844s4" event={"ID":"681c0c83-35a8-495b-ab0e-b87d4543bb1c","Type":"ContainerStarted","Data":"dad7f5941aaf62da67618a4a0d5c9ede2035754f3b11a9701c153b35e6a39313"} Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.020722 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.041991 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-668f47984f-844s4" podStartSLOduration=4.041971614 podStartE2EDuration="4.041971614s" podCreationTimestamp="2026-03-18 08:16:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:16:10.036307647 +0000 UTC m=+5354.977462371" watchObservedRunningTime="2026-03-18 08:16:10.041971614 +0000 UTC m=+5354.983126328" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.118371 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-combined-ca-bundle\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.118435 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-config-data\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.118514 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-run-httpd\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.118537 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-etc-swift\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.118562 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-log-httpd\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.118626 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgt6n\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-kube-api-access-mgt6n\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.119624 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-run-httpd\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.120325 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-log-httpd\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.125729 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-combined-ca-bundle\") pod 
\"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.130388 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-config-data\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.138472 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgt6n\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-kube-api-access-mgt6n\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.138643 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-etc-swift\") pod \"swift-proxy-6f897f46d4-xp9h9\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:10 crc kubenswrapper[4917]: I0318 08:16:10.279463 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:12 crc kubenswrapper[4917]: W0318 08:16:12.039543 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bec30e9_d2e9_461c_a8cd_bc3c7962a557.slice/crio-d1a0b188f77a7541ee80182621abe37a3507d7b651f86b451befab323af8d7be WatchSource:0}: Error finding container d1a0b188f77a7541ee80182621abe37a3507d7b651f86b451befab323af8d7be: Status 404 returned error can't find the container with id d1a0b188f77a7541ee80182621abe37a3507d7b651f86b451befab323af8d7be Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.237662 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-747bdb4bf8-skf9g"] Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.239831 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.247129 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.248066 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.268370 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-747bdb4bf8-skf9g"] Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.363423 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b6e2-0173-4d18-a025-57faa95077af-log-httpd\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.363514 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-internal-tls-certs\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.363548 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg6mc\" (UniqueName: \"kubernetes.io/projected/7eb5b6e2-0173-4d18-a025-57faa95077af-kube-api-access-qg6mc\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.363651 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-combined-ca-bundle\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.363808 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-config-data\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.363870 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-public-tls-certs\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.363896 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b6e2-0173-4d18-a025-57faa95077af-run-httpd\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.363976 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7eb5b6e2-0173-4d18-a025-57faa95077af-etc-swift\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.465857 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7eb5b6e2-0173-4d18-a025-57faa95077af-etc-swift\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.465938 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b6e2-0173-4d18-a025-57faa95077af-log-httpd\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.466000 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-internal-tls-certs\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.466027 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qg6mc\" (UniqueName: \"kubernetes.io/projected/7eb5b6e2-0173-4d18-a025-57faa95077af-kube-api-access-qg6mc\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.466070 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-combined-ca-bundle\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.466142 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-config-data\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.466178 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-public-tls-certs\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.466199 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b6e2-0173-4d18-a025-57faa95077af-run-httpd\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.466748 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b6e2-0173-4d18-a025-57faa95077af-run-httpd\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.467594 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7eb5b6e2-0173-4d18-a025-57faa95077af-log-httpd\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.471100 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-config-data\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.471686 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7eb5b6e2-0173-4d18-a025-57faa95077af-etc-swift\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.473409 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-combined-ca-bundle\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.474299 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-internal-tls-certs\") pod 
\"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.475813 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7eb5b6e2-0173-4d18-a025-57faa95077af-public-tls-certs\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.484157 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg6mc\" (UniqueName: \"kubernetes.io/projected/7eb5b6e2-0173-4d18-a025-57faa95077af-kube-api-access-qg6mc\") pod \"swift-proxy-747bdb4bf8-skf9g\" (UID: \"7eb5b6e2-0173-4d18-a025-57faa95077af\") " pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.696102 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:12 crc kubenswrapper[4917]: I0318 08:16:12.744104 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f897f46d4-xp9h9"] Mar 18 08:16:13 crc kubenswrapper[4917]: I0318 08:16:13.041958 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f897f46d4-xp9h9" event={"ID":"bc0768c2-2820-4e59-bc91-badf5425c9a8","Type":"ContainerStarted","Data":"6a0c3f6eb803560e0d4a6753f8179ba30f59bbd7769374f9254a9aa2df8e3827"} Mar 18 08:16:13 crc kubenswrapper[4917]: I0318 08:16:13.042315 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f897f46d4-xp9h9" event={"ID":"bc0768c2-2820-4e59-bc91-badf5425c9a8","Type":"ContainerStarted","Data":"9c4884b02b3f38fccbf2b31f895c2db2688710ad2bb84ceb739d6f329ad6c909"} Mar 18 08:16:13 crc kubenswrapper[4917]: I0318 08:16:13.044194 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-65679" event={"ID":"d7067b7e-c682-4627-be7d-f86139040b78","Type":"ContainerStarted","Data":"1bde06bc2405c4e5c4c22f2790cfbc6fd887ea343b6f091e40a46f6d54c56cb1"} Mar 18 08:16:13 crc kubenswrapper[4917]: I0318 08:16:13.046705 4917 generic.go:334] "Generic (PLEG): container finished" podID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerID="a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de" exitCode=0 Mar 18 08:16:13 crc kubenswrapper[4917]: I0318 08:16:13.046780 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6s44" event={"ID":"2bec30e9-d2e9-461c-a8cd-bc3c7962a557","Type":"ContainerDied","Data":"a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de"} Mar 18 08:16:13 crc kubenswrapper[4917]: I0318 08:16:13.046823 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6s44" 
event={"ID":"2bec30e9-d2e9-461c-a8cd-bc3c7962a557","Type":"ContainerStarted","Data":"d1a0b188f77a7541ee80182621abe37a3507d7b651f86b451befab323af8d7be"} Mar 18 08:16:13 crc kubenswrapper[4917]: I0318 08:16:13.083298 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-65679" podStartSLOduration=2.737971424 podStartE2EDuration="7.083274361s" podCreationTimestamp="2026-03-18 08:16:06 +0000 UTC" firstStartedPulling="2026-03-18 08:16:07.755130288 +0000 UTC m=+5352.696285002" lastFinishedPulling="2026-03-18 08:16:12.100433225 +0000 UTC m=+5357.041587939" observedRunningTime="2026-03-18 08:16:13.070142023 +0000 UTC m=+5358.011296737" watchObservedRunningTime="2026-03-18 08:16:13.083274361 +0000 UTC m=+5358.024429075" Mar 18 08:16:13 crc kubenswrapper[4917]: W0318 08:16:13.310781 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7eb5b6e2_0173_4d18_a025_57faa95077af.slice/crio-8a4f550d4fd36c2a8c0750f5507ab32051ddf9173ee5787b8d80436b8ea166b2 WatchSource:0}: Error finding container 8a4f550d4fd36c2a8c0750f5507ab32051ddf9173ee5787b8d80436b8ea166b2: Status 404 returned error can't find the container with id 8a4f550d4fd36c2a8c0750f5507ab32051ddf9173ee5787b8d80436b8ea166b2 Mar 18 08:16:13 crc kubenswrapper[4917]: I0318 08:16:13.316216 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-747bdb4bf8-skf9g"] Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.059173 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f897f46d4-xp9h9" event={"ID":"bc0768c2-2820-4e59-bc91-badf5425c9a8","Type":"ContainerStarted","Data":"4f041ba39e945e1d53bca86bee1c4fd9ea043a08dfad2c16a454eaaecc71e2ae"} Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.059647 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:14 crc 
kubenswrapper[4917]: I0318 08:16:14.069743 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6s44" event={"ID":"2bec30e9-d2e9-461c-a8cd-bc3c7962a557","Type":"ContainerStarted","Data":"9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68"} Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.076798 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-747bdb4bf8-skf9g" event={"ID":"7eb5b6e2-0173-4d18-a025-57faa95077af","Type":"ContainerStarted","Data":"5392d75a12dbbee415d725f13f651ef093383ed66eab5dfc1bf8519b29a31d06"} Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.076860 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.076872 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-747bdb4bf8-skf9g" event={"ID":"7eb5b6e2-0173-4d18-a025-57faa95077af","Type":"ContainerStarted","Data":"1bd93d9d08e5d4bffc15263ccda2b6ec6c070ae5baf6dcf911c28643e74d3fae"} Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.076884 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.076892 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-747bdb4bf8-skf9g" event={"ID":"7eb5b6e2-0173-4d18-a025-57faa95077af","Type":"ContainerStarted","Data":"8a4f550d4fd36c2a8c0750f5507ab32051ddf9173ee5787b8d80436b8ea166b2"} Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.088369 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6f897f46d4-xp9h9" podStartSLOduration=5.088348856 podStartE2EDuration="5.088348856s" podCreationTimestamp="2026-03-18 08:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-18 08:16:14.087161947 +0000 UTC m=+5359.028316661" watchObservedRunningTime="2026-03-18 08:16:14.088348856 +0000 UTC m=+5359.029503590" Mar 18 08:16:14 crc kubenswrapper[4917]: I0318 08:16:14.132101 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-747bdb4bf8-skf9g" podStartSLOduration=2.132083686 podStartE2EDuration="2.132083686s" podCreationTimestamp="2026-03-18 08:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:16:14.126439599 +0000 UTC m=+5359.067594333" watchObservedRunningTime="2026-03-18 08:16:14.132083686 +0000 UTC m=+5359.073238400" Mar 18 08:16:15 crc kubenswrapper[4917]: I0318 08:16:15.086079 4917 generic.go:334] "Generic (PLEG): container finished" podID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerID="9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68" exitCode=0 Mar 18 08:16:15 crc kubenswrapper[4917]: I0318 08:16:15.086156 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6s44" event={"ID":"2bec30e9-d2e9-461c-a8cd-bc3c7962a557","Type":"ContainerDied","Data":"9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68"} Mar 18 08:16:15 crc kubenswrapper[4917]: I0318 08:16:15.089115 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:16 crc kubenswrapper[4917]: I0318 08:16:16.101698 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6s44" event={"ID":"2bec30e9-d2e9-461c-a8cd-bc3c7962a557","Type":"ContainerStarted","Data":"9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6"} Mar 18 08:16:16 crc kubenswrapper[4917]: I0318 08:16:16.147025 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r6s44" 
podStartSLOduration=4.636239486 podStartE2EDuration="7.146993191s" podCreationTimestamp="2026-03-18 08:16:09 +0000 UTC" firstStartedPulling="2026-03-18 08:16:13.059383692 +0000 UTC m=+5358.000538406" lastFinishedPulling="2026-03-18 08:16:15.570137397 +0000 UTC m=+5360.511292111" observedRunningTime="2026-03-18 08:16:16.132966511 +0000 UTC m=+5361.074121245" watchObservedRunningTime="2026-03-18 08:16:16.146993191 +0000 UTC m=+5361.088147945" Mar 18 08:16:17 crc kubenswrapper[4917]: I0318 08:16:17.120069 4917 generic.go:334] "Generic (PLEG): container finished" podID="d7067b7e-c682-4627-be7d-f86139040b78" containerID="1bde06bc2405c4e5c4c22f2790cfbc6fd887ea343b6f091e40a46f6d54c56cb1" exitCode=0 Mar 18 08:16:17 crc kubenswrapper[4917]: I0318 08:16:17.120175 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-65679" event={"ID":"d7067b7e-c682-4627-be7d-f86139040b78","Type":"ContainerDied","Data":"1bde06bc2405c4e5c4c22f2790cfbc6fd887ea343b6f091e40a46f6d54c56cb1"} Mar 18 08:16:17 crc kubenswrapper[4917]: I0318 08:16:17.366044 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:16:17 crc kubenswrapper[4917]: I0318 08:16:17.450900 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd658d79-dpjcx"] Mar 18 08:16:17 crc kubenswrapper[4917]: I0318 08:16:17.451264 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" podUID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" containerName="dnsmasq-dns" containerID="cri-o://1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de" gracePeriod=10 Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.008741 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.083688 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-sb\") pod \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.083767 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57h6l\" (UniqueName: \"kubernetes.io/projected/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-kube-api-access-57h6l\") pod \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.083816 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-nb\") pod \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.083832 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-config\") pod \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.083901 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-dns-svc\") pod \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\" (UID: \"69f20036-5cf5-4dc4-92bc-e819bebcd8c7\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.131882 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-kube-api-access-57h6l" (OuterVolumeSpecName: "kube-api-access-57h6l") pod "69f20036-5cf5-4dc4-92bc-e819bebcd8c7" (UID: "69f20036-5cf5-4dc4-92bc-e819bebcd8c7"). InnerVolumeSpecName "kube-api-access-57h6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.159818 4917 generic.go:334] "Generic (PLEG): container finished" podID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" containerID="1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de" exitCode=0 Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.160043 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.160676 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" event={"ID":"69f20036-5cf5-4dc4-92bc-e819bebcd8c7","Type":"ContainerDied","Data":"1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de"} Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.160722 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cd658d79-dpjcx" event={"ID":"69f20036-5cf5-4dc4-92bc-e819bebcd8c7","Type":"ContainerDied","Data":"f0a3c92dc1a10430d4737530a6eb0b2edf198cb05eb7694e8a7f6a748a4dc50b"} Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.160739 4917 scope.go:117] "RemoveContainer" containerID="1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.185252 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69f20036-5cf5-4dc4-92bc-e819bebcd8c7" (UID: "69f20036-5cf5-4dc4-92bc-e819bebcd8c7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.185324 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69f20036-5cf5-4dc4-92bc-e819bebcd8c7" (UID: "69f20036-5cf5-4dc4-92bc-e819bebcd8c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.185919 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69f20036-5cf5-4dc4-92bc-e819bebcd8c7" (UID: "69f20036-5cf5-4dc4-92bc-e819bebcd8c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.186939 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.186963 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.186975 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57h6l\" (UniqueName: \"kubernetes.io/projected/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-kube-api-access-57h6l\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.186983 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 
08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.206169 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-config" (OuterVolumeSpecName: "config") pod "69f20036-5cf5-4dc4-92bc-e819bebcd8c7" (UID: "69f20036-5cf5-4dc4-92bc-e819bebcd8c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.207986 4917 scope.go:117] "RemoveContainer" containerID="9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.229407 4917 scope.go:117] "RemoveContainer" containerID="1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de" Mar 18 08:16:18 crc kubenswrapper[4917]: E0318 08:16:18.229882 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de\": container with ID starting with 1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de not found: ID does not exist" containerID="1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.229932 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de"} err="failed to get container status \"1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de\": rpc error: code = NotFound desc = could not find container \"1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de\": container with ID starting with 1bbbbd4553691205bfcbf559357573067ed44658a0adfc89dbac93704403e6de not found: ID does not exist" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.229968 4917 scope.go:117] "RemoveContainer" 
containerID="9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93" Mar 18 08:16:18 crc kubenswrapper[4917]: E0318 08:16:18.230352 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93\": container with ID starting with 9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93 not found: ID does not exist" containerID="9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.230411 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93"} err="failed to get container status \"9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93\": rpc error: code = NotFound desc = could not find container \"9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93\": container with ID starting with 9c6eecaa0aa2b1b3c0b0904f5643bcb6605665bdd4d08f345d4afe14d8d2af93 not found: ID does not exist" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.288161 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69f20036-5cf5-4dc4-92bc-e819bebcd8c7-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.429802 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.492404 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-swiftconf\") pod \"d7067b7e-c682-4627-be7d-f86139040b78\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.492939 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7067b7e-c682-4627-be7d-f86139040b78-etc-swift\") pod \"d7067b7e-c682-4627-be7d-f86139040b78\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.492980 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-dispersionconf\") pod \"d7067b7e-c682-4627-be7d-f86139040b78\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.493087 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t7z5\" (UniqueName: \"kubernetes.io/projected/d7067b7e-c682-4627-be7d-f86139040b78-kube-api-access-8t7z5\") pod \"d7067b7e-c682-4627-be7d-f86139040b78\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.493203 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-combined-ca-bundle\") pod \"d7067b7e-c682-4627-be7d-f86139040b78\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.493284 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-scripts\") pod \"d7067b7e-c682-4627-be7d-f86139040b78\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.493342 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-ring-data-devices\") pod \"d7067b7e-c682-4627-be7d-f86139040b78\" (UID: \"d7067b7e-c682-4627-be7d-f86139040b78\") " Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.493834 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7067b7e-c682-4627-be7d-f86139040b78-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d7067b7e-c682-4627-be7d-f86139040b78" (UID: "d7067b7e-c682-4627-be7d-f86139040b78"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.494014 4917 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d7067b7e-c682-4627-be7d-f86139040b78-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.494429 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d7067b7e-c682-4627-be7d-f86139040b78" (UID: "d7067b7e-c682-4627-be7d-f86139040b78"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.497009 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7067b7e-c682-4627-be7d-f86139040b78-kube-api-access-8t7z5" (OuterVolumeSpecName: "kube-api-access-8t7z5") pod "d7067b7e-c682-4627-be7d-f86139040b78" (UID: "d7067b7e-c682-4627-be7d-f86139040b78"). InnerVolumeSpecName "kube-api-access-8t7z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.499673 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d7067b7e-c682-4627-be7d-f86139040b78" (UID: "d7067b7e-c682-4627-be7d-f86139040b78"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.501101 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cd658d79-dpjcx"] Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.508488 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cd658d79-dpjcx"] Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.520513 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d7067b7e-c682-4627-be7d-f86139040b78" (UID: "d7067b7e-c682-4627-be7d-f86139040b78"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.522417 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7067b7e-c682-4627-be7d-f86139040b78" (UID: "d7067b7e-c682-4627-be7d-f86139040b78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.528505 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-scripts" (OuterVolumeSpecName: "scripts") pod "d7067b7e-c682-4627-be7d-f86139040b78" (UID: "d7067b7e-c682-4627-be7d-f86139040b78"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.595474 4917 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.595522 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t7z5\" (UniqueName: \"kubernetes.io/projected/d7067b7e-c682-4627-be7d-f86139040b78-kube-api-access-8t7z5\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.595535 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.595550 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-scripts\") on node \"crc\" DevicePath \"\"" Mar 
18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.595561 4917 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d7067b7e-c682-4627-be7d-f86139040b78-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:18 crc kubenswrapper[4917]: I0318 08:16:18.595572 4917 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d7067b7e-c682-4627-be7d-f86139040b78-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:19 crc kubenswrapper[4917]: I0318 08:16:19.170128 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-65679" Mar 18 08:16:19 crc kubenswrapper[4917]: I0318 08:16:19.170126 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-65679" event={"ID":"d7067b7e-c682-4627-be7d-f86139040b78","Type":"ContainerDied","Data":"37b55904473c83ff958415880499639a0a4464600fd0b8e1a8450699dee60072"} Mar 18 08:16:19 crc kubenswrapper[4917]: I0318 08:16:19.170243 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37b55904473c83ff958415880499639a0a4464600fd0b8e1a8450699dee60072" Mar 18 08:16:19 crc kubenswrapper[4917]: I0318 08:16:19.488913 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:19 crc kubenswrapper[4917]: I0318 08:16:19.489293 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:19 crc kubenswrapper[4917]: I0318 08:16:19.787463 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" path="/var/lib/kubelet/pods/69f20036-5cf5-4dc4-92bc-e819bebcd8c7/volumes" Mar 18 08:16:20 crc kubenswrapper[4917]: I0318 08:16:20.282928 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:20 crc kubenswrapper[4917]: I0318 08:16:20.283721 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:20 crc kubenswrapper[4917]: I0318 08:16:20.560942 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r6s44" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="registry-server" probeResult="failure" output=< Mar 18 08:16:20 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:16:20 crc kubenswrapper[4917]: > Mar 18 08:16:22 crc kubenswrapper[4917]: I0318 08:16:22.702656 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:22 crc kubenswrapper[4917]: I0318 08:16:22.703628 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-747bdb4bf8-skf9g" Mar 18 08:16:22 crc kubenswrapper[4917]: I0318 08:16:22.823302 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6f897f46d4-xp9h9"] Mar 18 08:16:22 crc kubenswrapper[4917]: I0318 08:16:22.823631 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6f897f46d4-xp9h9" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerName="proxy-httpd" containerID="cri-o://6a0c3f6eb803560e0d4a6753f8179ba30f59bbd7769374f9254a9aa2df8e3827" gracePeriod=30 Mar 18 08:16:22 crc kubenswrapper[4917]: I0318 08:16:22.823813 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6f897f46d4-xp9h9" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerName="proxy-server" containerID="cri-o://4f041ba39e945e1d53bca86bee1c4fd9ea043a08dfad2c16a454eaaecc71e2ae" gracePeriod=30 Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.244630 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/swift-proxy-6f897f46d4-xp9h9" event={"ID":"bc0768c2-2820-4e59-bc91-badf5425c9a8","Type":"ContainerDied","Data":"4f041ba39e945e1d53bca86bee1c4fd9ea043a08dfad2c16a454eaaecc71e2ae"} Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.244563 4917 generic.go:334] "Generic (PLEG): container finished" podID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerID="4f041ba39e945e1d53bca86bee1c4fd9ea043a08dfad2c16a454eaaecc71e2ae" exitCode=0 Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.244719 4917 generic.go:334] "Generic (PLEG): container finished" podID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerID="6a0c3f6eb803560e0d4a6753f8179ba30f59bbd7769374f9254a9aa2df8e3827" exitCode=0 Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.244858 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f897f46d4-xp9h9" event={"ID":"bc0768c2-2820-4e59-bc91-badf5425c9a8","Type":"ContainerDied","Data":"6a0c3f6eb803560e0d4a6753f8179ba30f59bbd7769374f9254a9aa2df8e3827"} Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.676630 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.814421 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-config-data\") pod \"bc0768c2-2820-4e59-bc91-badf5425c9a8\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.814693 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-run-httpd\") pod \"bc0768c2-2820-4e59-bc91-badf5425c9a8\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.815300 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bc0768c2-2820-4e59-bc91-badf5425c9a8" (UID: "bc0768c2-2820-4e59-bc91-badf5425c9a8"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.815389 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-etc-swift\") pod \"bc0768c2-2820-4e59-bc91-badf5425c9a8\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.815761 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgt6n\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-kube-api-access-mgt6n\") pod \"bc0768c2-2820-4e59-bc91-badf5425c9a8\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.815808 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-combined-ca-bundle\") pod \"bc0768c2-2820-4e59-bc91-badf5425c9a8\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.815838 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-log-httpd\") pod \"bc0768c2-2820-4e59-bc91-badf5425c9a8\" (UID: \"bc0768c2-2820-4e59-bc91-badf5425c9a8\") " Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.816242 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.816452 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod 
"bc0768c2-2820-4e59-bc91-badf5425c9a8" (UID: "bc0768c2-2820-4e59-bc91-badf5425c9a8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.820727 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bc0768c2-2820-4e59-bc91-badf5425c9a8" (UID: "bc0768c2-2820-4e59-bc91-badf5425c9a8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.830799 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-kube-api-access-mgt6n" (OuterVolumeSpecName: "kube-api-access-mgt6n") pod "bc0768c2-2820-4e59-bc91-badf5425c9a8" (UID: "bc0768c2-2820-4e59-bc91-badf5425c9a8"). InnerVolumeSpecName "kube-api-access-mgt6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.871746 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc0768c2-2820-4e59-bc91-badf5425c9a8" (UID: "bc0768c2-2820-4e59-bc91-badf5425c9a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.875176 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-config-data" (OuterVolumeSpecName: "config-data") pod "bc0768c2-2820-4e59-bc91-badf5425c9a8" (UID: "bc0768c2-2820-4e59-bc91-badf5425c9a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.918084 4917 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.918118 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgt6n\" (UniqueName: \"kubernetes.io/projected/bc0768c2-2820-4e59-bc91-badf5425c9a8-kube-api-access-mgt6n\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.918132 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.918142 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc0768c2-2820-4e59-bc91-badf5425c9a8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:23 crc kubenswrapper[4917]: I0318 08:16:23.918153 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc0768c2-2820-4e59-bc91-badf5425c9a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:24 crc kubenswrapper[4917]: I0318 08:16:24.258960 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f897f46d4-xp9h9" event={"ID":"bc0768c2-2820-4e59-bc91-badf5425c9a8","Type":"ContainerDied","Data":"9c4884b02b3f38fccbf2b31f895c2db2688710ad2bb84ceb739d6f329ad6c909"} Mar 18 08:16:24 crc kubenswrapper[4917]: I0318 08:16:24.259044 4917 scope.go:117] "RemoveContainer" containerID="4f041ba39e945e1d53bca86bee1c4fd9ea043a08dfad2c16a454eaaecc71e2ae" Mar 18 08:16:24 crc kubenswrapper[4917]: I0318 08:16:24.259087 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6f897f46d4-xp9h9" Mar 18 08:16:24 crc kubenswrapper[4917]: I0318 08:16:24.296356 4917 scope.go:117] "RemoveContainer" containerID="6a0c3f6eb803560e0d4a6753f8179ba30f59bbd7769374f9254a9aa2df8e3827" Mar 18 08:16:24 crc kubenswrapper[4917]: I0318 08:16:24.311033 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6f897f46d4-xp9h9"] Mar 18 08:16:24 crc kubenswrapper[4917]: I0318 08:16:24.321576 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6f897f46d4-xp9h9"] Mar 18 08:16:25 crc kubenswrapper[4917]: I0318 08:16:25.792265 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" path="/var/lib/kubelet/pods/bc0768c2-2820-4e59-bc91-badf5425c9a8/volumes" Mar 18 08:16:29 crc kubenswrapper[4917]: I0318 08:16:29.560678 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:29 crc kubenswrapper[4917]: I0318 08:16:29.648888 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:29 crc kubenswrapper[4917]: I0318 08:16:29.810031 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6s44"] Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.338580 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r6s44" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="registry-server" containerID="cri-o://9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6" gracePeriod=2 Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.859457 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.888008 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhpp2\" (UniqueName: \"kubernetes.io/projected/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-kube-api-access-bhpp2\") pod \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.888319 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-catalog-content\") pod \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.888344 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-utilities\") pod \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\" (UID: \"2bec30e9-d2e9-461c-a8cd-bc3c7962a557\") " Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.890197 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-utilities" (OuterVolumeSpecName: "utilities") pod "2bec30e9-d2e9-461c-a8cd-bc3c7962a557" (UID: "2bec30e9-d2e9-461c-a8cd-bc3c7962a557"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.897005 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.905256 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-kube-api-access-bhpp2" (OuterVolumeSpecName: "kube-api-access-bhpp2") pod "2bec30e9-d2e9-461c-a8cd-bc3c7962a557" (UID: "2bec30e9-d2e9-461c-a8cd-bc3c7962a557"). InnerVolumeSpecName "kube-api-access-bhpp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:31 crc kubenswrapper[4917]: I0318 08:16:31.998183 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhpp2\" (UniqueName: \"kubernetes.io/projected/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-kube-api-access-bhpp2\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.043607 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2bec30e9-d2e9-461c-a8cd-bc3c7962a557" (UID: "2bec30e9-d2e9-461c-a8cd-bc3c7962a557"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.100110 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2bec30e9-d2e9-461c-a8cd-bc3c7962a557-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.347646 4917 generic.go:334] "Generic (PLEG): container finished" podID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerID="9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6" exitCode=0 Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.347685 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6s44" event={"ID":"2bec30e9-d2e9-461c-a8cd-bc3c7962a557","Type":"ContainerDied","Data":"9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6"} Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.347709 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r6s44" event={"ID":"2bec30e9-d2e9-461c-a8cd-bc3c7962a557","Type":"ContainerDied","Data":"d1a0b188f77a7541ee80182621abe37a3507d7b651f86b451befab323af8d7be"} Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.347725 4917 scope.go:117] "RemoveContainer" containerID="9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.347827 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r6s44" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.378736 4917 scope.go:117] "RemoveContainer" containerID="9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.383991 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r6s44"] Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.391632 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r6s44"] Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.406624 4917 scope.go:117] "RemoveContainer" containerID="a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.457050 4917 scope.go:117] "RemoveContainer" containerID="9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6" Mar 18 08:16:32 crc kubenswrapper[4917]: E0318 08:16:32.457512 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6\": container with ID starting with 9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6 not found: ID does not exist" containerID="9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.457564 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6"} err="failed to get container status \"9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6\": rpc error: code = NotFound desc = could not find container \"9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6\": container with ID starting with 9eb56b05271d7cd0c38015bd9d1fe56be45f3c5fb179966ed81ff35a813fb4b6 not found: ID does 
not exist" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.457624 4917 scope.go:117] "RemoveContainer" containerID="9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68" Mar 18 08:16:32 crc kubenswrapper[4917]: E0318 08:16:32.457892 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68\": container with ID starting with 9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68 not found: ID does not exist" containerID="9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.457909 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68"} err="failed to get container status \"9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68\": rpc error: code = NotFound desc = could not find container \"9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68\": container with ID starting with 9201903ab80b9a3f149a10b1e07caad83b89f12db2d06183e119916c70431b68 not found: ID does not exist" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.457942 4917 scope.go:117] "RemoveContainer" containerID="a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de" Mar 18 08:16:32 crc kubenswrapper[4917]: E0318 08:16:32.458178 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de\": container with ID starting with a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de not found: ID does not exist" containerID="a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.458215 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de"} err="failed to get container status \"a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de\": rpc error: code = NotFound desc = could not find container \"a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de\": container with ID starting with a170b1402ef0e34ddaa66e58bf940ef9bea7e3a1c908c5093611a497367ae5de not found: ID does not exist" Mar 18 08:16:32 crc kubenswrapper[4917]: E0318 08:16:32.507750 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bec30e9_d2e9_461c_a8cd_bc3c7962a557.slice\": RecentStats: unable to find data in memory cache]" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.928615 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.928984 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.929039 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.929769 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:16:32 crc kubenswrapper[4917]: I0318 08:16:32.929831 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" gracePeriod=600 Mar 18 08:16:33 crc kubenswrapper[4917]: E0318 08:16:33.075641 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:16:33 crc kubenswrapper[4917]: I0318 08:16:33.362302 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" exitCode=0 Mar 18 08:16:33 crc kubenswrapper[4917]: I0318 08:16:33.362371 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1"} Mar 18 08:16:33 crc kubenswrapper[4917]: I0318 08:16:33.362404 4917 scope.go:117] "RemoveContainer" containerID="4857c0f586b258498a2f24a9ee2cbd7165480a218adc673893556e562f68bc3b" Mar 18 08:16:33 crc kubenswrapper[4917]: I0318 08:16:33.363278 4917 
scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:16:33 crc kubenswrapper[4917]: E0318 08:16:33.363733 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:16:33 crc kubenswrapper[4917]: I0318 08:16:33.784180 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" path="/var/lib/kubelet/pods/2bec30e9-d2e9-461c-a8cd-bc3c7962a557/volumes" Mar 18 08:16:45 crc kubenswrapper[4917]: I0318 08:16:45.779378 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:16:45 crc kubenswrapper[4917]: E0318 08:16:45.780320 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.632870 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ea18-account-create-update-n8jpt"] Mar 18 08:16:55 crc kubenswrapper[4917]: E0318 08:16:55.634524 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="registry-server" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.634545 4917 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="registry-server" Mar 18 08:16:55 crc kubenswrapper[4917]: E0318 08:16:55.634599 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="extract-utilities" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.634608 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="extract-utilities" Mar 18 08:16:55 crc kubenswrapper[4917]: E0318 08:16:55.634635 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerName="proxy-httpd" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.634644 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerName="proxy-httpd" Mar 18 08:16:55 crc kubenswrapper[4917]: E0318 08:16:55.634666 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="extract-content" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.634675 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="extract-content" Mar 18 08:16:55 crc kubenswrapper[4917]: E0318 08:16:55.634706 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerName="proxy-server" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.634713 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerName="proxy-server" Mar 18 08:16:55 crc kubenswrapper[4917]: E0318 08:16:55.634731 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" containerName="init" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.634738 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" containerName="init" Mar 18 08:16:55 crc kubenswrapper[4917]: E0318 08:16:55.634756 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" containerName="dnsmasq-dns" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.634765 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" containerName="dnsmasq-dns" Mar 18 08:16:55 crc kubenswrapper[4917]: E0318 08:16:55.634791 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7067b7e-c682-4627-be7d-f86139040b78" containerName="swift-ring-rebalance" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.634800 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7067b7e-c682-4627-be7d-f86139040b78" containerName="swift-ring-rebalance" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.635188 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bec30e9-d2e9-461c-a8cd-bc3c7962a557" containerName="registry-server" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.635238 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7067b7e-c682-4627-be7d-f86139040b78" containerName="swift-ring-rebalance" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.635269 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="69f20036-5cf5-4dc4-92bc-e819bebcd8c7" containerName="dnsmasq-dns" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.635291 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerName="proxy-httpd" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.635310 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc0768c2-2820-4e59-bc91-badf5425c9a8" containerName="proxy-server" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.636265 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.643122 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.655765 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-786w2"] Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.658030 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-786w2" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.667398 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-786w2"] Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.679722 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ea18-account-create-update-n8jpt"] Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.764240 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1a26d5-418d-4dc5-b71c-7db92630eb74-operator-scripts\") pod \"cinder-ea18-account-create-update-n8jpt\" (UID: \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\") " pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.764290 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc6794f-aaa6-4c41-b7c1-668c0c840241-operator-scripts\") pod \"cinder-db-create-786w2\" (UID: \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\") " pod="openstack/cinder-db-create-786w2" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.764358 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlh27\" (UniqueName: 
\"kubernetes.io/projected/ad1a26d5-418d-4dc5-b71c-7db92630eb74-kube-api-access-dlh27\") pod \"cinder-ea18-account-create-update-n8jpt\" (UID: \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\") " pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.764420 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/0dc6794f-aaa6-4c41-b7c1-668c0c840241-kube-api-access-t2llq\") pod \"cinder-db-create-786w2\" (UID: \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\") " pod="openstack/cinder-db-create-786w2" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.865946 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1a26d5-418d-4dc5-b71c-7db92630eb74-operator-scripts\") pod \"cinder-ea18-account-create-update-n8jpt\" (UID: \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\") " pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.865988 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc6794f-aaa6-4c41-b7c1-668c0c840241-operator-scripts\") pod \"cinder-db-create-786w2\" (UID: \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\") " pod="openstack/cinder-db-create-786w2" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.866028 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlh27\" (UniqueName: \"kubernetes.io/projected/ad1a26d5-418d-4dc5-b71c-7db92630eb74-kube-api-access-dlh27\") pod \"cinder-ea18-account-create-update-n8jpt\" (UID: \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\") " pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.866081 4917 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/0dc6794f-aaa6-4c41-b7c1-668c0c840241-kube-api-access-t2llq\") pod \"cinder-db-create-786w2\" (UID: \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\") " pod="openstack/cinder-db-create-786w2" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.866744 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc6794f-aaa6-4c41-b7c1-668c0c840241-operator-scripts\") pod \"cinder-db-create-786w2\" (UID: \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\") " pod="openstack/cinder-db-create-786w2" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.866838 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1a26d5-418d-4dc5-b71c-7db92630eb74-operator-scripts\") pod \"cinder-ea18-account-create-update-n8jpt\" (UID: \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\") " pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.888997 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/0dc6794f-aaa6-4c41-b7c1-668c0c840241-kube-api-access-t2llq\") pod \"cinder-db-create-786w2\" (UID: \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\") " pod="openstack/cinder-db-create-786w2" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.889302 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlh27\" (UniqueName: \"kubernetes.io/projected/ad1a26d5-418d-4dc5-b71c-7db92630eb74-kube-api-access-dlh27\") pod \"cinder-ea18-account-create-update-n8jpt\" (UID: \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\") " pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.969050 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:55 crc kubenswrapper[4917]: I0318 08:16:55.980983 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-786w2" Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.428445 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ea18-account-create-update-n8jpt"] Mar 18 08:16:56 crc kubenswrapper[4917]: W0318 08:16:56.434503 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad1a26d5_418d_4dc5_b71c_7db92630eb74.slice/crio-f40b1cbad18ecc61c56567e15f052388b92fefb0bbc4f8f7a24efe721281698c WatchSource:0}: Error finding container f40b1cbad18ecc61c56567e15f052388b92fefb0bbc4f8f7a24efe721281698c: Status 404 returned error can't find the container with id f40b1cbad18ecc61c56567e15f052388b92fefb0bbc4f8f7a24efe721281698c Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.439777 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.443015 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-786w2"] Mar 18 08:16:56 crc kubenswrapper[4917]: W0318 08:16:56.446696 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0dc6794f_aaa6_4c41_b7c1_668c0c840241.slice/crio-9ac11a8e4eb97a56a743a42ba92696a94c046da4cf2aec9fea4b573324c09c23 WatchSource:0}: Error finding container 9ac11a8e4eb97a56a743a42ba92696a94c046da4cf2aec9fea4b573324c09c23: Status 404 returned error can't find the container with id 9ac11a8e4eb97a56a743a42ba92696a94c046da4cf2aec9fea4b573324c09c23 Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.654737 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-ea18-account-create-update-n8jpt" event={"ID":"ad1a26d5-418d-4dc5-b71c-7db92630eb74","Type":"ContainerStarted","Data":"ed0dcf577a24c5b067e176deab67828293fb9859ba45d8f2690026c9ef0bcd59"} Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.654779 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea18-account-create-update-n8jpt" event={"ID":"ad1a26d5-418d-4dc5-b71c-7db92630eb74","Type":"ContainerStarted","Data":"f40b1cbad18ecc61c56567e15f052388b92fefb0bbc4f8f7a24efe721281698c"} Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.659688 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-786w2" event={"ID":"0dc6794f-aaa6-4c41-b7c1-668c0c840241","Type":"ContainerStarted","Data":"88bfe776a1fd558423ef36cb38306302c772f10688a8c666d523e33c02f30d3c"} Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.659731 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-786w2" event={"ID":"0dc6794f-aaa6-4c41-b7c1-668c0c840241","Type":"ContainerStarted","Data":"9ac11a8e4eb97a56a743a42ba92696a94c046da4cf2aec9fea4b573324c09c23"} Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.675447 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-ea18-account-create-update-n8jpt" podStartSLOduration=1.675424541 podStartE2EDuration="1.675424541s" podCreationTimestamp="2026-03-18 08:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:16:56.669984139 +0000 UTC m=+5401.611138893" watchObservedRunningTime="2026-03-18 08:16:56.675424541 +0000 UTC m=+5401.616579275" Mar 18 08:16:56 crc kubenswrapper[4917]: I0318 08:16:56.687251 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-786w2" podStartSLOduration=1.687227398 podStartE2EDuration="1.687227398s" 
podCreationTimestamp="2026-03-18 08:16:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:16:56.683780334 +0000 UTC m=+5401.624935048" watchObservedRunningTime="2026-03-18 08:16:56.687227398 +0000 UTC m=+5401.628382152" Mar 18 08:16:57 crc kubenswrapper[4917]: I0318 08:16:57.673626 4917 generic.go:334] "Generic (PLEG): container finished" podID="ad1a26d5-418d-4dc5-b71c-7db92630eb74" containerID="ed0dcf577a24c5b067e176deab67828293fb9859ba45d8f2690026c9ef0bcd59" exitCode=0 Mar 18 08:16:57 crc kubenswrapper[4917]: I0318 08:16:57.673800 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea18-account-create-update-n8jpt" event={"ID":"ad1a26d5-418d-4dc5-b71c-7db92630eb74","Type":"ContainerDied","Data":"ed0dcf577a24c5b067e176deab67828293fb9859ba45d8f2690026c9ef0bcd59"} Mar 18 08:16:57 crc kubenswrapper[4917]: I0318 08:16:57.676258 4917 generic.go:334] "Generic (PLEG): container finished" podID="0dc6794f-aaa6-4c41-b7c1-668c0c840241" containerID="88bfe776a1fd558423ef36cb38306302c772f10688a8c666d523e33c02f30d3c" exitCode=0 Mar 18 08:16:57 crc kubenswrapper[4917]: I0318 08:16:57.676403 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-786w2" event={"ID":"0dc6794f-aaa6-4c41-b7c1-668c0c840241","Type":"ContainerDied","Data":"88bfe776a1fd558423ef36cb38306302c772f10688a8c666d523e33c02f30d3c"} Mar 18 08:16:57 crc kubenswrapper[4917]: I0318 08:16:57.773470 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:16:57 crc kubenswrapper[4917]: E0318 08:16:57.773939 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.091008 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.102636 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-786w2" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.247924 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc6794f-aaa6-4c41-b7c1-668c0c840241-operator-scripts\") pod \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\" (UID: \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\") " Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.248033 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/0dc6794f-aaa6-4c41-b7c1-668c0c840241-kube-api-access-t2llq\") pod \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\" (UID: \"0dc6794f-aaa6-4c41-b7c1-668c0c840241\") " Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.248068 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlh27\" (UniqueName: \"kubernetes.io/projected/ad1a26d5-418d-4dc5-b71c-7db92630eb74-kube-api-access-dlh27\") pod \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\" (UID: \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\") " Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.248173 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1a26d5-418d-4dc5-b71c-7db92630eb74-operator-scripts\") pod 
\"ad1a26d5-418d-4dc5-b71c-7db92630eb74\" (UID: \"ad1a26d5-418d-4dc5-b71c-7db92630eb74\") " Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.248988 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0dc6794f-aaa6-4c41-b7c1-668c0c840241-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0dc6794f-aaa6-4c41-b7c1-668c0c840241" (UID: "0dc6794f-aaa6-4c41-b7c1-668c0c840241"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.249721 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad1a26d5-418d-4dc5-b71c-7db92630eb74-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad1a26d5-418d-4dc5-b71c-7db92630eb74" (UID: "ad1a26d5-418d-4dc5-b71c-7db92630eb74"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.256836 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1a26d5-418d-4dc5-b71c-7db92630eb74-kube-api-access-dlh27" (OuterVolumeSpecName: "kube-api-access-dlh27") pod "ad1a26d5-418d-4dc5-b71c-7db92630eb74" (UID: "ad1a26d5-418d-4dc5-b71c-7db92630eb74"). InnerVolumeSpecName "kube-api-access-dlh27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.260814 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dc6794f-aaa6-4c41-b7c1-668c0c840241-kube-api-access-t2llq" (OuterVolumeSpecName: "kube-api-access-t2llq") pod "0dc6794f-aaa6-4c41-b7c1-668c0c840241" (UID: "0dc6794f-aaa6-4c41-b7c1-668c0c840241"). InnerVolumeSpecName "kube-api-access-t2llq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.350811 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0dc6794f-aaa6-4c41-b7c1-668c0c840241-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.350860 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2llq\" (UniqueName: \"kubernetes.io/projected/0dc6794f-aaa6-4c41-b7c1-668c0c840241-kube-api-access-t2llq\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.350886 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlh27\" (UniqueName: \"kubernetes.io/projected/ad1a26d5-418d-4dc5-b71c-7db92630eb74-kube-api-access-dlh27\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.350905 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad1a26d5-418d-4dc5-b71c-7db92630eb74-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.714671 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ea18-account-create-update-n8jpt" event={"ID":"ad1a26d5-418d-4dc5-b71c-7db92630eb74","Type":"ContainerDied","Data":"f40b1cbad18ecc61c56567e15f052388b92fefb0bbc4f8f7a24efe721281698c"} Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.714737 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f40b1cbad18ecc61c56567e15f052388b92fefb0bbc4f8f7a24efe721281698c" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.714862 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ea18-account-create-update-n8jpt" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.721978 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-786w2" event={"ID":"0dc6794f-aaa6-4c41-b7c1-668c0c840241","Type":"ContainerDied","Data":"9ac11a8e4eb97a56a743a42ba92696a94c046da4cf2aec9fea4b573324c09c23"} Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.722015 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac11a8e4eb97a56a743a42ba92696a94c046da4cf2aec9fea4b573324c09c23" Mar 18 08:16:59 crc kubenswrapper[4917]: I0318 08:16:59.722065 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-786w2" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.979489 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-wbgr8"] Mar 18 08:17:00 crc kubenswrapper[4917]: E0318 08:17:00.982102 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1a26d5-418d-4dc5-b71c-7db92630eb74" containerName="mariadb-account-create-update" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.982135 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1a26d5-418d-4dc5-b71c-7db92630eb74" containerName="mariadb-account-create-update" Mar 18 08:17:00 crc kubenswrapper[4917]: E0318 08:17:00.982169 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dc6794f-aaa6-4c41-b7c1-668c0c840241" containerName="mariadb-database-create" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.982178 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dc6794f-aaa6-4c41-b7c1-668c0c840241" containerName="mariadb-database-create" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.982366 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1a26d5-418d-4dc5-b71c-7db92630eb74" containerName="mariadb-account-create-update" 
Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.982402 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dc6794f-aaa6-4c41-b7c1-668c0c840241" containerName="mariadb-database-create" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.983605 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.987655 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.987803 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xnvn9" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.987930 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 08:17:00 crc kubenswrapper[4917]: I0318 08:17:00.995963 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wbgr8"] Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.086116 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgts2\" (UniqueName: \"kubernetes.io/projected/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-kube-api-access-xgts2\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.086179 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-etc-machine-id\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.086312 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-config-data\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.086381 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-scripts\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.087186 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-combined-ca-bundle\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.087260 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-db-sync-config-data\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.189060 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-combined-ca-bundle\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.189150 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-db-sync-config-data\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.189248 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgts2\" (UniqueName: \"kubernetes.io/projected/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-kube-api-access-xgts2\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.189288 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-etc-machine-id\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.189312 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-config-data\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.189354 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-scripts\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.189640 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-etc-machine-id\") pod \"cinder-db-sync-wbgr8\" (UID: 
\"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.195688 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-combined-ca-bundle\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.196929 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-db-sync-config-data\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.197869 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-config-data\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.201685 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-scripts\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.209683 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgts2\" (UniqueName: \"kubernetes.io/projected/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-kube-api-access-xgts2\") pod \"cinder-db-sync-wbgr8\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.309397 4917 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.577107 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-wbgr8"] Mar 18 08:17:01 crc kubenswrapper[4917]: I0318 08:17:01.740678 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wbgr8" event={"ID":"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5","Type":"ContainerStarted","Data":"4b21ab38d71b3635114897588a506f0e6caeeb33bd31376e19fb687e831de6f1"} Mar 18 08:17:07 crc kubenswrapper[4917]: I0318 08:17:07.869566 4917 scope.go:117] "RemoveContainer" containerID="2e2c7ee69e1792ddb81da77d1ab7f978e0fa504fc36a5afe536ecba60f615e3c" Mar 18 08:17:08 crc kubenswrapper[4917]: I0318 08:17:08.773319 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:17:08 crc kubenswrapper[4917]: E0318 08:17:08.773992 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:17:19 crc kubenswrapper[4917]: I0318 08:17:19.772842 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:17:19 crc kubenswrapper[4917]: E0318 08:17:19.774034 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:17:21 crc kubenswrapper[4917]: I0318 08:17:21.922033 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wbgr8" event={"ID":"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5","Type":"ContainerStarted","Data":"ed10f7eb8c55d5d0c5006599edf62280def30b7222e40e9b7ebd740498928072"} Mar 18 08:17:21 crc kubenswrapper[4917]: I0318 08:17:21.945786 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-wbgr8" podStartSLOduration=2.5196524289999997 podStartE2EDuration="21.945765607s" podCreationTimestamp="2026-03-18 08:17:00 +0000 UTC" firstStartedPulling="2026-03-18 08:17:01.590691899 +0000 UTC m=+5406.531846613" lastFinishedPulling="2026-03-18 08:17:21.016805077 +0000 UTC m=+5425.957959791" observedRunningTime="2026-03-18 08:17:21.936045452 +0000 UTC m=+5426.877200166" watchObservedRunningTime="2026-03-18 08:17:21.945765607 +0000 UTC m=+5426.886920341" Mar 18 08:17:23 crc kubenswrapper[4917]: I0318 08:17:23.943969 4917 generic.go:334] "Generic (PLEG): container finished" podID="8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" containerID="ed10f7eb8c55d5d0c5006599edf62280def30b7222e40e9b7ebd740498928072" exitCode=0 Mar 18 08:17:23 crc kubenswrapper[4917]: I0318 08:17:23.944106 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wbgr8" event={"ID":"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5","Type":"ContainerDied","Data":"ed10f7eb8c55d5d0c5006599edf62280def30b7222e40e9b7ebd740498928072"} Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.343082 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.455453 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-db-sync-config-data\") pod \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.455750 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-scripts\") pod \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.455854 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-config-data\") pod \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.455897 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-etc-machine-id\") pod \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.455975 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-combined-ca-bundle\") pod \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.456077 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgts2\" 
(UniqueName: \"kubernetes.io/projected/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-kube-api-access-xgts2\") pod \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\" (UID: \"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5\") " Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.456105 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" (UID: "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.456644 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.461119 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-kube-api-access-xgts2" (OuterVolumeSpecName: "kube-api-access-xgts2") pod "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" (UID: "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5"). InnerVolumeSpecName "kube-api-access-xgts2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.461972 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" (UID: "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.463760 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-scripts" (OuterVolumeSpecName: "scripts") pod "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" (UID: "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.484891 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" (UID: "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.517675 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-config-data" (OuterVolumeSpecName: "config-data") pod "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" (UID: "8a76d7db-f61e-45b4-b91d-9f81fb1e20b5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.558203 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.558545 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.558562 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.558595 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgts2\" (UniqueName: \"kubernetes.io/projected/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-kube-api-access-xgts2\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.558610 4917 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.982452 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-wbgr8" event={"ID":"8a76d7db-f61e-45b4-b91d-9f81fb1e20b5","Type":"ContainerDied","Data":"4b21ab38d71b3635114897588a506f0e6caeeb33bd31376e19fb687e831de6f1"} Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.982498 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b21ab38d71b3635114897588a506f0e6caeeb33bd31376e19fb687e831de6f1" Mar 18 08:17:25 crc kubenswrapper[4917]: I0318 08:17:25.982524 4917 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-wbgr8" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.418535 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb7f96c85-6pxj5"] Mar 18 08:17:26 crc kubenswrapper[4917]: E0318 08:17:26.421889 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" containerName="cinder-db-sync" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.421943 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" containerName="cinder-db-sync" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.422459 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" containerName="cinder-db-sync" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.423508 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.429982 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb7f96c85-6pxj5"] Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.518838 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.520126 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.523394 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.523767 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xnvn9" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.524015 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.524354 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.537555 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.579950 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.580021 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.580097 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-dns-svc\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: 
\"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.580146 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9xwq\" (UniqueName: \"kubernetes.io/projected/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-kube-api-access-s9xwq\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.580255 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-config\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.681354 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/274252b4-4327-4222-8c3a-f491643f6d94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.681397 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf5c7\" (UniqueName: \"kubernetes.io/projected/274252b4-4327-4222-8c3a-f491643f6d94-kube-api-access-kf5c7\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.681434 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-config\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " 
pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.681463 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data-custom\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.681489 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.681512 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.681660 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-scripts\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.682266 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-config\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.682560 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.683516 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-sb\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.683561 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.684188 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-nb\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.684277 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/274252b4-4327-4222-8c3a-f491643f6d94-logs\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.684342 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-dns-svc\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.684401 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9xwq\" (UniqueName: \"kubernetes.io/projected/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-kube-api-access-s9xwq\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.684903 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-dns-svc\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.707559 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9xwq\" (UniqueName: \"kubernetes.io/projected/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-kube-api-access-s9xwq\") pod \"dnsmasq-dns-cb7f96c85-6pxj5\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.743344 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.786236 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/274252b4-4327-4222-8c3a-f491643f6d94-logs\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.786344 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/274252b4-4327-4222-8c3a-f491643f6d94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.786384 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf5c7\" (UniqueName: \"kubernetes.io/projected/274252b4-4327-4222-8c3a-f491643f6d94-kube-api-access-kf5c7\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.786416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data-custom\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.786455 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.786477 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.786500 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/274252b4-4327-4222-8c3a-f491643f6d94-etc-machine-id\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.786602 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-scripts\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.788460 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/274252b4-4327-4222-8c3a-f491643f6d94-logs\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.791565 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.794078 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-scripts\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.794358 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data-custom\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.803147 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.809319 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf5c7\" (UniqueName: \"kubernetes.io/projected/274252b4-4327-4222-8c3a-f491643f6d94-kube-api-access-kf5c7\") pod \"cinder-api-0\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " pod="openstack/cinder-api-0" Mar 18 08:17:26 crc kubenswrapper[4917]: I0318 08:17:26.838350 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:17:27 crc kubenswrapper[4917]: I0318 08:17:27.219811 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb7f96c85-6pxj5"] Mar 18 08:17:27 crc kubenswrapper[4917]: I0318 08:17:27.336845 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:27 crc kubenswrapper[4917]: W0318 08:17:27.337732 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod274252b4_4327_4222_8c3a_f491643f6d94.slice/crio-12b3b50ace7578175b2588fc55904445f5822fa000ac4b5b12cb57f121a8cf22 WatchSource:0}: Error finding container 12b3b50ace7578175b2588fc55904445f5822fa000ac4b5b12cb57f121a8cf22: Status 404 returned error can't find the container with id 12b3b50ace7578175b2588fc55904445f5822fa000ac4b5b12cb57f121a8cf22 Mar 18 08:17:28 crc kubenswrapper[4917]: I0318 08:17:28.002900 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"274252b4-4327-4222-8c3a-f491643f6d94","Type":"ContainerStarted","Data":"12b3b50ace7578175b2588fc55904445f5822fa000ac4b5b12cb57f121a8cf22"} Mar 18 08:17:28 crc kubenswrapper[4917]: I0318 08:17:28.005752 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" event={"ID":"c0ea0608-3bbb-463a-99ac-d54f4736fc1b","Type":"ContainerStarted","Data":"1b4719fd7d503defdd18095300bfbbe03c2bf8be5a35652d55bf1ab7af9a129d"} Mar 18 08:17:29 crc kubenswrapper[4917]: I0318 08:17:29.019672 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"274252b4-4327-4222-8c3a-f491643f6d94","Type":"ContainerStarted","Data":"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6"} Mar 18 08:17:29 crc kubenswrapper[4917]: I0318 08:17:29.023053 4917 generic.go:334] "Generic (PLEG): container finished" podID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" 
containerID="aa9083ff473a50ffe0898142f884b794e03493a6d7f5861037d96c74decb51ad" exitCode=0 Mar 18 08:17:29 crc kubenswrapper[4917]: I0318 08:17:29.023153 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" event={"ID":"c0ea0608-3bbb-463a-99ac-d54f4736fc1b","Type":"ContainerDied","Data":"aa9083ff473a50ffe0898142f884b794e03493a6d7f5861037d96c74decb51ad"} Mar 18 08:17:29 crc kubenswrapper[4917]: I0318 08:17:29.096165 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.032743 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" event={"ID":"c0ea0608-3bbb-463a-99ac-d54f4736fc1b","Type":"ContainerStarted","Data":"19790dc3b98925c739c07c127221b9c4e5a900f0e816310447cbc3162481e2b1"} Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.033993 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.035338 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"274252b4-4327-4222-8c3a-f491643f6d94","Type":"ContainerStarted","Data":"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7"} Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.035433 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="274252b4-4327-4222-8c3a-f491643f6d94" containerName="cinder-api-log" containerID="cri-o://5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6" gracePeriod=30 Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.035639 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.035673 4917 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/cinder-api-0" podUID="274252b4-4327-4222-8c3a-f491643f6d94" containerName="cinder-api" containerID="cri-o://119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7" gracePeriod=30 Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.100062 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.100044207 podStartE2EDuration="4.100044207s" podCreationTimestamp="2026-03-18 08:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:17:30.094203105 +0000 UTC m=+5435.035357819" watchObservedRunningTime="2026-03-18 08:17:30.100044207 +0000 UTC m=+5435.041198921" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.104088 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" podStartSLOduration=4.104066444 podStartE2EDuration="4.104066444s" podCreationTimestamp="2026-03-18 08:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:17:30.065104029 +0000 UTC m=+5435.006258743" watchObservedRunningTime="2026-03-18 08:17:30.104066444 +0000 UTC m=+5435.045221158" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.629294 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.768003 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data-custom\") pod \"274252b4-4327-4222-8c3a-f491643f6d94\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.768071 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data\") pod \"274252b4-4327-4222-8c3a-f491643f6d94\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.768198 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/274252b4-4327-4222-8c3a-f491643f6d94-etc-machine-id\") pod \"274252b4-4327-4222-8c3a-f491643f6d94\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.768274 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf5c7\" (UniqueName: \"kubernetes.io/projected/274252b4-4327-4222-8c3a-f491643f6d94-kube-api-access-kf5c7\") pod \"274252b4-4327-4222-8c3a-f491643f6d94\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.768298 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/274252b4-4327-4222-8c3a-f491643f6d94-logs\") pod \"274252b4-4327-4222-8c3a-f491643f6d94\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.768315 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-combined-ca-bundle\") pod \"274252b4-4327-4222-8c3a-f491643f6d94\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.768363 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-scripts\") pod \"274252b4-4327-4222-8c3a-f491643f6d94\" (UID: \"274252b4-4327-4222-8c3a-f491643f6d94\") " Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.769031 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/274252b4-4327-4222-8c3a-f491643f6d94-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "274252b4-4327-4222-8c3a-f491643f6d94" (UID: "274252b4-4327-4222-8c3a-f491643f6d94"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.769914 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/274252b4-4327-4222-8c3a-f491643f6d94-logs" (OuterVolumeSpecName: "logs") pod "274252b4-4327-4222-8c3a-f491643f6d94" (UID: "274252b4-4327-4222-8c3a-f491643f6d94"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.775679 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274252b4-4327-4222-8c3a-f491643f6d94-kube-api-access-kf5c7" (OuterVolumeSpecName: "kube-api-access-kf5c7") pod "274252b4-4327-4222-8c3a-f491643f6d94" (UID: "274252b4-4327-4222-8c3a-f491643f6d94"). InnerVolumeSpecName "kube-api-access-kf5c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.777784 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-scripts" (OuterVolumeSpecName: "scripts") pod "274252b4-4327-4222-8c3a-f491643f6d94" (UID: "274252b4-4327-4222-8c3a-f491643f6d94"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.796893 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "274252b4-4327-4222-8c3a-f491643f6d94" (UID: "274252b4-4327-4222-8c3a-f491643f6d94"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.798018 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "274252b4-4327-4222-8c3a-f491643f6d94" (UID: "274252b4-4327-4222-8c3a-f491643f6d94"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.823560 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data" (OuterVolumeSpecName: "config-data") pod "274252b4-4327-4222-8c3a-f491643f6d94" (UID: "274252b4-4327-4222-8c3a-f491643f6d94"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.870533 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/274252b4-4327-4222-8c3a-f491643f6d94-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.870849 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kf5c7\" (UniqueName: \"kubernetes.io/projected/274252b4-4327-4222-8c3a-f491643f6d94-kube-api-access-kf5c7\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.870945 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.871149 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.871245 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.871309 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/274252b4-4327-4222-8c3a-f491643f6d94-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:30 crc kubenswrapper[4917]: I0318 08:17:30.871392 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/274252b4-4327-4222-8c3a-f491643f6d94-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.049289 4917 generic.go:334] "Generic 
(PLEG): container finished" podID="274252b4-4327-4222-8c3a-f491643f6d94" containerID="119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7" exitCode=0 Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.049641 4917 generic.go:334] "Generic (PLEG): container finished" podID="274252b4-4327-4222-8c3a-f491643f6d94" containerID="5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6" exitCode=143 Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.049364 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"274252b4-4327-4222-8c3a-f491643f6d94","Type":"ContainerDied","Data":"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7"} Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.049393 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.049717 4917 scope.go:117] "RemoveContainer" containerID="119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.049701 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"274252b4-4327-4222-8c3a-f491643f6d94","Type":"ContainerDied","Data":"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6"} Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.050321 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"274252b4-4327-4222-8c3a-f491643f6d94","Type":"ContainerDied","Data":"12b3b50ace7578175b2588fc55904445f5822fa000ac4b5b12cb57f121a8cf22"} Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.087986 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.094266 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:31 crc kubenswrapper[4917]: 
I0318 08:17:31.100416 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:31 crc kubenswrapper[4917]: E0318 08:17:31.100744 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274252b4-4327-4222-8c3a-f491643f6d94" containerName="cinder-api" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.100760 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="274252b4-4327-4222-8c3a-f491643f6d94" containerName="cinder-api" Mar 18 08:17:31 crc kubenswrapper[4917]: E0318 08:17:31.100769 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274252b4-4327-4222-8c3a-f491643f6d94" containerName="cinder-api-log" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.100776 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="274252b4-4327-4222-8c3a-f491643f6d94" containerName="cinder-api-log" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.100960 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="274252b4-4327-4222-8c3a-f491643f6d94" containerName="cinder-api-log" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.100977 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="274252b4-4327-4222-8c3a-f491643f6d94" containerName="cinder-api" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.101840 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.107704 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.107976 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xnvn9" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.108177 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.108334 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.108416 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.108729 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.115366 4917 scope.go:117] "RemoveContainer" containerID="5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.117667 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.153635 4917 scope.go:117] "RemoveContainer" containerID="119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7" Mar 18 08:17:31 crc kubenswrapper[4917]: E0318 08:17:31.154780 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7\": container with ID starting with 119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7 not found: ID does not exist" 
containerID="119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.154819 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7"} err="failed to get container status \"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7\": rpc error: code = NotFound desc = could not find container \"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7\": container with ID starting with 119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7 not found: ID does not exist" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.154848 4917 scope.go:117] "RemoveContainer" containerID="5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6" Mar 18 08:17:31 crc kubenswrapper[4917]: E0318 08:17:31.155209 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6\": container with ID starting with 5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6 not found: ID does not exist" containerID="5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.155239 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6"} err="failed to get container status \"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6\": rpc error: code = NotFound desc = could not find container \"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6\": container with ID starting with 5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6 not found: ID does not exist" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.155254 4917 scope.go:117] 
"RemoveContainer" containerID="119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.155481 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7"} err="failed to get container status \"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7\": rpc error: code = NotFound desc = could not find container \"119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7\": container with ID starting with 119813d23a7fb14327d6285c571244efa530c3e0ef17ab12adbe8b9f8d2f75f7 not found: ID does not exist" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.155507 4917 scope.go:117] "RemoveContainer" containerID="5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.155771 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6"} err="failed to get container status \"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6\": rpc error: code = NotFound desc = could not find container \"5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6\": container with ID starting with 5534ceafc6a54c62351ed79de93175cbdac6ccb7734d0f3988f675dbb81f87f6 not found: ID does not exist" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.290813 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f564a2b-47d8-4508-946f-be6b466fc27a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.291039 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-scripts\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.291216 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f77rn\" (UniqueName: \"kubernetes.io/projected/4f564a2b-47d8-4508-946f-be6b466fc27a-kube-api-access-f77rn\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.291308 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.291446 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.291535 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.291647 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-internal-tls-certs\") 
pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.291694 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data-custom\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.291732 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f564a2b-47d8-4508-946f-be6b466fc27a-logs\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393341 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393418 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393460 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data-custom\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393510 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f564a2b-47d8-4508-946f-be6b466fc27a-logs\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393640 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f564a2b-47d8-4508-946f-be6b466fc27a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393731 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-scripts\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393773 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f77rn\" (UniqueName: \"kubernetes.io/projected/4f564a2b-47d8-4508-946f-be6b466fc27a-kube-api-access-f77rn\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393813 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.393880 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.394334 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f564a2b-47d8-4508-946f-be6b466fc27a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.394957 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f564a2b-47d8-4508-946f-be6b466fc27a-logs\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.400341 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-scripts\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.400469 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data-custom\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.400466 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.400630 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.402087 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.402112 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.421549 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f77rn\" (UniqueName: \"kubernetes.io/projected/4f564a2b-47d8-4508-946f-be6b466fc27a-kube-api-access-f77rn\") pod \"cinder-api-0\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.434113 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.792658 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274252b4-4327-4222-8c3a-f491643f6d94" path="/var/lib/kubelet/pods/274252b4-4327-4222-8c3a-f491643f6d94/volumes" Mar 18 08:17:31 crc kubenswrapper[4917]: W0318 08:17:31.971571 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f564a2b_47d8_4508_946f_be6b466fc27a.slice/crio-73c4fdb5523babd59d7273ed5bc2c8054b7426e6dfeb0f144e90c1d981949e0b WatchSource:0}: Error finding container 73c4fdb5523babd59d7273ed5bc2c8054b7426e6dfeb0f144e90c1d981949e0b: Status 404 returned error can't find the container with id 73c4fdb5523babd59d7273ed5bc2c8054b7426e6dfeb0f144e90c1d981949e0b Mar 18 08:17:31 crc kubenswrapper[4917]: I0318 08:17:31.972858 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:17:32 crc kubenswrapper[4917]: I0318 08:17:32.068633 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f564a2b-47d8-4508-946f-be6b466fc27a","Type":"ContainerStarted","Data":"73c4fdb5523babd59d7273ed5bc2c8054b7426e6dfeb0f144e90c1d981949e0b"} Mar 18 08:17:33 crc kubenswrapper[4917]: I0318 08:17:33.097719 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f564a2b-47d8-4508-946f-be6b466fc27a","Type":"ContainerStarted","Data":"c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d"} Mar 18 08:17:34 crc kubenswrapper[4917]: I0318 08:17:34.115314 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f564a2b-47d8-4508-946f-be6b466fc27a","Type":"ContainerStarted","Data":"fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47"} Mar 18 08:17:34 crc kubenswrapper[4917]: I0318 08:17:34.116194 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 08:17:34 crc kubenswrapper[4917]: I0318 08:17:34.160097 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.160067582 podStartE2EDuration="3.160067582s" podCreationTimestamp="2026-03-18 08:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:17:34.143629453 +0000 UTC m=+5439.084784248" watchObservedRunningTime="2026-03-18 08:17:34.160067582 +0000 UTC m=+5439.101222366" Mar 18 08:17:34 crc kubenswrapper[4917]: I0318 08:17:34.774923 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:17:34 crc kubenswrapper[4917]: E0318 08:17:34.775231 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:17:36 crc kubenswrapper[4917]: I0318 08:17:36.745850 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:17:36 crc kubenswrapper[4917]: I0318 08:17:36.848825 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668f47984f-844s4"] Mar 18 08:17:36 crc kubenswrapper[4917]: I0318 08:17:36.849103 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-668f47984f-844s4" podUID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" containerName="dnsmasq-dns" containerID="cri-o://dad7f5941aaf62da67618a4a0d5c9ede2035754f3b11a9701c153b35e6a39313" gracePeriod=10 Mar 18 
08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.153963 4917 generic.go:334] "Generic (PLEG): container finished" podID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" containerID="dad7f5941aaf62da67618a4a0d5c9ede2035754f3b11a9701c153b35e6a39313" exitCode=0 Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.154001 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668f47984f-844s4" event={"ID":"681c0c83-35a8-495b-ab0e-b87d4543bb1c","Type":"ContainerDied","Data":"dad7f5941aaf62da67618a4a0d5c9ede2035754f3b11a9701c153b35e6a39313"} Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.321373 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.417265 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-config\") pod \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.417376 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-nb\") pod \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.461533 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-config" (OuterVolumeSpecName: "config") pod "681c0c83-35a8-495b-ab0e-b87d4543bb1c" (UID: "681c0c83-35a8-495b-ab0e-b87d4543bb1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.467766 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "681c0c83-35a8-495b-ab0e-b87d4543bb1c" (UID: "681c0c83-35a8-495b-ab0e-b87d4543bb1c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.518483 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d46mr\" (UniqueName: \"kubernetes.io/projected/681c0c83-35a8-495b-ab0e-b87d4543bb1c-kube-api-access-d46mr\") pod \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.518526 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-sb\") pod \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.518568 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-dns-svc\") pod \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\" (UID: \"681c0c83-35a8-495b-ab0e-b87d4543bb1c\") " Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.519100 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.519116 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.522122 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/681c0c83-35a8-495b-ab0e-b87d4543bb1c-kube-api-access-d46mr" (OuterVolumeSpecName: "kube-api-access-d46mr") pod "681c0c83-35a8-495b-ab0e-b87d4543bb1c" (UID: "681c0c83-35a8-495b-ab0e-b87d4543bb1c"). InnerVolumeSpecName "kube-api-access-d46mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.563102 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "681c0c83-35a8-495b-ab0e-b87d4543bb1c" (UID: "681c0c83-35a8-495b-ab0e-b87d4543bb1c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.566356 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "681c0c83-35a8-495b-ab0e-b87d4543bb1c" (UID: "681c0c83-35a8-495b-ab0e-b87d4543bb1c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.621369 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d46mr\" (UniqueName: \"kubernetes.io/projected/681c0c83-35a8-495b-ab0e-b87d4543bb1c-kube-api-access-d46mr\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.621448 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:37 crc kubenswrapper[4917]: I0318 08:17:37.621477 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/681c0c83-35a8-495b-ab0e-b87d4543bb1c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.169114 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668f47984f-844s4" event={"ID":"681c0c83-35a8-495b-ab0e-b87d4543bb1c","Type":"ContainerDied","Data":"06fd1032b730e34189d51ce4d2e6e5adbe2644d50a747d8524af92cf31329175"} Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.169191 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668f47984f-844s4" Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.169201 4917 scope.go:117] "RemoveContainer" containerID="dad7f5941aaf62da67618a4a0d5c9ede2035754f3b11a9701c153b35e6a39313" Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.204803 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668f47984f-844s4"] Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.216374 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-668f47984f-844s4"] Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.217092 4917 scope.go:117] "RemoveContainer" containerID="c5f570a90c3510c8cf370963eb8cd8506aa23020395c6d09363cb7c164daa56b" Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.997623 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nmvwc"] Mar 18 08:17:38 crc kubenswrapper[4917]: E0318 08:17:38.998512 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" containerName="dnsmasq-dns" Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.998538 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" containerName="dnsmasq-dns" Mar 18 08:17:38 crc kubenswrapper[4917]: E0318 08:17:38.998565 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" containerName="init" Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.998575 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" containerName="init" Mar 18 08:17:38 crc kubenswrapper[4917]: I0318 08:17:38.998878 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" containerName="dnsmasq-dns" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.001605 4917 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.017958 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nmvwc"] Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.154754 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-catalog-content\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.154840 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8grvn\" (UniqueName: \"kubernetes.io/projected/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-kube-api-access-8grvn\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.155053 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-utilities\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.256480 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-catalog-content\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.256557 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8grvn\" (UniqueName: \"kubernetes.io/projected/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-kube-api-access-8grvn\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.256637 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-utilities\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.256989 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-catalog-content\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.257022 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-utilities\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.290061 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8grvn\" (UniqueName: \"kubernetes.io/projected/7685910a-28c7-4d3a-83c4-22be2b2a3bf8-kube-api-access-8grvn\") pod \"certified-operators-nmvwc\" (UID: \"7685910a-28c7-4d3a-83c4-22be2b2a3bf8\") " pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.337223 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.782574 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="681c0c83-35a8-495b-ab0e-b87d4543bb1c" path="/var/lib/kubelet/pods/681c0c83-35a8-495b-ab0e-b87d4543bb1c/volumes" Mar 18 08:17:39 crc kubenswrapper[4917]: I0318 08:17:39.805012 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nmvwc"] Mar 18 08:17:40 crc kubenswrapper[4917]: I0318 08:17:40.192361 4917 generic.go:334] "Generic (PLEG): container finished" podID="7685910a-28c7-4d3a-83c4-22be2b2a3bf8" containerID="8538f2335e42d1b5f6e15bceafdff102cbb79e7a46a3972c70635f7954f3b0ee" exitCode=0 Mar 18 08:17:40 crc kubenswrapper[4917]: I0318 08:17:40.192459 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmvwc" event={"ID":"7685910a-28c7-4d3a-83c4-22be2b2a3bf8","Type":"ContainerDied","Data":"8538f2335e42d1b5f6e15bceafdff102cbb79e7a46a3972c70635f7954f3b0ee"} Mar 18 08:17:40 crc kubenswrapper[4917]: I0318 08:17:40.193082 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmvwc" event={"ID":"7685910a-28c7-4d3a-83c4-22be2b2a3bf8","Type":"ContainerStarted","Data":"6394a541a3c97d35430413490a37eb2432970dd4c2de93c77650ab9217872b57"} Mar 18 08:17:43 crc kubenswrapper[4917]: I0318 08:17:43.319351 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 08:17:45 crc kubenswrapper[4917]: I0318 08:17:45.239338 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmvwc" event={"ID":"7685910a-28c7-4d3a-83c4-22be2b2a3bf8","Type":"ContainerStarted","Data":"c0618bd2d23887c22b26b6767d7050c3724cc688a8d7b248a2bed367d80ed736"} Mar 18 08:17:46 crc kubenswrapper[4917]: I0318 08:17:46.251018 4917 generic.go:334] "Generic (PLEG): 
container finished" podID="7685910a-28c7-4d3a-83c4-22be2b2a3bf8" containerID="c0618bd2d23887c22b26b6767d7050c3724cc688a8d7b248a2bed367d80ed736" exitCode=0 Mar 18 08:17:46 crc kubenswrapper[4917]: I0318 08:17:46.251284 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmvwc" event={"ID":"7685910a-28c7-4d3a-83c4-22be2b2a3bf8","Type":"ContainerDied","Data":"c0618bd2d23887c22b26b6767d7050c3724cc688a8d7b248a2bed367d80ed736"} Mar 18 08:17:47 crc kubenswrapper[4917]: I0318 08:17:47.265027 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nmvwc" event={"ID":"7685910a-28c7-4d3a-83c4-22be2b2a3bf8","Type":"ContainerStarted","Data":"a2757dfe786bbe280a34400754bb8a279228a341f2294096986d114d2d412736"} Mar 18 08:17:47 crc kubenswrapper[4917]: I0318 08:17:47.298011 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nmvwc" podStartSLOduration=2.66448118 podStartE2EDuration="9.297993218s" podCreationTimestamp="2026-03-18 08:17:38 +0000 UTC" firstStartedPulling="2026-03-18 08:17:40.195199519 +0000 UTC m=+5445.136354273" lastFinishedPulling="2026-03-18 08:17:46.828711557 +0000 UTC m=+5451.769866311" observedRunningTime="2026-03-18 08:17:47.290282841 +0000 UTC m=+5452.231437555" watchObservedRunningTime="2026-03-18 08:17:47.297993218 +0000 UTC m=+5452.239147922" Mar 18 08:17:47 crc kubenswrapper[4917]: I0318 08:17:47.773075 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:17:47 crc kubenswrapper[4917]: E0318 08:17:47.773371 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:17:49 crc kubenswrapper[4917]: I0318 08:17:49.337760 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:49 crc kubenswrapper[4917]: I0318 08:17:49.338248 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:49 crc kubenswrapper[4917]: I0318 08:17:49.399330 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:58 crc kubenswrapper[4917]: E0318 08:17:58.891901 4917 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.184:46202->38.102.83.184:33891: write tcp 38.102.83.184:46202->38.102.83.184:33891: write: broken pipe Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.415048 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nmvwc" Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.711419 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nmvwc"] Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.807345 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.809065 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.813932 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.819351 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.875673 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqzh7"] Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.876328 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rqzh7" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerName="registry-server" containerID="cri-o://4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a" gracePeriod=2 Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.978764 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90aa42b3-0a3b-4966-8522-b8e30bca4432-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.978821 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.978919 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.978944 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-scripts\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.978972 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:17:59 crc kubenswrapper[4917]: I0318 08:17:59.978996 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcctf\" (UniqueName: \"kubernetes.io/projected/90aa42b3-0a3b-4966-8522-b8e30bca4432-kube-api-access-bcctf\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.080663 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90aa42b3-0a3b-4966-8522-b8e30bca4432-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.080717 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " 
pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.080776 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.080783 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90aa42b3-0a3b-4966-8522-b8e30bca4432-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.080811 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-scripts\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.080922 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.080978 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcctf\" (UniqueName: \"kubernetes.io/projected/90aa42b3-0a3b-4966-8522-b8e30bca4432-kube-api-access-bcctf\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.086550 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.094168 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.097053 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-scripts\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.098663 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcctf\" (UniqueName: \"kubernetes.io/projected/90aa42b3-0a3b-4966-8522-b8e30bca4432-kube-api-access-bcctf\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.102467 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data\") pod \"cinder-scheduler-0\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.143693 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.145768 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563698-xss2r"] Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.147047 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563698-xss2r" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.149022 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.149367 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.149439 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.158053 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563698-xss2r"] Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.284715 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2fpf\" (UniqueName: \"kubernetes.io/projected/96877157-34b5-49d3-a3b4-e7db7a2952ff-kube-api-access-h2fpf\") pod \"auto-csr-approver-29563698-xss2r\" (UID: \"96877157-34b5-49d3-a3b4-e7db7a2952ff\") " pod="openshift-infra/auto-csr-approver-29563698-xss2r" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.350739 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.389693 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2fpf\" (UniqueName: \"kubernetes.io/projected/96877157-34b5-49d3-a3b4-e7db7a2952ff-kube-api-access-h2fpf\") pod \"auto-csr-approver-29563698-xss2r\" (UID: \"96877157-34b5-49d3-a3b4-e7db7a2952ff\") " pod="openshift-infra/auto-csr-approver-29563698-xss2r" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.412299 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2fpf\" (UniqueName: \"kubernetes.io/projected/96877157-34b5-49d3-a3b4-e7db7a2952ff-kube-api-access-h2fpf\") pod \"auto-csr-approver-29563698-xss2r\" (UID: \"96877157-34b5-49d3-a3b4-e7db7a2952ff\") " pod="openshift-infra/auto-csr-approver-29563698-xss2r" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.450175 4917 generic.go:334] "Generic (PLEG): container finished" podID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerID="4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a" exitCode=0 Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.451048 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rqzh7" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.451435 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqzh7" event={"ID":"e4bdf3cd-7fe6-4853-a75d-dfb580089f25","Type":"ContainerDied","Data":"4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a"} Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.451457 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rqzh7" event={"ID":"e4bdf3cd-7fe6-4853-a75d-dfb580089f25","Type":"ContainerDied","Data":"a527075f51924d0d1a0e971a5815ae48041d85ccddcb257d05b4e60fa85cfcf4"} Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.451472 4917 scope.go:117] "RemoveContainer" containerID="4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.479039 4917 scope.go:117] "RemoveContainer" containerID="f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.491210 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-utilities\") pod \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.491400 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx84h\" (UniqueName: \"kubernetes.io/projected/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-kube-api-access-jx84h\") pod \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.491527 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-catalog-content\") pod \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\" (UID: \"e4bdf3cd-7fe6-4853-a75d-dfb580089f25\") " Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.492327 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-utilities" (OuterVolumeSpecName: "utilities") pod "e4bdf3cd-7fe6-4853-a75d-dfb580089f25" (UID: "e4bdf3cd-7fe6-4853-a75d-dfb580089f25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.496157 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-kube-api-access-jx84h" (OuterVolumeSpecName: "kube-api-access-jx84h") pod "e4bdf3cd-7fe6-4853-a75d-dfb580089f25" (UID: "e4bdf3cd-7fe6-4853-a75d-dfb580089f25"). InnerVolumeSpecName "kube-api-access-jx84h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.499948 4917 scope.go:117] "RemoveContainer" containerID="34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.519962 4917 scope.go:117] "RemoveContainer" containerID="4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a" Mar 18 08:18:00 crc kubenswrapper[4917]: E0318 08:18:00.520347 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a\": container with ID starting with 4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a not found: ID does not exist" containerID="4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.520391 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a"} err="failed to get container status \"4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a\": rpc error: code = NotFound desc = could not find container \"4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a\": container with ID starting with 4a5935ad8b60c7ec1dd8d512531515cfcd92af7c9aedc4f0dad92754fcd1c03a not found: ID does not exist" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.520409 4917 scope.go:117] "RemoveContainer" containerID="f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95" Mar 18 08:18:00 crc kubenswrapper[4917]: E0318 08:18:00.520796 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95\": container with ID starting with 
f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95 not found: ID does not exist" containerID="f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.520823 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95"} err="failed to get container status \"f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95\": rpc error: code = NotFound desc = could not find container \"f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95\": container with ID starting with f33fe8c130bd0027fe34fe80451f15b19aeb271e37263323dbd4c1e37a648c95 not found: ID does not exist" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.520836 4917 scope.go:117] "RemoveContainer" containerID="34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0" Mar 18 08:18:00 crc kubenswrapper[4917]: E0318 08:18:00.521189 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0\": container with ID starting with 34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0 not found: ID does not exist" containerID="34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.521216 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0"} err="failed to get container status \"34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0\": rpc error: code = NotFound desc = could not find container \"34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0\": container with ID starting with 34e153e8dce19f4c6e3ed879a6dfc3606a2cbc74c7c8b7a05d1cb518453e46e0 not found: ID does not 
exist" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.546419 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4bdf3cd-7fe6-4853-a75d-dfb580089f25" (UID: "e4bdf3cd-7fe6-4853-a75d-dfb580089f25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.570573 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563698-xss2r" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.593627 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx84h\" (UniqueName: \"kubernetes.io/projected/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-kube-api-access-jx84h\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.593664 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.593674 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bdf3cd-7fe6-4853-a75d-dfb580089f25-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.752807 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.802943 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rqzh7"] Mar 18 08:18:00 crc kubenswrapper[4917]: I0318 08:18:00.811335 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rqzh7"] Mar 18 08:18:01 crc kubenswrapper[4917]: 
I0318 08:18:01.080563 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563698-xss2r"] Mar 18 08:18:01 crc kubenswrapper[4917]: W0318 08:18:01.100543 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96877157_34b5_49d3_a3b4_e7db7a2952ff.slice/crio-d22f190a3f27e3b7bde822b216cb399800c4205f774fe423bfd5c300d7e7f8fd WatchSource:0}: Error finding container d22f190a3f27e3b7bde822b216cb399800c4205f774fe423bfd5c300d7e7f8fd: Status 404 returned error can't find the container with id d22f190a3f27e3b7bde822b216cb399800c4205f774fe423bfd5c300d7e7f8fd Mar 18 08:18:01 crc kubenswrapper[4917]: I0318 08:18:01.461458 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563698-xss2r" event={"ID":"96877157-34b5-49d3-a3b4-e7db7a2952ff","Type":"ContainerStarted","Data":"d22f190a3f27e3b7bde822b216cb399800c4205f774fe423bfd5c300d7e7f8fd"} Mar 18 08:18:01 crc kubenswrapper[4917]: I0318 08:18:01.469262 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90aa42b3-0a3b-4966-8522-b8e30bca4432","Type":"ContainerStarted","Data":"d1ed2486690290d7855d24f7419d8d28f8081c131075078fb1ad9c1a51ad8032"} Mar 18 08:18:01 crc kubenswrapper[4917]: I0318 08:18:01.576678 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:18:01 crc kubenswrapper[4917]: I0318 08:18:01.576910 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerName="cinder-api-log" containerID="cri-o://c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d" gracePeriod=30 Mar 18 08:18:01 crc kubenswrapper[4917]: I0318 08:18:01.576986 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" 
podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerName="cinder-api" containerID="cri-o://fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47" gracePeriod=30 Mar 18 08:18:01 crc kubenswrapper[4917]: I0318 08:18:01.782551 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" path="/var/lib/kubelet/pods/e4bdf3cd-7fe6-4853-a75d-dfb580089f25/volumes" Mar 18 08:18:02 crc kubenswrapper[4917]: I0318 08:18:02.480836 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90aa42b3-0a3b-4966-8522-b8e30bca4432","Type":"ContainerStarted","Data":"020f67fd2394d310031527ecf730be68692ca258efd7335b0c6764849b50284c"} Mar 18 08:18:02 crc kubenswrapper[4917]: I0318 08:18:02.481470 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90aa42b3-0a3b-4966-8522-b8e30bca4432","Type":"ContainerStarted","Data":"a6776147629fe8772d09861dc1749c6ed443ffdd735181d0c2c2b6eaa3ae05d9"} Mar 18 08:18:02 crc kubenswrapper[4917]: I0318 08:18:02.483076 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563698-xss2r" event={"ID":"96877157-34b5-49d3-a3b4-e7db7a2952ff","Type":"ContainerStarted","Data":"cbd24c68189d983b64dca22a063d908da5a1b7d1d707de951fd6a726ad633846"} Mar 18 08:18:02 crc kubenswrapper[4917]: I0318 08:18:02.485935 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerID="c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d" exitCode=143 Mar 18 08:18:02 crc kubenswrapper[4917]: I0318 08:18:02.485976 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f564a2b-47d8-4508-946f-be6b466fc27a","Type":"ContainerDied","Data":"c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d"} Mar 18 08:18:02 crc kubenswrapper[4917]: I0318 08:18:02.500096 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.184075882 podStartE2EDuration="3.500073965s" podCreationTimestamp="2026-03-18 08:17:59 +0000 UTC" firstStartedPulling="2026-03-18 08:18:00.714671915 +0000 UTC m=+5465.655826629" lastFinishedPulling="2026-03-18 08:18:01.030669998 +0000 UTC m=+5465.971824712" observedRunningTime="2026-03-18 08:18:02.496214281 +0000 UTC m=+5467.437369045" watchObservedRunningTime="2026-03-18 08:18:02.500073965 +0000 UTC m=+5467.441228679" Mar 18 08:18:02 crc kubenswrapper[4917]: I0318 08:18:02.509912 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563698-xss2r" podStartSLOduration=1.5970886549999999 podStartE2EDuration="2.509886393s" podCreationTimestamp="2026-03-18 08:18:00 +0000 UTC" firstStartedPulling="2026-03-18 08:18:01.10582338 +0000 UTC m=+5466.046978084" lastFinishedPulling="2026-03-18 08:18:02.018621108 +0000 UTC m=+5466.959775822" observedRunningTime="2026-03-18 08:18:02.50810581 +0000 UTC m=+5467.449260524" watchObservedRunningTime="2026-03-18 08:18:02.509886393 +0000 UTC m=+5467.451041107" Mar 18 08:18:02 crc kubenswrapper[4917]: I0318 08:18:02.772389 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:18:02 crc kubenswrapper[4917]: E0318 08:18:02.772763 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:18:03 crc kubenswrapper[4917]: I0318 08:18:03.499759 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="96877157-34b5-49d3-a3b4-e7db7a2952ff" containerID="cbd24c68189d983b64dca22a063d908da5a1b7d1d707de951fd6a726ad633846" exitCode=0 Mar 18 08:18:03 crc kubenswrapper[4917]: I0318 08:18:03.499904 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563698-xss2r" event={"ID":"96877157-34b5-49d3-a3b4-e7db7a2952ff","Type":"ContainerDied","Data":"cbd24c68189d983b64dca22a063d908da5a1b7d1d707de951fd6a726ad633846"} Mar 18 08:18:04 crc kubenswrapper[4917]: I0318 08:18:04.873341 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563698-xss2r" Mar 18 08:18:04 crc kubenswrapper[4917]: I0318 08:18:04.987217 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2fpf\" (UniqueName: \"kubernetes.io/projected/96877157-34b5-49d3-a3b4-e7db7a2952ff-kube-api-access-h2fpf\") pod \"96877157-34b5-49d3-a3b4-e7db7a2952ff\" (UID: \"96877157-34b5-49d3-a3b4-e7db7a2952ff\") " Mar 18 08:18:04 crc kubenswrapper[4917]: I0318 08:18:04.992388 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96877157-34b5-49d3-a3b4-e7db7a2952ff-kube-api-access-h2fpf" (OuterVolumeSpecName: "kube-api-access-h2fpf") pod "96877157-34b5-49d3-a3b4-e7db7a2952ff" (UID: "96877157-34b5-49d3-a3b4-e7db7a2952ff"). InnerVolumeSpecName "kube-api-access-h2fpf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.089007 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2fpf\" (UniqueName: \"kubernetes.io/projected/96877157-34b5-49d3-a3b4-e7db7a2952ff-kube-api-access-h2fpf\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.144108 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.519010 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.522334 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563698-xss2r" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.522312 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563698-xss2r" event={"ID":"96877157-34b5-49d3-a3b4-e7db7a2952ff","Type":"ContainerDied","Data":"d22f190a3f27e3b7bde822b216cb399800c4205f774fe423bfd5c300d7e7f8fd"} Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.522682 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d22f190a3f27e3b7bde822b216cb399800c4205f774fe423bfd5c300d7e7f8fd" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.536660 4917 generic.go:334] "Generic (PLEG): container finished" podID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerID="fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47" exitCode=0 Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.536832 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.536834 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f564a2b-47d8-4508-946f-be6b466fc27a","Type":"ContainerDied","Data":"fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47"} Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.537423 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4f564a2b-47d8-4508-946f-be6b466fc27a","Type":"ContainerDied","Data":"73c4fdb5523babd59d7273ed5bc2c8054b7426e6dfeb0f144e90c1d981949e0b"} Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.537465 4917 scope.go:117] "RemoveContainer" containerID="fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.597711 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f77rn\" (UniqueName: \"kubernetes.io/projected/4f564a2b-47d8-4508-946f-be6b466fc27a-kube-api-access-f77rn\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.597778 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.597836 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-internal-tls-certs\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.597853 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data-custom\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.597897 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-public-tls-certs\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.597946 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-scripts\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.598001 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f564a2b-47d8-4508-946f-be6b466fc27a-logs\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.598097 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f564a2b-47d8-4508-946f-be6b466fc27a-etc-machine-id\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: \"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.598170 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-combined-ca-bundle\") pod \"4f564a2b-47d8-4508-946f-be6b466fc27a\" (UID: 
\"4f564a2b-47d8-4508-946f-be6b466fc27a\") " Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.601902 4917 scope.go:117] "RemoveContainer" containerID="c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.603694 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f564a2b-47d8-4508-946f-be6b466fc27a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.606191 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f564a2b-47d8-4508-946f-be6b466fc27a-logs" (OuterVolumeSpecName: "logs") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.613943 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f564a2b-47d8-4508-946f-be6b466fc27a-kube-api-access-f77rn" (OuterVolumeSpecName: "kube-api-access-f77rn") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "kube-api-access-f77rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.614828 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563692-k6hj5"] Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.622406 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.623443 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563692-k6hj5"] Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.627921 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-scripts" (OuterVolumeSpecName: "scripts") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.644935 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.645864 4917 scope.go:117] "RemoveContainer" containerID="fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47" Mar 18 08:18:05 crc kubenswrapper[4917]: E0318 08:18:05.646418 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47\": container with ID starting with fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47 not found: ID does not exist" containerID="fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.646523 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47"} err="failed to get container status \"fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47\": rpc error: code = NotFound desc = could not find container \"fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47\": container with ID starting with fb9d030a4dfe04cc5c4cc2bc31d828722cc9fc8feb236572969c7c9db4bf9e47 not found: ID does not exist" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.646562 4917 scope.go:117] "RemoveContainer" containerID="c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d" Mar 18 08:18:05 crc kubenswrapper[4917]: E0318 08:18:05.646905 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d\": container with ID starting with c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d not found: ID does not exist" containerID="c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.646947 
4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d"} err="failed to get container status \"c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d\": rpc error: code = NotFound desc = could not find container \"c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d\": container with ID starting with c05dee650f4b3ad99867d48f03d50a7ecb29c98337459fead54adb98947ca99d not found: ID does not exist" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.676833 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.693546 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.701406 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f564a2b-47d8-4508-946f-be6b466fc27a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.701913 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.701941 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f77rn\" (UniqueName: \"kubernetes.io/projected/4f564a2b-47d8-4508-946f-be6b466fc27a-kube-api-access-f77rn\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.701960 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.701974 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.701989 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.702008 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 
08:18:05.702023 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f564a2b-47d8-4508-946f-be6b466fc27a-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.720367 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data" (OuterVolumeSpecName: "config-data") pod "4f564a2b-47d8-4508-946f-be6b466fc27a" (UID: "4f564a2b-47d8-4508-946f-be6b466fc27a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.783714 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abfd30f8-cac5-4990-8237-9a0381bd36a0" path="/var/lib/kubelet/pods/abfd30f8-cac5-4990-8237-9a0381bd36a0/volumes" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.803598 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f564a2b-47d8-4508-946f-be6b466fc27a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.866727 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.874498 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884062 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:18:05 crc kubenswrapper[4917]: E0318 08:18:05.884375 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerName="extract-utilities" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884391 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerName="extract-utilities" Mar 18 08:18:05 crc 
kubenswrapper[4917]: E0318 08:18:05.884407 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96877157-34b5-49d3-a3b4-e7db7a2952ff" containerName="oc" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884413 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="96877157-34b5-49d3-a3b4-e7db7a2952ff" containerName="oc" Mar 18 08:18:05 crc kubenswrapper[4917]: E0318 08:18:05.884425 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerName="registry-server" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884431 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerName="registry-server" Mar 18 08:18:05 crc kubenswrapper[4917]: E0318 08:18:05.884445 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerName="cinder-api-log" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884450 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerName="cinder-api-log" Mar 18 08:18:05 crc kubenswrapper[4917]: E0318 08:18:05.884458 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerName="extract-content" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884463 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerName="extract-content" Mar 18 08:18:05 crc kubenswrapper[4917]: E0318 08:18:05.884470 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerName="cinder-api" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884478 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerName="cinder-api" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884634 4917 
memory_manager.go:354] "RemoveStaleState removing state" podUID="96877157-34b5-49d3-a3b4-e7db7a2952ff" containerName="oc" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884648 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerName="cinder-api-log" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884661 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" containerName="cinder-api" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.884672 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4bdf3cd-7fe6-4853-a75d-dfb580089f25" containerName="registry-server" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.885446 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.888430 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.892663 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.893145 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 18 08:18:05 crc kubenswrapper[4917]: I0318 08:18:05.900645 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.007642 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crvsw\" (UniqueName: \"kubernetes.io/projected/94257751-2950-43fe-acb4-f96f1a8e5264-kube-api-access-crvsw\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.007694 
4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.007729 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-scripts\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.007825 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94257751-2950-43fe-acb4-f96f1a8e5264-etc-machine-id\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.007866 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94257751-2950-43fe-acb4-f96f1a8e5264-logs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.007886 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-config-data\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.007910 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.007929 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-config-data-custom\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.008022 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-public-tls-certs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.110739 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.110824 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-scripts\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.110919 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94257751-2950-43fe-acb4-f96f1a8e5264-etc-machine-id\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" 
Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.111002 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94257751-2950-43fe-acb4-f96f1a8e5264-logs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.111044 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-config-data\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.111096 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.111133 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-config-data-custom\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.111156 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/94257751-2950-43fe-acb4-f96f1a8e5264-etc-machine-id\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.111180 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-public-tls-certs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.111477 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crvsw\" (UniqueName: \"kubernetes.io/projected/94257751-2950-43fe-acb4-f96f1a8e5264-kube-api-access-crvsw\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.112478 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/94257751-2950-43fe-acb4-f96f1a8e5264-logs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.116050 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.117178 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-public-tls-certs\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.118747 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-config-data\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.119700 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-config-data-custom\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.120738 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-scripts\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.128897 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94257751-2950-43fe-acb4-f96f1a8e5264-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.150338 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crvsw\" (UniqueName: \"kubernetes.io/projected/94257751-2950-43fe-acb4-f96f1a8e5264-kube-api-access-crvsw\") pod \"cinder-api-0\" (UID: \"94257751-2950-43fe-acb4-f96f1a8e5264\") " pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.202397 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 18 08:18:06 crc kubenswrapper[4917]: I0318 08:18:06.727117 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 18 08:18:06 crc kubenswrapper[4917]: W0318 08:18:06.732917 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94257751_2950_43fe_acb4_f96f1a8e5264.slice/crio-99e63c8c1432d999108e4e717822ff029072f7f72887400743be816bd1753968 WatchSource:0}: Error finding container 99e63c8c1432d999108e4e717822ff029072f7f72887400743be816bd1753968: Status 404 returned error can't find the container with id 99e63c8c1432d999108e4e717822ff029072f7f72887400743be816bd1753968 Mar 18 08:18:07 crc kubenswrapper[4917]: I0318 08:18:07.562563 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"94257751-2950-43fe-acb4-f96f1a8e5264","Type":"ContainerStarted","Data":"bce7173520ce1cf3cd3d5febf7331ae3a8d4601acace16fae7ca5e886943854b"} Mar 18 08:18:07 crc kubenswrapper[4917]: I0318 08:18:07.562842 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"94257751-2950-43fe-acb4-f96f1a8e5264","Type":"ContainerStarted","Data":"99e63c8c1432d999108e4e717822ff029072f7f72887400743be816bd1753968"} Mar 18 08:18:07 crc kubenswrapper[4917]: I0318 08:18:07.786172 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f564a2b-47d8-4508-946f-be6b466fc27a" path="/var/lib/kubelet/pods/4f564a2b-47d8-4508-946f-be6b466fc27a/volumes" Mar 18 08:18:07 crc kubenswrapper[4917]: I0318 08:18:07.975296 4917 scope.go:117] "RemoveContainer" containerID="b845de4e4b0fddb0e1f6ec2c1cb7676a693c25921a229bae7da663095b22b2d7" Mar 18 08:18:08 crc kubenswrapper[4917]: I0318 08:18:08.029698 4917 scope.go:117] "RemoveContainer" containerID="644cdb8030a839044c71ac7afde2875311ed34a59e4fddc22e707852743fcf4a" Mar 18 08:18:08 crc kubenswrapper[4917]: I0318 
08:18:08.577869 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"94257751-2950-43fe-acb4-f96f1a8e5264","Type":"ContainerStarted","Data":"6f56ccf28f4555194726656ac4bc4c6c7404bb91d844cd69104845cbc9744779"} Mar 18 08:18:08 crc kubenswrapper[4917]: I0318 08:18:08.578076 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 18 08:18:08 crc kubenswrapper[4917]: I0318 08:18:08.616921 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.6168923619999997 podStartE2EDuration="3.616892362s" podCreationTimestamp="2026-03-18 08:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:18:08.607805771 +0000 UTC m=+5473.548960515" watchObservedRunningTime="2026-03-18 08:18:08.616892362 +0000 UTC m=+5473.558047116" Mar 18 08:18:10 crc kubenswrapper[4917]: I0318 08:18:10.396164 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 08:18:10 crc kubenswrapper[4917]: I0318 08:18:10.489274 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:18:10 crc kubenswrapper[4917]: I0318 08:18:10.602784 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerName="cinder-scheduler" containerID="cri-o://a6776147629fe8772d09861dc1749c6ed443ffdd735181d0c2c2b6eaa3ae05d9" gracePeriod=30 Mar 18 08:18:10 crc kubenswrapper[4917]: I0318 08:18:10.602853 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerName="probe" containerID="cri-o://020f67fd2394d310031527ecf730be68692ca258efd7335b0c6764849b50284c" gracePeriod=30 
Mar 18 08:18:11 crc kubenswrapper[4917]: I0318 08:18:11.615862 4917 generic.go:334] "Generic (PLEG): container finished" podID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerID="020f67fd2394d310031527ecf730be68692ca258efd7335b0c6764849b50284c" exitCode=0 Mar 18 08:18:11 crc kubenswrapper[4917]: I0318 08:18:11.615950 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90aa42b3-0a3b-4966-8522-b8e30bca4432","Type":"ContainerDied","Data":"020f67fd2394d310031527ecf730be68692ca258efd7335b0c6764849b50284c"} Mar 18 08:18:12 crc kubenswrapper[4917]: I0318 08:18:12.631879 4917 generic.go:334] "Generic (PLEG): container finished" podID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerID="a6776147629fe8772d09861dc1749c6ed443ffdd735181d0c2c2b6eaa3ae05d9" exitCode=0 Mar 18 08:18:12 crc kubenswrapper[4917]: I0318 08:18:12.631929 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90aa42b3-0a3b-4966-8522-b8e30bca4432","Type":"ContainerDied","Data":"a6776147629fe8772d09861dc1749c6ed443ffdd735181d0c2c2b6eaa3ae05d9"} Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.036294 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.155166 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data\") pod \"90aa42b3-0a3b-4966-8522-b8e30bca4432\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.155251 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-combined-ca-bundle\") pod \"90aa42b3-0a3b-4966-8522-b8e30bca4432\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.155290 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-scripts\") pod \"90aa42b3-0a3b-4966-8522-b8e30bca4432\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.155341 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90aa42b3-0a3b-4966-8522-b8e30bca4432-etc-machine-id\") pod \"90aa42b3-0a3b-4966-8522-b8e30bca4432\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.155359 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data-custom\") pod \"90aa42b3-0a3b-4966-8522-b8e30bca4432\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.155390 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcctf\" (UniqueName: 
\"kubernetes.io/projected/90aa42b3-0a3b-4966-8522-b8e30bca4432-kube-api-access-bcctf\") pod \"90aa42b3-0a3b-4966-8522-b8e30bca4432\" (UID: \"90aa42b3-0a3b-4966-8522-b8e30bca4432\") " Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.157625 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90aa42b3-0a3b-4966-8522-b8e30bca4432-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "90aa42b3-0a3b-4966-8522-b8e30bca4432" (UID: "90aa42b3-0a3b-4966-8522-b8e30bca4432"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.162468 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "90aa42b3-0a3b-4966-8522-b8e30bca4432" (UID: "90aa42b3-0a3b-4966-8522-b8e30bca4432"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.167715 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90aa42b3-0a3b-4966-8522-b8e30bca4432-kube-api-access-bcctf" (OuterVolumeSpecName: "kube-api-access-bcctf") pod "90aa42b3-0a3b-4966-8522-b8e30bca4432" (UID: "90aa42b3-0a3b-4966-8522-b8e30bca4432"). InnerVolumeSpecName "kube-api-access-bcctf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.168567 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-scripts" (OuterVolumeSpecName: "scripts") pod "90aa42b3-0a3b-4966-8522-b8e30bca4432" (UID: "90aa42b3-0a3b-4966-8522-b8e30bca4432"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.213961 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90aa42b3-0a3b-4966-8522-b8e30bca4432" (UID: "90aa42b3-0a3b-4966-8522-b8e30bca4432"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.257699 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.257730 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.257754 4917 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90aa42b3-0a3b-4966-8522-b8e30bca4432-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.257762 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.257771 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcctf\" (UniqueName: \"kubernetes.io/projected/90aa42b3-0a3b-4966-8522-b8e30bca4432-kube-api-access-bcctf\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.260833 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data" (OuterVolumeSpecName: "config-data") pod "90aa42b3-0a3b-4966-8522-b8e30bca4432" (UID: "90aa42b3-0a3b-4966-8522-b8e30bca4432"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.359140 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90aa42b3-0a3b-4966-8522-b8e30bca4432-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.647558 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"90aa42b3-0a3b-4966-8522-b8e30bca4432","Type":"ContainerDied","Data":"d1ed2486690290d7855d24f7419d8d28f8081c131075078fb1ad9c1a51ad8032"} Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.647638 4917 scope.go:117] "RemoveContainer" containerID="020f67fd2394d310031527ecf730be68692ca258efd7335b0c6764849b50284c" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.647704 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.717675 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.724792 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.746765 4917 scope.go:117] "RemoveContainer" containerID="a6776147629fe8772d09861dc1749c6ed443ffdd735181d0c2c2b6eaa3ae05d9" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.761166 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:18:13 crc kubenswrapper[4917]: E0318 08:18:13.761725 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerName="probe" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.761753 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerName="probe" Mar 18 08:18:13 crc kubenswrapper[4917]: E0318 08:18:13.761800 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerName="cinder-scheduler" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.761814 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerName="cinder-scheduler" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.762112 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerName="probe" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.762149 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" containerName="cinder-scheduler" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.765548 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.768289 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.776735 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.793381 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90aa42b3-0a3b-4966-8522-b8e30bca4432" path="/var/lib/kubelet/pods/90aa42b3-0a3b-4966-8522-b8e30bca4432/volumes" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.867549 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.867633 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.867664 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb5a322e-ac24-41a9-a411-1ba1e05738e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.867987 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.868134 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.868254 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-986v4\" (UniqueName: \"kubernetes.io/projected/cb5a322e-ac24-41a9-a411-1ba1e05738e5-kube-api-access-986v4\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.972855 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.972973 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.973042 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb5a322e-ac24-41a9-a411-1ba1e05738e5-etc-machine-id\") pod \"cinder-scheduler-0\" 
(UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.973163 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.973223 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.973284 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-986v4\" (UniqueName: \"kubernetes.io/projected/cb5a322e-ac24-41a9-a411-1ba1e05738e5-kube-api-access-986v4\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.973963 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cb5a322e-ac24-41a9-a411-1ba1e05738e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.979277 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.979460 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.982480 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.988470 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5a322e-ac24-41a9-a411-1ba1e05738e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:13 crc kubenswrapper[4917]: I0318 08:18:13.998016 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-986v4\" (UniqueName: \"kubernetes.io/projected/cb5a322e-ac24-41a9-a411-1ba1e05738e5-kube-api-access-986v4\") pod \"cinder-scheduler-0\" (UID: \"cb5a322e-ac24-41a9-a411-1ba1e05738e5\") " pod="openstack/cinder-scheduler-0" Mar 18 08:18:14 crc kubenswrapper[4917]: I0318 08:18:14.097899 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 18 08:18:14 crc kubenswrapper[4917]: I0318 08:18:14.617444 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 18 08:18:14 crc kubenswrapper[4917]: I0318 08:18:14.664015 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb5a322e-ac24-41a9-a411-1ba1e05738e5","Type":"ContainerStarted","Data":"4e277f13bd7131ca61c50911602c70758a5dcd0246dd30a1f167c246faa80216"} Mar 18 08:18:14 crc kubenswrapper[4917]: I0318 08:18:14.773126 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:18:14 crc kubenswrapper[4917]: E0318 08:18:14.773373 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:18:15 crc kubenswrapper[4917]: I0318 08:18:15.674603 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb5a322e-ac24-41a9-a411-1ba1e05738e5","Type":"ContainerStarted","Data":"c7623e8679c620754111bf0928147ddc5e293efa33456ee98cf0e18a70884085"} Mar 18 08:18:16 crc kubenswrapper[4917]: I0318 08:18:16.688654 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cb5a322e-ac24-41a9-a411-1ba1e05738e5","Type":"ContainerStarted","Data":"a7bfac950d61fcf576fe923a3500f88f92d26de4566d626abd1ad452b790302a"} Mar 18 08:18:16 crc kubenswrapper[4917]: I0318 08:18:16.719435 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=3.7194210869999997 podStartE2EDuration="3.719421087s" podCreationTimestamp="2026-03-18 08:18:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:18:16.716410403 +0000 UTC m=+5481.657565157" watchObservedRunningTime="2026-03-18 08:18:16.719421087 +0000 UTC m=+5481.660575801" Mar 18 08:18:18 crc kubenswrapper[4917]: I0318 08:18:18.057508 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 18 08:18:19 crc kubenswrapper[4917]: I0318 08:18:19.098422 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.306913 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.563161 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gm7kh"] Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.564730 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.575641 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gm7kh"] Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.601032 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rswgn\" (UniqueName: \"kubernetes.io/projected/0b1d2452-5873-42b8-9c28-a00cb39bc927-kube-api-access-rswgn\") pod \"glance-db-create-gm7kh\" (UID: \"0b1d2452-5873-42b8-9c28-a00cb39bc927\") " pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.601152 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1d2452-5873-42b8-9c28-a00cb39bc927-operator-scripts\") pod \"glance-db-create-gm7kh\" (UID: \"0b1d2452-5873-42b8-9c28-a00cb39bc927\") " pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.662266 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-e690-account-create-update-xnq4k"] Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.663471 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.665539 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.674092 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e690-account-create-update-xnq4k"] Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.703007 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rswgn\" (UniqueName: \"kubernetes.io/projected/0b1d2452-5873-42b8-9c28-a00cb39bc927-kube-api-access-rswgn\") pod \"glance-db-create-gm7kh\" (UID: \"0b1d2452-5873-42b8-9c28-a00cb39bc927\") " pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.703060 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dvw\" (UniqueName: \"kubernetes.io/projected/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-kube-api-access-k4dvw\") pod \"glance-e690-account-create-update-xnq4k\" (UID: \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\") " pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.703188 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-operator-scripts\") pod \"glance-e690-account-create-update-xnq4k\" (UID: \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\") " pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.703234 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1d2452-5873-42b8-9c28-a00cb39bc927-operator-scripts\") pod \"glance-db-create-gm7kh\" (UID: 
\"0b1d2452-5873-42b8-9c28-a00cb39bc927\") " pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.705415 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1d2452-5873-42b8-9c28-a00cb39bc927-operator-scripts\") pod \"glance-db-create-gm7kh\" (UID: \"0b1d2452-5873-42b8-9c28-a00cb39bc927\") " pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.725041 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rswgn\" (UniqueName: \"kubernetes.io/projected/0b1d2452-5873-42b8-9c28-a00cb39bc927-kube-api-access-rswgn\") pod \"glance-db-create-gm7kh\" (UID: \"0b1d2452-5873-42b8-9c28-a00cb39bc927\") " pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.804814 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dvw\" (UniqueName: \"kubernetes.io/projected/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-kube-api-access-k4dvw\") pod \"glance-e690-account-create-update-xnq4k\" (UID: \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\") " pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.805210 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-operator-scripts\") pod \"glance-e690-account-create-update-xnq4k\" (UID: \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\") " pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.806353 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-operator-scripts\") pod \"glance-e690-account-create-update-xnq4k\" (UID: 
\"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\") " pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.826403 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dvw\" (UniqueName: \"kubernetes.io/projected/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-kube-api-access-k4dvw\") pod \"glance-e690-account-create-update-xnq4k\" (UID: \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\") " pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.894941 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:24 crc kubenswrapper[4917]: I0318 08:18:24.983428 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:25 crc kubenswrapper[4917]: I0318 08:18:25.296543 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-e690-account-create-update-xnq4k"] Mar 18 08:18:25 crc kubenswrapper[4917]: W0318 08:18:25.298415 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82e0e08f_d0a3_4279_b848_c9a7e6c1e62b.slice/crio-dcd5d5e8c62511281e974c32a70326f978afe833efbfc6c3973979ec02961fb2 WatchSource:0}: Error finding container dcd5d5e8c62511281e974c32a70326f978afe833efbfc6c3973979ec02961fb2: Status 404 returned error can't find the container with id dcd5d5e8c62511281e974c32a70326f978afe833efbfc6c3973979ec02961fb2 Mar 18 08:18:25 crc kubenswrapper[4917]: I0318 08:18:25.425651 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gm7kh"] Mar 18 08:18:25 crc kubenswrapper[4917]: W0318 08:18:25.437844 4917 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b1d2452_5873_42b8_9c28_a00cb39bc927.slice/crio-2a8dff9f799f3e321c3086384cd81bf9f83c94345cd108a0b7a0cf34f717bd08 WatchSource:0}: Error finding container 2a8dff9f799f3e321c3086384cd81bf9f83c94345cd108a0b7a0cf34f717bd08: Status 404 returned error can't find the container with id 2a8dff9f799f3e321c3086384cd81bf9f83c94345cd108a0b7a0cf34f717bd08 Mar 18 08:18:25 crc kubenswrapper[4917]: I0318 08:18:25.790946 4917 generic.go:334] "Generic (PLEG): container finished" podID="82e0e08f-d0a3-4279-b848-c9a7e6c1e62b" containerID="656c593bede78f589afef3d6f8137261b7ec3a8c8d9d0914356baf2172bd0b58" exitCode=0 Mar 18 08:18:25 crc kubenswrapper[4917]: I0318 08:18:25.791126 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e690-account-create-update-xnq4k" event={"ID":"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b","Type":"ContainerDied","Data":"656c593bede78f589afef3d6f8137261b7ec3a8c8d9d0914356baf2172bd0b58"} Mar 18 08:18:25 crc kubenswrapper[4917]: I0318 08:18:25.791157 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e690-account-create-update-xnq4k" event={"ID":"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b","Type":"ContainerStarted","Data":"dcd5d5e8c62511281e974c32a70326f978afe833efbfc6c3973979ec02961fb2"} Mar 18 08:18:25 crc kubenswrapper[4917]: I0318 08:18:25.793048 4917 generic.go:334] "Generic (PLEG): container finished" podID="0b1d2452-5873-42b8-9c28-a00cb39bc927" containerID="09387f4abe131d4c4d1ea2f47aa2b51b59892523e42bc89acbb47cfc7db3d531" exitCode=0 Mar 18 08:18:25 crc kubenswrapper[4917]: I0318 08:18:25.793072 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gm7kh" event={"ID":"0b1d2452-5873-42b8-9c28-a00cb39bc927","Type":"ContainerDied","Data":"09387f4abe131d4c4d1ea2f47aa2b51b59892523e42bc89acbb47cfc7db3d531"} Mar 18 08:18:25 crc kubenswrapper[4917]: I0318 08:18:25.793087 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-gm7kh" event={"ID":"0b1d2452-5873-42b8-9c28-a00cb39bc927","Type":"ContainerStarted","Data":"2a8dff9f799f3e321c3086384cd81bf9f83c94345cd108a0b7a0cf34f717bd08"} Mar 18 08:18:26 crc kubenswrapper[4917]: I0318 08:18:26.772287 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:18:26 crc kubenswrapper[4917]: E0318 08:18:26.772869 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.187288 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.193079 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.253851 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rswgn\" (UniqueName: \"kubernetes.io/projected/0b1d2452-5873-42b8-9c28-a00cb39bc927-kube-api-access-rswgn\") pod \"0b1d2452-5873-42b8-9c28-a00cb39bc927\" (UID: \"0b1d2452-5873-42b8-9c28-a00cb39bc927\") " Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.253930 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4dvw\" (UniqueName: \"kubernetes.io/projected/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-kube-api-access-k4dvw\") pod \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\" (UID: \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\") " Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.253960 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1d2452-5873-42b8-9c28-a00cb39bc927-operator-scripts\") pod \"0b1d2452-5873-42b8-9c28-a00cb39bc927\" (UID: \"0b1d2452-5873-42b8-9c28-a00cb39bc927\") " Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.253983 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-operator-scripts\") pod \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\" (UID: \"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b\") " Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.254975 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82e0e08f-d0a3-4279-b848-c9a7e6c1e62b" (UID: "82e0e08f-d0a3-4279-b848-c9a7e6c1e62b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.254991 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b1d2452-5873-42b8-9c28-a00cb39bc927-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b1d2452-5873-42b8-9c28-a00cb39bc927" (UID: "0b1d2452-5873-42b8-9c28-a00cb39bc927"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.259866 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-kube-api-access-k4dvw" (OuterVolumeSpecName: "kube-api-access-k4dvw") pod "82e0e08f-d0a3-4279-b848-c9a7e6c1e62b" (UID: "82e0e08f-d0a3-4279-b848-c9a7e6c1e62b"). InnerVolumeSpecName "kube-api-access-k4dvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.259981 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1d2452-5873-42b8-9c28-a00cb39bc927-kube-api-access-rswgn" (OuterVolumeSpecName: "kube-api-access-rswgn") pod "0b1d2452-5873-42b8-9c28-a00cb39bc927" (UID: "0b1d2452-5873-42b8-9c28-a00cb39bc927"). InnerVolumeSpecName "kube-api-access-rswgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.355561 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rswgn\" (UniqueName: \"kubernetes.io/projected/0b1d2452-5873-42b8-9c28-a00cb39bc927-kube-api-access-rswgn\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.355614 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4dvw\" (UniqueName: \"kubernetes.io/projected/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-kube-api-access-k4dvw\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.355625 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b1d2452-5873-42b8-9c28-a00cb39bc927-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.355633 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.817851 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-e690-account-create-update-xnq4k" event={"ID":"82e0e08f-d0a3-4279-b848-c9a7e6c1e62b","Type":"ContainerDied","Data":"dcd5d5e8c62511281e974c32a70326f978afe833efbfc6c3973979ec02961fb2"} Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.818808 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcd5d5e8c62511281e974c32a70326f978afe833efbfc6c3973979ec02961fb2" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.818158 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-e690-account-create-update-xnq4k" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.820535 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gm7kh" event={"ID":"0b1d2452-5873-42b8-9c28-a00cb39bc927","Type":"ContainerDied","Data":"2a8dff9f799f3e321c3086384cd81bf9f83c94345cd108a0b7a0cf34f717bd08"} Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.820643 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a8dff9f799f3e321c3086384cd81bf9f83c94345cd108a0b7a0cf34f717bd08" Mar 18 08:18:27 crc kubenswrapper[4917]: I0318 08:18:27.820739 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gm7kh" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.843467 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-k7f54"] Mar 18 08:18:29 crc kubenswrapper[4917]: E0318 08:18:29.845185 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82e0e08f-d0a3-4279-b848-c9a7e6c1e62b" containerName="mariadb-account-create-update" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.845276 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="82e0e08f-d0a3-4279-b848-c9a7e6c1e62b" containerName="mariadb-account-create-update" Mar 18 08:18:29 crc kubenswrapper[4917]: E0318 08:18:29.845378 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b1d2452-5873-42b8-9c28-a00cb39bc927" containerName="mariadb-database-create" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.845474 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1d2452-5873-42b8-9c28-a00cb39bc927" containerName="mariadb-database-create" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.845771 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1d2452-5873-42b8-9c28-a00cb39bc927" containerName="mariadb-database-create" Mar 18 
08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.845871 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="82e0e08f-d0a3-4279-b848-c9a7e6c1e62b" containerName="mariadb-account-create-update" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.846680 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.848870 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9s7sz" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.849042 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.853269 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-k7f54"] Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.904289 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-combined-ca-bundle\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.904370 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nn94\" (UniqueName: \"kubernetes.io/projected/3c6bdbaf-275f-43d6-b502-37e885457649-kube-api-access-7nn94\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.904419 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-config-data\") pod \"glance-db-sync-k7f54\" (UID: 
\"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:29 crc kubenswrapper[4917]: I0318 08:18:29.904505 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-db-sync-config-data\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.006883 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-config-data\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.008344 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-db-sync-config-data\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.008402 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-combined-ca-bundle\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.008561 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nn94\" (UniqueName: \"kubernetes.io/projected/3c6bdbaf-275f-43d6-b502-37e885457649-kube-api-access-7nn94\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 
08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.012629 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-db-sync-config-data\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.013100 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-config-data\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.014057 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-combined-ca-bundle\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.028351 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nn94\" (UniqueName: \"kubernetes.io/projected/3c6bdbaf-275f-43d6-b502-37e885457649-kube-api-access-7nn94\") pod \"glance-db-sync-k7f54\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.165366 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.748562 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-k7f54"] Mar 18 08:18:30 crc kubenswrapper[4917]: W0318 08:18:30.750030 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c6bdbaf_275f_43d6_b502_37e885457649.slice/crio-cf5fbe1126d69c275b4ccc3d8b2e66a61f7bb7c75047c8fd78e6d94102eda932 WatchSource:0}: Error finding container cf5fbe1126d69c275b4ccc3d8b2e66a61f7bb7c75047c8fd78e6d94102eda932: Status 404 returned error can't find the container with id cf5fbe1126d69c275b4ccc3d8b2e66a61f7bb7c75047c8fd78e6d94102eda932 Mar 18 08:18:30 crc kubenswrapper[4917]: I0318 08:18:30.845540 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k7f54" event={"ID":"3c6bdbaf-275f-43d6-b502-37e885457649","Type":"ContainerStarted","Data":"cf5fbe1126d69c275b4ccc3d8b2e66a61f7bb7c75047c8fd78e6d94102eda932"} Mar 18 08:18:41 crc kubenswrapper[4917]: I0318 08:18:41.773064 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:18:41 crc kubenswrapper[4917]: E0318 08:18:41.773924 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:18:52 crc kubenswrapper[4917]: I0318 08:18:52.772718 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:18:52 crc kubenswrapper[4917]: E0318 08:18:52.773859 4917 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:18:53 crc kubenswrapper[4917]: I0318 08:18:53.065621 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k7f54" event={"ID":"3c6bdbaf-275f-43d6-b502-37e885457649","Type":"ContainerStarted","Data":"8aeb64520d48c0a4cc6d7c6710c1fc281898f64f7a729e8b6529cb874e37ffa0"} Mar 18 08:18:53 crc kubenswrapper[4917]: I0318 08:18:53.087478 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-k7f54" podStartSLOduration=2.77827372 podStartE2EDuration="24.087455011s" podCreationTimestamp="2026-03-18 08:18:29 +0000 UTC" firstStartedPulling="2026-03-18 08:18:30.75199102 +0000 UTC m=+5495.693145754" lastFinishedPulling="2026-03-18 08:18:52.061172331 +0000 UTC m=+5517.002327045" observedRunningTime="2026-03-18 08:18:53.085860692 +0000 UTC m=+5518.027015416" watchObservedRunningTime="2026-03-18 08:18:53.087455011 +0000 UTC m=+5518.028609755" Mar 18 08:18:56 crc kubenswrapper[4917]: I0318 08:18:56.101136 4917 generic.go:334] "Generic (PLEG): container finished" podID="3c6bdbaf-275f-43d6-b502-37e885457649" containerID="8aeb64520d48c0a4cc6d7c6710c1fc281898f64f7a729e8b6529cb874e37ffa0" exitCode=0 Mar 18 08:18:56 crc kubenswrapper[4917]: I0318 08:18:56.101280 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k7f54" event={"ID":"3c6bdbaf-275f-43d6-b502-37e885457649","Type":"ContainerDied","Data":"8aeb64520d48c0a4cc6d7c6710c1fc281898f64f7a729e8b6529cb874e37ffa0"} Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.606525 4917 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.683770 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-combined-ca-bundle\") pod \"3c6bdbaf-275f-43d6-b502-37e885457649\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.684022 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-db-sync-config-data\") pod \"3c6bdbaf-275f-43d6-b502-37e885457649\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.684107 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-config-data\") pod \"3c6bdbaf-275f-43d6-b502-37e885457649\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.684149 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nn94\" (UniqueName: \"kubernetes.io/projected/3c6bdbaf-275f-43d6-b502-37e885457649-kube-api-access-7nn94\") pod \"3c6bdbaf-275f-43d6-b502-37e885457649\" (UID: \"3c6bdbaf-275f-43d6-b502-37e885457649\") " Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.689215 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3c6bdbaf-275f-43d6-b502-37e885457649" (UID: "3c6bdbaf-275f-43d6-b502-37e885457649"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.689297 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c6bdbaf-275f-43d6-b502-37e885457649-kube-api-access-7nn94" (OuterVolumeSpecName: "kube-api-access-7nn94") pod "3c6bdbaf-275f-43d6-b502-37e885457649" (UID: "3c6bdbaf-275f-43d6-b502-37e885457649"). InnerVolumeSpecName "kube-api-access-7nn94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.737934 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c6bdbaf-275f-43d6-b502-37e885457649" (UID: "3c6bdbaf-275f-43d6-b502-37e885457649"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.775917 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-config-data" (OuterVolumeSpecName: "config-data") pod "3c6bdbaf-275f-43d6-b502-37e885457649" (UID: "3c6bdbaf-275f-43d6-b502-37e885457649"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.786985 4917 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.787144 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.787238 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nn94\" (UniqueName: \"kubernetes.io/projected/3c6bdbaf-275f-43d6-b502-37e885457649-kube-api-access-7nn94\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:57 crc kubenswrapper[4917]: I0318 08:18:57.787308 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c6bdbaf-275f-43d6-b502-37e885457649-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.126688 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-k7f54" event={"ID":"3c6bdbaf-275f-43d6-b502-37e885457649","Type":"ContainerDied","Data":"cf5fbe1126d69c275b4ccc3d8b2e66a61f7bb7c75047c8fd78e6d94102eda932"} Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.126733 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf5fbe1126d69c275b4ccc3d8b2e66a61f7bb7c75047c8fd78e6d94102eda932" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.127506 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-k7f54" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.448776 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:18:58 crc kubenswrapper[4917]: E0318 08:18:58.449347 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c6bdbaf-275f-43d6-b502-37e885457649" containerName="glance-db-sync" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.449364 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c6bdbaf-275f-43d6-b502-37e885457649" containerName="glance-db-sync" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.449550 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c6bdbaf-275f-43d6-b502-37e885457649" containerName="glance-db-sync" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.450431 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.454079 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9s7sz" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.455683 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.456363 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.480096 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.577200 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d8db65bf-qn8c9"] Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.580323 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.604635 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d8db65bf-qn8c9"] Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.608964 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.609015 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-logs\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.609047 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-config-data\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.609061 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.609249 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-ccrhj\" (UniqueName: \"kubernetes.io/projected/902580f4-80d0-4299-8370-f42c118ce4e8-kube-api-access-ccrhj\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.609376 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-scripts\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.671009 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.672404 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.675705 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.682449 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711255 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccrhj\" (UniqueName: \"kubernetes.io/projected/902580f4-80d0-4299-8370-f42c118ce4e8-kube-api-access-ccrhj\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711291 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-nb\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711327 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-dns-svc\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711355 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-scripts\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711397 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-config\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711453 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711475 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-logs\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711497 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-sb\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711513 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnsnj\" (UniqueName: \"kubernetes.io/projected/3dac1607-cb12-4d64-aef7-321e74512588-kube-api-access-bnsnj\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711562 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-config-data\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.711588 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.712051 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.713175 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-logs\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.717577 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-scripts\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.718808 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.720256 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-config-data\") pod \"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.729380 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccrhj\" (UniqueName: \"kubernetes.io/projected/902580f4-80d0-4299-8370-f42c118ce4e8-kube-api-access-ccrhj\") pod 
\"glance-default-external-api-0\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.769415 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.813919 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-nb\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.813984 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-dns-svc\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814022 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814050 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814074 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-config\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814096 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6xs\" (UniqueName: \"kubernetes.io/projected/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-kube-api-access-vf6xs\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814144 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814164 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-logs\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814187 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814505 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-sb\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.814528 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnsnj\" (UniqueName: \"kubernetes.io/projected/3dac1607-cb12-4d64-aef7-321e74512588-kube-api-access-bnsnj\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.815731 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-nb\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.816229 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-dns-svc\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.818222 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-config\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.818816 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-sb\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.833012 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnsnj\" (UniqueName: \"kubernetes.io/projected/3dac1607-cb12-4d64-aef7-321e74512588-kube-api-access-bnsnj\") pod \"dnsmasq-dns-86d8db65bf-qn8c9\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.899177 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.915953 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.916015 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.916096 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6xs\" (UniqueName: \"kubernetes.io/projected/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-kube-api-access-vf6xs\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc 
kubenswrapper[4917]: I0318 08:18:58.916161 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.916183 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-logs\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.916208 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.921977 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.923413 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.923875 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.924080 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-logs\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.942148 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6xs\" (UniqueName: \"kubernetes.io/projected/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-kube-api-access-vf6xs\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:58 crc kubenswrapper[4917]: I0318 08:18:58.943739 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:18:59 crc kubenswrapper[4917]: I0318 08:18:59.178943 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:18:59 crc kubenswrapper[4917]: I0318 08:18:59.339350 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:18:59 crc kubenswrapper[4917]: I0318 08:18:59.431010 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d8db65bf-qn8c9"] Mar 18 08:18:59 crc kubenswrapper[4917]: W0318 08:18:59.452043 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dac1607_cb12_4d64_aef7_321e74512588.slice/crio-df3aad02ea7b05b1f90d0335d5bc77c7e24255208e212b6df0448a0d977bbd32 WatchSource:0}: Error finding container df3aad02ea7b05b1f90d0335d5bc77c7e24255208e212b6df0448a0d977bbd32: Status 404 returned error can't find the container with id df3aad02ea7b05b1f90d0335d5bc77c7e24255208e212b6df0448a0d977bbd32 Mar 18 08:18:59 crc kubenswrapper[4917]: I0318 08:18:59.555374 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:18:59 crc kubenswrapper[4917]: I0318 08:18:59.580015 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:19:00 crc kubenswrapper[4917]: I0318 08:19:00.154936 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"902580f4-80d0-4299-8370-f42c118ce4e8","Type":"ContainerStarted","Data":"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a"} Mar 18 08:19:00 crc kubenswrapper[4917]: I0318 08:19:00.155447 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"902580f4-80d0-4299-8370-f42c118ce4e8","Type":"ContainerStarted","Data":"6715152588f1550af2465ac6ee23f239eebb2a212588bdb3712f57d999df20e1"} Mar 18 08:19:00 crc kubenswrapper[4917]: I0318 08:19:00.161513 4917 generic.go:334] "Generic 
(PLEG): container finished" podID="3dac1607-cb12-4d64-aef7-321e74512588" containerID="f9bec35a8a95a2f42835913615c45960b3d71fda506b83f97f9f143b30919f3e" exitCode=0 Mar 18 08:19:00 crc kubenswrapper[4917]: I0318 08:19:00.161680 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" event={"ID":"3dac1607-cb12-4d64-aef7-321e74512588","Type":"ContainerDied","Data":"f9bec35a8a95a2f42835913615c45960b3d71fda506b83f97f9f143b30919f3e"} Mar 18 08:19:00 crc kubenswrapper[4917]: I0318 08:19:00.161720 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" event={"ID":"3dac1607-cb12-4d64-aef7-321e74512588","Type":"ContainerStarted","Data":"df3aad02ea7b05b1f90d0335d5bc77c7e24255208e212b6df0448a0d977bbd32"} Mar 18 08:19:00 crc kubenswrapper[4917]: I0318 08:19:00.164626 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a84de3d8-5b6a-4f65-9882-b0ba6f9228be","Type":"ContainerStarted","Data":"7591e035c56b343cb29bc4ec44bd0697dd3ba20b656cb50e0dfdbaf607b2046e"} Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.001975 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.173366 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"902580f4-80d0-4299-8370-f42c118ce4e8","Type":"ContainerStarted","Data":"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3"} Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.173806 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" containerName="glance-httpd" containerID="cri-o://bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3" gracePeriod=30 Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 
08:19:01.173764 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" containerName="glance-log" containerID="cri-o://a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a" gracePeriod=30 Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.178660 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" event={"ID":"3dac1607-cb12-4d64-aef7-321e74512588","Type":"ContainerStarted","Data":"fbf4a49cf2ccf29f65fc784a461789ed30313cbb669fc1751c34d5b7c02a9989"} Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.179492 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.182985 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a84de3d8-5b6a-4f65-9882-b0ba6f9228be","Type":"ContainerStarted","Data":"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f"} Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.183021 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a84de3d8-5b6a-4f65-9882-b0ba6f9228be","Type":"ContainerStarted","Data":"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4"} Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.219354 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.219329258 podStartE2EDuration="3.219329258s" podCreationTimestamp="2026-03-18 08:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:01.193985334 +0000 UTC m=+5526.135140038" watchObservedRunningTime="2026-03-18 08:19:01.219329258 +0000 UTC 
m=+5526.160483992" Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.223532 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" podStartSLOduration=3.22351163 podStartE2EDuration="3.22351163s" podCreationTimestamp="2026-03-18 08:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:01.211331954 +0000 UTC m=+5526.152486678" watchObservedRunningTime="2026-03-18 08:19:01.22351163 +0000 UTC m=+5526.164666334" Mar 18 08:19:01 crc kubenswrapper[4917]: I0318 08:19:01.243358 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.24333996 podStartE2EDuration="3.24333996s" podCreationTimestamp="2026-03-18 08:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:01.234862585 +0000 UTC m=+5526.176017309" watchObservedRunningTime="2026-03-18 08:19:01.24333996 +0000 UTC m=+5526.184494684" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.018860 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.108366 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-scripts\") pod \"902580f4-80d0-4299-8370-f42c118ce4e8\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.108419 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccrhj\" (UniqueName: \"kubernetes.io/projected/902580f4-80d0-4299-8370-f42c118ce4e8-kube-api-access-ccrhj\") pod \"902580f4-80d0-4299-8370-f42c118ce4e8\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.108468 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-config-data\") pod \"902580f4-80d0-4299-8370-f42c118ce4e8\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.108500 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-logs\") pod \"902580f4-80d0-4299-8370-f42c118ce4e8\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.108548 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-httpd-run\") pod \"902580f4-80d0-4299-8370-f42c118ce4e8\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.108568 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-combined-ca-bundle\") pod \"902580f4-80d0-4299-8370-f42c118ce4e8\" (UID: \"902580f4-80d0-4299-8370-f42c118ce4e8\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.110328 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-logs" (OuterVolumeSpecName: "logs") pod "902580f4-80d0-4299-8370-f42c118ce4e8" (UID: "902580f4-80d0-4299-8370-f42c118ce4e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.110521 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "902580f4-80d0-4299-8370-f42c118ce4e8" (UID: "902580f4-80d0-4299-8370-f42c118ce4e8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.116735 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-scripts" (OuterVolumeSpecName: "scripts") pod "902580f4-80d0-4299-8370-f42c118ce4e8" (UID: "902580f4-80d0-4299-8370-f42c118ce4e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.120356 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902580f4-80d0-4299-8370-f42c118ce4e8-kube-api-access-ccrhj" (OuterVolumeSpecName: "kube-api-access-ccrhj") pod "902580f4-80d0-4299-8370-f42c118ce4e8" (UID: "902580f4-80d0-4299-8370-f42c118ce4e8"). InnerVolumeSpecName "kube-api-access-ccrhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.151450 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "902580f4-80d0-4299-8370-f42c118ce4e8" (UID: "902580f4-80d0-4299-8370-f42c118ce4e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.182409 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-config-data" (OuterVolumeSpecName: "config-data") pod "902580f4-80d0-4299-8370-f42c118ce4e8" (UID: "902580f4-80d0-4299-8370-f42c118ce4e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.198013 4917 generic.go:334] "Generic (PLEG): container finished" podID="902580f4-80d0-4299-8370-f42c118ce4e8" containerID="bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3" exitCode=0 Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.198045 4917 generic.go:334] "Generic (PLEG): container finished" podID="902580f4-80d0-4299-8370-f42c118ce4e8" containerID="a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a" exitCode=143 Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.198923 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.204784 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"902580f4-80d0-4299-8370-f42c118ce4e8","Type":"ContainerDied","Data":"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3"} Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.204868 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"902580f4-80d0-4299-8370-f42c118ce4e8","Type":"ContainerDied","Data":"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a"} Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.204890 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"902580f4-80d0-4299-8370-f42c118ce4e8","Type":"ContainerDied","Data":"6715152588f1550af2465ac6ee23f239eebb2a212588bdb3712f57d999df20e1"} Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.204911 4917 scope.go:117] "RemoveContainer" containerID="bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.204959 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerName="glance-log" containerID="cri-o://91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4" gracePeriod=30 Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.205068 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerName="glance-httpd" containerID="cri-o://6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f" gracePeriod=30 Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.212127 4917 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.212156 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccrhj\" (UniqueName: \"kubernetes.io/projected/902580f4-80d0-4299-8370-f42c118ce4e8-kube-api-access-ccrhj\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.212168 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.212180 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.212187 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/902580f4-80d0-4299-8370-f42c118ce4e8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.212195 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902580f4-80d0-4299-8370-f42c118ce4e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.257204 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.260696 4917 scope.go:117] "RemoveContainer" containerID="a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.266740 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.278325 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:19:02 crc kubenswrapper[4917]: E0318 08:19:02.278769 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" containerName="glance-httpd" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.278792 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" containerName="glance-httpd" Mar 18 08:19:02 crc kubenswrapper[4917]: E0318 08:19:02.278813 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" containerName="glance-log" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.278820 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" containerName="glance-log" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.278985 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" containerName="glance-log" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.279005 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" containerName="glance-httpd" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.279935 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.281993 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.283907 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.286368 4917 scope.go:117] "RemoveContainer" containerID="bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3" Mar 18 08:19:02 crc kubenswrapper[4917]: E0318 08:19:02.286880 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3\": container with ID starting with bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3 not found: ID does not exist" containerID="bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.286974 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3"} err="failed to get container status \"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3\": rpc error: code = NotFound desc = could not find container \"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3\": container with ID starting with bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3 not found: ID does not exist" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.287097 4917 scope.go:117] "RemoveContainer" containerID="a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a" Mar 18 08:19:02 crc kubenswrapper[4917]: E0318 08:19:02.287457 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a\": container with ID starting with a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a not found: ID does not exist" containerID="a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.287492 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a"} err="failed to get container status \"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a\": rpc error: code = NotFound desc = could not find container \"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a\": container with ID starting with a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a not found: ID does not exist" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.287531 4917 scope.go:117] "RemoveContainer" containerID="bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.287957 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3"} err="failed to get container status \"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3\": rpc error: code = NotFound desc = could not find container \"bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3\": container with ID starting with bf407f9b1e7f268b36a6f72cb2fc98fdaccef93b70fc7182fdf3bcd55c57a3f3 not found: ID does not exist" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.288001 4917 scope.go:117] "RemoveContainer" containerID="a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.288275 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a"} err="failed to get container status \"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a\": rpc error: code = NotFound desc = could not find container \"a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a\": container with ID starting with a79837e628f74f94653766e014d55888ea67505732ed740977039e17e3b2718a not found: ID does not exist" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.293416 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.415478 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-scripts\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.415786 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.415832 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbcg9\" (UniqueName: \"kubernetes.io/projected/047f2cb4-a55b-41cf-88b0-34a29702ec38-kube-api-access-sbcg9\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.415852 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-logs\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.415877 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.415892 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.415937 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-config-data\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.517668 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-config-data\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.517760 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-scripts\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.517792 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.517833 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbcg9\" (UniqueName: \"kubernetes.io/projected/047f2cb4-a55b-41cf-88b0-34a29702ec38-kube-api-access-sbcg9\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.517850 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-logs\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.517877 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.517894 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.519512 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-logs\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.519776 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.527775 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-scripts\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.528283 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.528286 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.529438 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-config-data\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.543994 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbcg9\" (UniqueName: \"kubernetes.io/projected/047f2cb4-a55b-41cf-88b0-34a29702ec38-kube-api-access-sbcg9\") pod \"glance-default-external-api-0\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.605157 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.801503 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.924604 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-httpd-run\") pod \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.924676 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-combined-ca-bundle\") pod \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.924729 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf6xs\" (UniqueName: \"kubernetes.io/projected/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-kube-api-access-vf6xs\") pod \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.924840 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-scripts\") pod \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.924891 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-config-data\") pod \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.924957 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-logs\") pod \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\" (UID: \"a84de3d8-5b6a-4f65-9882-b0ba6f9228be\") " Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.926050 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-logs" (OuterVolumeSpecName: "logs") pod "a84de3d8-5b6a-4f65-9882-b0ba6f9228be" (UID: "a84de3d8-5b6a-4f65-9882-b0ba6f9228be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.926614 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a84de3d8-5b6a-4f65-9882-b0ba6f9228be" (UID: "a84de3d8-5b6a-4f65-9882-b0ba6f9228be"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.928771 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.928801 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.929964 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-scripts" (OuterVolumeSpecName: "scripts") pod "a84de3d8-5b6a-4f65-9882-b0ba6f9228be" (UID: "a84de3d8-5b6a-4f65-9882-b0ba6f9228be"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.930145 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-kube-api-access-vf6xs" (OuterVolumeSpecName: "kube-api-access-vf6xs") pod "a84de3d8-5b6a-4f65-9882-b0ba6f9228be" (UID: "a84de3d8-5b6a-4f65-9882-b0ba6f9228be"). InnerVolumeSpecName "kube-api-access-vf6xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.951535 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a84de3d8-5b6a-4f65-9882-b0ba6f9228be" (UID: "a84de3d8-5b6a-4f65-9882-b0ba6f9228be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:02 crc kubenswrapper[4917]: I0318 08:19:02.981354 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-config-data" (OuterVolumeSpecName: "config-data") pod "a84de3d8-5b6a-4f65-9882-b0ba6f9228be" (UID: "a84de3d8-5b6a-4f65-9882-b0ba6f9228be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.031025 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.031069 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.031086 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.031100 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf6xs\" (UniqueName: \"kubernetes.io/projected/a84de3d8-5b6a-4f65-9882-b0ba6f9228be-kube-api-access-vf6xs\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.209548 4917 generic.go:334] "Generic (PLEG): container finished" podID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerID="6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f" exitCode=0 Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.209937 4917 generic.go:334] "Generic (PLEG): container finished" podID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerID="91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4" exitCode=143 Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.209615 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a84de3d8-5b6a-4f65-9882-b0ba6f9228be","Type":"ContainerDied","Data":"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f"} Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 
08:19:03.209662 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.210079 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a84de3d8-5b6a-4f65-9882-b0ba6f9228be","Type":"ContainerDied","Data":"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4"} Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.210160 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a84de3d8-5b6a-4f65-9882-b0ba6f9228be","Type":"ContainerDied","Data":"7591e035c56b343cb29bc4ec44bd0697dd3ba20b656cb50e0dfdbaf607b2046e"} Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.210205 4917 scope.go:117] "RemoveContainer" containerID="6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.240931 4917 scope.go:117] "RemoveContainer" containerID="91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.242551 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.249832 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.260867 4917 scope.go:117] "RemoveContainer" containerID="6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f" Mar 18 08:19:03 crc kubenswrapper[4917]: E0318 08:19:03.261341 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f\": container with ID starting with 6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f not 
found: ID does not exist" containerID="6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.261374 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f"} err="failed to get container status \"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f\": rpc error: code = NotFound desc = could not find container \"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f\": container with ID starting with 6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f not found: ID does not exist" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.261396 4917 scope.go:117] "RemoveContainer" containerID="91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4" Mar 18 08:19:03 crc kubenswrapper[4917]: E0318 08:19:03.261749 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4\": container with ID starting with 91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4 not found: ID does not exist" containerID="91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.261772 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4"} err="failed to get container status \"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4\": rpc error: code = NotFound desc = could not find container \"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4\": container with ID starting with 91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4 not found: ID does not exist" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.261788 
4917 scope.go:117] "RemoveContainer" containerID="6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.262020 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f"} err="failed to get container status \"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f\": rpc error: code = NotFound desc = could not find container \"6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f\": container with ID starting with 6c8fc277be94535ea11798d33192b067e60a2610baa76a867431eac26172ff0f not found: ID does not exist" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.262038 4917 scope.go:117] "RemoveContainer" containerID="91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.262392 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4"} err="failed to get container status \"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4\": rpc error: code = NotFound desc = could not find container \"91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4\": container with ID starting with 91cce85150c78d058429826c0aff46890cddd8d24ec0ebb19d863a968a5021f4 not found: ID does not exist" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.268748 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:19:03 crc kubenswrapper[4917]: E0318 08:19:03.269197 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerName="glance-httpd" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.269219 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerName="glance-httpd" Mar 18 08:19:03 crc kubenswrapper[4917]: E0318 08:19:03.269242 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerName="glance-log" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.269251 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerName="glance-log" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.269475 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerName="glance-httpd" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.269511 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" containerName="glance-log" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.270724 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.273718 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.273892 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.282529 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.323086 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.440535 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.440707 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.440778 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prkpr\" (UniqueName: \"kubernetes.io/projected/265abbaf-11cf-4b9d-a72f-5265a57d02e1-kube-api-access-prkpr\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.441952 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.441981 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.442030 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.442081 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.544010 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prkpr\" (UniqueName: \"kubernetes.io/projected/265abbaf-11cf-4b9d-a72f-5265a57d02e1-kube-api-access-prkpr\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.544482 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.544510 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.544697 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.544742 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.544784 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-logs\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.544813 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.545165 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.545711 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.553669 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.553707 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.556870 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.557384 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.564438 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prkpr\" (UniqueName: \"kubernetes.io/projected/265abbaf-11cf-4b9d-a72f-5265a57d02e1-kube-api-access-prkpr\") pod \"glance-default-internal-api-0\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " 
pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.670153 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.797170 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902580f4-80d0-4299-8370-f42c118ce4e8" path="/var/lib/kubelet/pods/902580f4-80d0-4299-8370-f42c118ce4e8/volumes" Mar 18 08:19:03 crc kubenswrapper[4917]: I0318 08:19:03.798427 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84de3d8-5b6a-4f65-9882-b0ba6f9228be" path="/var/lib/kubelet/pods/a84de3d8-5b6a-4f65-9882-b0ba6f9228be/volumes" Mar 18 08:19:04 crc kubenswrapper[4917]: I0318 08:19:04.178851 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:19:04 crc kubenswrapper[4917]: W0318 08:19:04.199184 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod265abbaf_11cf_4b9d_a72f_5265a57d02e1.slice/crio-42738e941d36fba83f410ca2c60b873c919eab08fe3424846281353ca6a7d155 WatchSource:0}: Error finding container 42738e941d36fba83f410ca2c60b873c919eab08fe3424846281353ca6a7d155: Status 404 returned error can't find the container with id 42738e941d36fba83f410ca2c60b873c919eab08fe3424846281353ca6a7d155 Mar 18 08:19:04 crc kubenswrapper[4917]: I0318 08:19:04.221074 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"047f2cb4-a55b-41cf-88b0-34a29702ec38","Type":"ContainerStarted","Data":"dbfa0f81af6e37352345c1c0ef2ff8f37746bca03d8aed509c46c4bd330e4e42"} Mar 18 08:19:04 crc kubenswrapper[4917]: I0318 08:19:04.221123 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"047f2cb4-a55b-41cf-88b0-34a29702ec38","Type":"ContainerStarted","Data":"05fa3c47a0985037f4a4d39835b76c02c47f9ec8d76a533076b03f8155eef3c3"} Mar 18 08:19:04 crc kubenswrapper[4917]: I0318 08:19:04.222700 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"265abbaf-11cf-4b9d-a72f-5265a57d02e1","Type":"ContainerStarted","Data":"42738e941d36fba83f410ca2c60b873c919eab08fe3424846281353ca6a7d155"} Mar 18 08:19:04 crc kubenswrapper[4917]: I0318 08:19:04.772807 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:19:04 crc kubenswrapper[4917]: E0318 08:19:04.773390 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:19:05 crc kubenswrapper[4917]: I0318 08:19:05.234992 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"265abbaf-11cf-4b9d-a72f-5265a57d02e1","Type":"ContainerStarted","Data":"324fcb1238a9c314e1fd7edcc6d35401edd8a86c3db980db17d27318db0e4249"} Mar 18 08:19:05 crc kubenswrapper[4917]: I0318 08:19:05.243011 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"047f2cb4-a55b-41cf-88b0-34a29702ec38","Type":"ContainerStarted","Data":"6473b0f3aac70b88f9abd869786c1a73a81aeff81ef68e26ab317d1f4cc5443d"} Mar 18 08:19:05 crc kubenswrapper[4917]: I0318 08:19:05.267186 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.267165398 
podStartE2EDuration="3.267165398s" podCreationTimestamp="2026-03-18 08:19:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:05.262093905 +0000 UTC m=+5530.203248619" watchObservedRunningTime="2026-03-18 08:19:05.267165398 +0000 UTC m=+5530.208320122" Mar 18 08:19:06 crc kubenswrapper[4917]: I0318 08:19:06.261758 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"265abbaf-11cf-4b9d-a72f-5265a57d02e1","Type":"ContainerStarted","Data":"569cb979ff152b171b9c7fd88c371de8e2d06dea0eae265c59852e9973bfc260"} Mar 18 08:19:06 crc kubenswrapper[4917]: I0318 08:19:06.292738 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.29270949 podStartE2EDuration="3.29270949s" podCreationTimestamp="2026-03-18 08:19:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:06.286853107 +0000 UTC m=+5531.228007821" watchObservedRunningTime="2026-03-18 08:19:06.29270949 +0000 UTC m=+5531.233864194" Mar 18 08:19:08 crc kubenswrapper[4917]: I0318 08:19:08.901696 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:19:08 crc kubenswrapper[4917]: I0318 08:19:08.993100 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb7f96c85-6pxj5"] Mar 18 08:19:08 crc kubenswrapper[4917]: I0318 08:19:08.993351 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" podUID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" containerName="dnsmasq-dns" containerID="cri-o://19790dc3b98925c739c07c127221b9c4e5a900f0e816310447cbc3162481e2b1" gracePeriod=10 Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 
08:19:09.297553 4917 generic.go:334] "Generic (PLEG): container finished" podID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" containerID="19790dc3b98925c739c07c127221b9c4e5a900f0e816310447cbc3162481e2b1" exitCode=0 Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.297742 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" event={"ID":"c0ea0608-3bbb-463a-99ac-d54f4736fc1b","Type":"ContainerDied","Data":"19790dc3b98925c739c07c127221b9c4e5a900f0e816310447cbc3162481e2b1"} Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.456757 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.575858 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9xwq\" (UniqueName: \"kubernetes.io/projected/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-kube-api-access-s9xwq\") pod \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.576020 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-sb\") pod \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.576070 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-config\") pod \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.576101 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-dns-svc\") pod \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.576176 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-nb\") pod \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\" (UID: \"c0ea0608-3bbb-463a-99ac-d54f4736fc1b\") " Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.589245 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-kube-api-access-s9xwq" (OuterVolumeSpecName: "kube-api-access-s9xwq") pod "c0ea0608-3bbb-463a-99ac-d54f4736fc1b" (UID: "c0ea0608-3bbb-463a-99ac-d54f4736fc1b"). InnerVolumeSpecName "kube-api-access-s9xwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.623429 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-config" (OuterVolumeSpecName: "config") pod "c0ea0608-3bbb-463a-99ac-d54f4736fc1b" (UID: "c0ea0608-3bbb-463a-99ac-d54f4736fc1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.629210 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0ea0608-3bbb-463a-99ac-d54f4736fc1b" (UID: "c0ea0608-3bbb-463a-99ac-d54f4736fc1b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.630376 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0ea0608-3bbb-463a-99ac-d54f4736fc1b" (UID: "c0ea0608-3bbb-463a-99ac-d54f4736fc1b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.640201 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0ea0608-3bbb-463a-99ac-d54f4736fc1b" (UID: "c0ea0608-3bbb-463a-99ac-d54f4736fc1b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.678163 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9xwq\" (UniqueName: \"kubernetes.io/projected/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-kube-api-access-s9xwq\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.678206 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.678221 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:09 crc kubenswrapper[4917]: I0318 08:19:09.678236 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:09 crc 
kubenswrapper[4917]: I0318 08:19:09.678248 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0ea0608-3bbb-463a-99ac-d54f4736fc1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:10 crc kubenswrapper[4917]: I0318 08:19:10.309554 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" event={"ID":"c0ea0608-3bbb-463a-99ac-d54f4736fc1b","Type":"ContainerDied","Data":"1b4719fd7d503defdd18095300bfbbe03c2bf8be5a35652d55bf1ab7af9a129d"} Mar 18 08:19:10 crc kubenswrapper[4917]: I0318 08:19:10.309634 4917 scope.go:117] "RemoveContainer" containerID="19790dc3b98925c739c07c127221b9c4e5a900f0e816310447cbc3162481e2b1" Mar 18 08:19:10 crc kubenswrapper[4917]: I0318 08:19:10.309653 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb7f96c85-6pxj5" Mar 18 08:19:10 crc kubenswrapper[4917]: I0318 08:19:10.345437 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cb7f96c85-6pxj5"] Mar 18 08:19:10 crc kubenswrapper[4917]: I0318 08:19:10.345828 4917 scope.go:117] "RemoveContainer" containerID="aa9083ff473a50ffe0898142f884b794e03493a6d7f5861037d96c74decb51ad" Mar 18 08:19:10 crc kubenswrapper[4917]: I0318 08:19:10.368641 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cb7f96c85-6pxj5"] Mar 18 08:19:11 crc kubenswrapper[4917]: I0318 08:19:11.782662 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" path="/var/lib/kubelet/pods/c0ea0608-3bbb-463a-99ac-d54f4736fc1b/volumes" Mar 18 08:19:12 crc kubenswrapper[4917]: I0318 08:19:12.606636 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 08:19:12 crc kubenswrapper[4917]: I0318 08:19:12.606691 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 08:19:12 crc kubenswrapper[4917]: I0318 08:19:12.650855 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 08:19:12 crc kubenswrapper[4917]: I0318 08:19:12.668247 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 08:19:13 crc kubenswrapper[4917]: I0318 08:19:13.346443 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 08:19:13 crc kubenswrapper[4917]: I0318 08:19:13.348531 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 08:19:13 crc kubenswrapper[4917]: I0318 08:19:13.671228 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:13 crc kubenswrapper[4917]: I0318 08:19:13.671279 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:13 crc kubenswrapper[4917]: I0318 08:19:13.713815 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:13 crc kubenswrapper[4917]: I0318 08:19:13.743546 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:14 crc kubenswrapper[4917]: I0318 08:19:14.357552 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:14 crc kubenswrapper[4917]: I0318 08:19:14.357763 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:15 crc kubenswrapper[4917]: I0318 08:19:15.257250 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 08:19:15 crc kubenswrapper[4917]: I0318 08:19:15.278372 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 08:19:16 crc kubenswrapper[4917]: I0318 08:19:16.222125 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:16 crc kubenswrapper[4917]: I0318 08:19:16.244064 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 08:19:16 crc kubenswrapper[4917]: I0318 08:19:16.773090 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:19:16 crc kubenswrapper[4917]: E0318 08:19:16.773808 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.248113 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-vmff7"] Mar 18 08:19:26 crc kubenswrapper[4917]: E0318 08:19:26.249066 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" containerName="init" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.249084 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" containerName="init" Mar 18 08:19:26 crc kubenswrapper[4917]: E0318 08:19:26.249108 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" containerName="dnsmasq-dns" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.249117 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" containerName="dnsmasq-dns" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.249326 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ea0608-3bbb-463a-99ac-d54f4736fc1b" containerName="dnsmasq-dns" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.250095 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-vmff7" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.254988 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vmff7"] Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.351766 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3ec6-account-create-update-mg7k5"] Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.374461 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3ec6-account-create-update-mg7k5"] Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.374597 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.377321 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.416817 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ba4b545-6574-461b-88fb-11801374835a-operator-scripts\") pod \"placement-db-create-vmff7\" (UID: \"6ba4b545-6574-461b-88fb-11801374835a\") " pod="openstack/placement-db-create-vmff7" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.416935 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d9rp\" (UniqueName: \"kubernetes.io/projected/6ba4b545-6574-461b-88fb-11801374835a-kube-api-access-4d9rp\") pod \"placement-db-create-vmff7\" (UID: \"6ba4b545-6574-461b-88fb-11801374835a\") " pod="openstack/placement-db-create-vmff7" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.519994 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d9rp\" (UniqueName: \"kubernetes.io/projected/6ba4b545-6574-461b-88fb-11801374835a-kube-api-access-4d9rp\") pod \"placement-db-create-vmff7\" (UID: \"6ba4b545-6574-461b-88fb-11801374835a\") " pod="openstack/placement-db-create-vmff7" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.520409 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f7rn\" (UniqueName: \"kubernetes.io/projected/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-kube-api-access-6f7rn\") pod \"placement-3ec6-account-create-update-mg7k5\" (UID: \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\") " pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.520511 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ba4b545-6574-461b-88fb-11801374835a-operator-scripts\") pod \"placement-db-create-vmff7\" (UID: \"6ba4b545-6574-461b-88fb-11801374835a\") " pod="openstack/placement-db-create-vmff7" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.520658 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-operator-scripts\") pod \"placement-3ec6-account-create-update-mg7k5\" (UID: \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\") " pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.521373 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ba4b545-6574-461b-88fb-11801374835a-operator-scripts\") pod \"placement-db-create-vmff7\" (UID: \"6ba4b545-6574-461b-88fb-11801374835a\") " pod="openstack/placement-db-create-vmff7" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.541906 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d9rp\" (UniqueName: \"kubernetes.io/projected/6ba4b545-6574-461b-88fb-11801374835a-kube-api-access-4d9rp\") pod \"placement-db-create-vmff7\" (UID: \"6ba4b545-6574-461b-88fb-11801374835a\") " pod="openstack/placement-db-create-vmff7" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.577366 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vmff7" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.622299 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-operator-scripts\") pod \"placement-3ec6-account-create-update-mg7k5\" (UID: \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\") " pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.622460 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f7rn\" (UniqueName: \"kubernetes.io/projected/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-kube-api-access-6f7rn\") pod \"placement-3ec6-account-create-update-mg7k5\" (UID: \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\") " pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.623688 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-operator-scripts\") pod \"placement-3ec6-account-create-update-mg7k5\" (UID: \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\") " pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.641326 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f7rn\" (UniqueName: \"kubernetes.io/projected/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-kube-api-access-6f7rn\") pod \"placement-3ec6-account-create-update-mg7k5\" (UID: \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\") " pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:26 crc kubenswrapper[4917]: I0318 08:19:26.704121 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:27 crc kubenswrapper[4917]: I0318 08:19:27.112960 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-vmff7"] Mar 18 08:19:27 crc kubenswrapper[4917]: W0318 08:19:27.113023 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ba4b545_6574_461b_88fb_11801374835a.slice/crio-3b464d8ae9d2d29d33ab353546cb7e2796582cb0d66eec56e098469a8e0b0a79 WatchSource:0}: Error finding container 3b464d8ae9d2d29d33ab353546cb7e2796582cb0d66eec56e098469a8e0b0a79: Status 404 returned error can't find the container with id 3b464d8ae9d2d29d33ab353546cb7e2796582cb0d66eec56e098469a8e0b0a79 Mar 18 08:19:27 crc kubenswrapper[4917]: W0318 08:19:27.174021 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0464b93f_ba11_4e92_8b47_5eac3f0e08c6.slice/crio-bd20496b80fb64b81a4fbf663a84ba6831f33de5ae8a40b2bc774d9adbae8ece WatchSource:0}: Error finding container bd20496b80fb64b81a4fbf663a84ba6831f33de5ae8a40b2bc774d9adbae8ece: Status 404 returned error can't find the container with id bd20496b80fb64b81a4fbf663a84ba6831f33de5ae8a40b2bc774d9adbae8ece Mar 18 08:19:27 crc kubenswrapper[4917]: I0318 08:19:27.176951 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3ec6-account-create-update-mg7k5"] Mar 18 08:19:27 crc kubenswrapper[4917]: I0318 08:19:27.497162 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vmff7" event={"ID":"6ba4b545-6574-461b-88fb-11801374835a","Type":"ContainerStarted","Data":"71084d56800a918dadab70c67f49ab25742f94e671e9b330a83e074d8835e906"} Mar 18 08:19:27 crc kubenswrapper[4917]: I0318 08:19:27.497227 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vmff7" 
event={"ID":"6ba4b545-6574-461b-88fb-11801374835a","Type":"ContainerStarted","Data":"3b464d8ae9d2d29d33ab353546cb7e2796582cb0d66eec56e098469a8e0b0a79"} Mar 18 08:19:27 crc kubenswrapper[4917]: I0318 08:19:27.498473 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3ec6-account-create-update-mg7k5" event={"ID":"0464b93f-ba11-4e92-8b47-5eac3f0e08c6","Type":"ContainerStarted","Data":"a9e70d4d7a0a28fe98fa44ac445d4c17303344845d458cbb1adf656bf91c940d"} Mar 18 08:19:27 crc kubenswrapper[4917]: I0318 08:19:27.498529 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3ec6-account-create-update-mg7k5" event={"ID":"0464b93f-ba11-4e92-8b47-5eac3f0e08c6","Type":"ContainerStarted","Data":"bd20496b80fb64b81a4fbf663a84ba6831f33de5ae8a40b2bc774d9adbae8ece"} Mar 18 08:19:27 crc kubenswrapper[4917]: I0318 08:19:27.521248 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-vmff7" podStartSLOduration=1.521231021 podStartE2EDuration="1.521231021s" podCreationTimestamp="2026-03-18 08:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:27.511233779 +0000 UTC m=+5552.452388503" watchObservedRunningTime="2026-03-18 08:19:27.521231021 +0000 UTC m=+5552.462385735" Mar 18 08:19:27 crc kubenswrapper[4917]: I0318 08:19:27.534617 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-3ec6-account-create-update-mg7k5" podStartSLOduration=1.534596036 podStartE2EDuration="1.534596036s" podCreationTimestamp="2026-03-18 08:19:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:27.529496352 +0000 UTC m=+5552.470651066" watchObservedRunningTime="2026-03-18 08:19:27.534596036 +0000 UTC m=+5552.475750750" Mar 18 08:19:28 crc 
kubenswrapper[4917]: I0318 08:19:28.514326 4917 generic.go:334] "Generic (PLEG): container finished" podID="0464b93f-ba11-4e92-8b47-5eac3f0e08c6" containerID="a9e70d4d7a0a28fe98fa44ac445d4c17303344845d458cbb1adf656bf91c940d" exitCode=0 Mar 18 08:19:28 crc kubenswrapper[4917]: I0318 08:19:28.514531 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3ec6-account-create-update-mg7k5" event={"ID":"0464b93f-ba11-4e92-8b47-5eac3f0e08c6","Type":"ContainerDied","Data":"a9e70d4d7a0a28fe98fa44ac445d4c17303344845d458cbb1adf656bf91c940d"} Mar 18 08:19:28 crc kubenswrapper[4917]: I0318 08:19:28.518524 4917 generic.go:334] "Generic (PLEG): container finished" podID="6ba4b545-6574-461b-88fb-11801374835a" containerID="71084d56800a918dadab70c67f49ab25742f94e671e9b330a83e074d8835e906" exitCode=0 Mar 18 08:19:28 crc kubenswrapper[4917]: I0318 08:19:28.518630 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vmff7" event={"ID":"6ba4b545-6574-461b-88fb-11801374835a","Type":"ContainerDied","Data":"71084d56800a918dadab70c67f49ab25742f94e671e9b330a83e074d8835e906"} Mar 18 08:19:29 crc kubenswrapper[4917]: I0318 08:19:29.773450 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:19:29 crc kubenswrapper[4917]: E0318 08:19:29.774029 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.047302 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vmff7" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.053053 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.188257 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ba4b545-6574-461b-88fb-11801374835a-operator-scripts\") pod \"6ba4b545-6574-461b-88fb-11801374835a\" (UID: \"6ba4b545-6574-461b-88fb-11801374835a\") " Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.188606 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f7rn\" (UniqueName: \"kubernetes.io/projected/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-kube-api-access-6f7rn\") pod \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\" (UID: \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\") " Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.188639 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-operator-scripts\") pod \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\" (UID: \"0464b93f-ba11-4e92-8b47-5eac3f0e08c6\") " Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.188685 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d9rp\" (UniqueName: \"kubernetes.io/projected/6ba4b545-6574-461b-88fb-11801374835a-kube-api-access-4d9rp\") pod \"6ba4b545-6574-461b-88fb-11801374835a\" (UID: \"6ba4b545-6574-461b-88fb-11801374835a\") " Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.189075 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba4b545-6574-461b-88fb-11801374835a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"6ba4b545-6574-461b-88fb-11801374835a" (UID: "6ba4b545-6574-461b-88fb-11801374835a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.189114 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0464b93f-ba11-4e92-8b47-5eac3f0e08c6" (UID: "0464b93f-ba11-4e92-8b47-5eac3f0e08c6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.193683 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba4b545-6574-461b-88fb-11801374835a-kube-api-access-4d9rp" (OuterVolumeSpecName: "kube-api-access-4d9rp") pod "6ba4b545-6574-461b-88fb-11801374835a" (UID: "6ba4b545-6574-461b-88fb-11801374835a"). InnerVolumeSpecName "kube-api-access-4d9rp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.193757 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-kube-api-access-6f7rn" (OuterVolumeSpecName: "kube-api-access-6f7rn") pod "0464b93f-ba11-4e92-8b47-5eac3f0e08c6" (UID: "0464b93f-ba11-4e92-8b47-5eac3f0e08c6"). InnerVolumeSpecName "kube-api-access-6f7rn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.290573 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f7rn\" (UniqueName: \"kubernetes.io/projected/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-kube-api-access-6f7rn\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.290627 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0464b93f-ba11-4e92-8b47-5eac3f0e08c6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.290652 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d9rp\" (UniqueName: \"kubernetes.io/projected/6ba4b545-6574-461b-88fb-11801374835a-kube-api-access-4d9rp\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.290661 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ba4b545-6574-461b-88fb-11801374835a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.553259 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-vmff7" event={"ID":"6ba4b545-6574-461b-88fb-11801374835a","Type":"ContainerDied","Data":"3b464d8ae9d2d29d33ab353546cb7e2796582cb0d66eec56e098469a8e0b0a79"} Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.553547 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b464d8ae9d2d29d33ab353546cb7e2796582cb0d66eec56e098469a8e0b0a79" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.553337 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-vmff7" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.558967 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3ec6-account-create-update-mg7k5" event={"ID":"0464b93f-ba11-4e92-8b47-5eac3f0e08c6","Type":"ContainerDied","Data":"bd20496b80fb64b81a4fbf663a84ba6831f33de5ae8a40b2bc774d9adbae8ece"} Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.559134 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd20496b80fb64b81a4fbf663a84ba6831f33de5ae8a40b2bc774d9adbae8ece" Mar 18 08:19:30 crc kubenswrapper[4917]: I0318 08:19:30.559274 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3ec6-account-create-update-mg7k5" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.699061 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-656f87c797-qg9dj"] Mar 18 08:19:31 crc kubenswrapper[4917]: E0318 08:19:31.699399 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0464b93f-ba11-4e92-8b47-5eac3f0e08c6" containerName="mariadb-account-create-update" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.699411 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0464b93f-ba11-4e92-8b47-5eac3f0e08c6" containerName="mariadb-account-create-update" Mar 18 08:19:31 crc kubenswrapper[4917]: E0318 08:19:31.699438 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba4b545-6574-461b-88fb-11801374835a" containerName="mariadb-database-create" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.699445 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba4b545-6574-461b-88fb-11801374835a" containerName="mariadb-database-create" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.699627 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba4b545-6574-461b-88fb-11801374835a" 
containerName="mariadb-database-create" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.699646 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0464b93f-ba11-4e92-8b47-5eac3f0e08c6" containerName="mariadb-account-create-update" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.700493 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.715389 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-656f87c797-qg9dj"] Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.725549 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-22dvx"] Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.727369 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.729505 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.730714 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bqb5x" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.730732 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.749566 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-22dvx"] Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.820702 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht26d\" (UniqueName: \"kubernetes.io/projected/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-kube-api-access-ht26d\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " 
pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.820885 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-scripts\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.820937 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-dns-svc\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.821089 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-sb\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.821261 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds8xn\" (UniqueName: \"kubernetes.io/projected/db65ec91-ca07-49f0-8863-4c1a210f92b4-kube-api-access-ds8xn\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.821314 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-combined-ca-bundle\") pod \"placement-db-sync-22dvx\" (UID: 
\"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.821386 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-nb\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.821447 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-config-data\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.821512 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-config\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.821571 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db65ec91-ca07-49f0-8863-4c1a210f92b4-logs\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.922724 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-config\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " 
pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.922791 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db65ec91-ca07-49f0-8863-4c1a210f92b4-logs\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.922851 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht26d\" (UniqueName: \"kubernetes.io/projected/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-kube-api-access-ht26d\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.922930 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-scripts\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.922959 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-dns-svc\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.923001 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-sb\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.923046 
4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds8xn\" (UniqueName: \"kubernetes.io/projected/db65ec91-ca07-49f0-8863-4c1a210f92b4-kube-api-access-ds8xn\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.923072 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-combined-ca-bundle\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.923108 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-nb\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.923142 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-config-data\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.923641 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db65ec91-ca07-49f0-8863-4c1a210f92b4-logs\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.924048 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-config\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.924090 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-dns-svc\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.924229 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-nb\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.924278 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-sb\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.935299 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-scripts\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.935663 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-config-data\") pod \"placement-db-sync-22dvx\" (UID: 
\"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.936438 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-combined-ca-bundle\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.941269 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds8xn\" (UniqueName: \"kubernetes.io/projected/db65ec91-ca07-49f0-8863-4c1a210f92b4-kube-api-access-ds8xn\") pod \"placement-db-sync-22dvx\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:31 crc kubenswrapper[4917]: I0318 08:19:31.944863 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht26d\" (UniqueName: \"kubernetes.io/projected/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-kube-api-access-ht26d\") pod \"dnsmasq-dns-656f87c797-qg9dj\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:32 crc kubenswrapper[4917]: I0318 08:19:32.018265 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:32 crc kubenswrapper[4917]: I0318 08:19:32.053974 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:32 crc kubenswrapper[4917]: I0318 08:19:32.524750 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-656f87c797-qg9dj"] Mar 18 08:19:32 crc kubenswrapper[4917]: W0318 08:19:32.533406 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb65ec91_ca07_49f0_8863_4c1a210f92b4.slice/crio-13f48cc1d4a504b9312e8bf061bd5109b6eae3c3c28bddea5e42c4c673e0e777 WatchSource:0}: Error finding container 13f48cc1d4a504b9312e8bf061bd5109b6eae3c3c28bddea5e42c4c673e0e777: Status 404 returned error can't find the container with id 13f48cc1d4a504b9312e8bf061bd5109b6eae3c3c28bddea5e42c4c673e0e777 Mar 18 08:19:32 crc kubenswrapper[4917]: I0318 08:19:32.534028 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-22dvx"] Mar 18 08:19:32 crc kubenswrapper[4917]: I0318 08:19:32.583065 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" event={"ID":"55cdbe8e-2d42-49f7-aedc-e6874b0d7971","Type":"ContainerStarted","Data":"fc0738fbcb1cc1d6b8352b4c0553b59a1d9a961b814827504ba4236d31013ddb"} Mar 18 08:19:32 crc kubenswrapper[4917]: I0318 08:19:32.584159 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-22dvx" event={"ID":"db65ec91-ca07-49f0-8863-4c1a210f92b4","Type":"ContainerStarted","Data":"13f48cc1d4a504b9312e8bf061bd5109b6eae3c3c28bddea5e42c4c673e0e777"} Mar 18 08:19:33 crc kubenswrapper[4917]: I0318 08:19:33.601754 4917 generic.go:334] "Generic (PLEG): container finished" podID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" containerID="b73ed3f71dbcf66e2fa8472bccd53e3ee537e769418029d3c8095a2d76a23412" exitCode=0 Mar 18 08:19:33 crc kubenswrapper[4917]: I0318 08:19:33.601827 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" 
event={"ID":"55cdbe8e-2d42-49f7-aedc-e6874b0d7971","Type":"ContainerDied","Data":"b73ed3f71dbcf66e2fa8472bccd53e3ee537e769418029d3c8095a2d76a23412"} Mar 18 08:19:34 crc kubenswrapper[4917]: I0318 08:19:34.621100 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" event={"ID":"55cdbe8e-2d42-49f7-aedc-e6874b0d7971","Type":"ContainerStarted","Data":"d27d4f5ff6fdcc566e05d08dd48d289282686789257ceab5375bb4e4e4b0f9bc"} Mar 18 08:19:34 crc kubenswrapper[4917]: I0318 08:19:34.623136 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:35 crc kubenswrapper[4917]: I0318 08:19:35.802709 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" podStartSLOduration=4.802663636 podStartE2EDuration="4.802663636s" podCreationTimestamp="2026-03-18 08:19:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:34.645965204 +0000 UTC m=+5559.587119978" watchObservedRunningTime="2026-03-18 08:19:35.802663636 +0000 UTC m=+5560.743818360" Mar 18 08:19:36 crc kubenswrapper[4917]: I0318 08:19:36.642223 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-22dvx" event={"ID":"db65ec91-ca07-49f0-8863-4c1a210f92b4","Type":"ContainerStarted","Data":"2a468d7267b1c821da59bbc01f67fa4a484b687a8fa6268586999baa993da4e7"} Mar 18 08:19:36 crc kubenswrapper[4917]: I0318 08:19:36.667569 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-22dvx" podStartSLOduration=1.96057631 podStartE2EDuration="5.667546692s" podCreationTimestamp="2026-03-18 08:19:31 +0000 UTC" firstStartedPulling="2026-03-18 08:19:32.537442427 +0000 UTC m=+5557.478597131" lastFinishedPulling="2026-03-18 08:19:36.244412799 +0000 UTC m=+5561.185567513" 
observedRunningTime="2026-03-18 08:19:36.663667808 +0000 UTC m=+5561.604822562" watchObservedRunningTime="2026-03-18 08:19:36.667546692 +0000 UTC m=+5561.608701426" Mar 18 08:19:38 crc kubenswrapper[4917]: I0318 08:19:38.662511 4917 generic.go:334] "Generic (PLEG): container finished" podID="db65ec91-ca07-49f0-8863-4c1a210f92b4" containerID="2a468d7267b1c821da59bbc01f67fa4a484b687a8fa6268586999baa993da4e7" exitCode=0 Mar 18 08:19:38 crc kubenswrapper[4917]: I0318 08:19:38.662571 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-22dvx" event={"ID":"db65ec91-ca07-49f0-8863-4c1a210f92b4","Type":"ContainerDied","Data":"2a468d7267b1c821da59bbc01f67fa4a484b687a8fa6268586999baa993da4e7"} Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.085411 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.202213 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db65ec91-ca07-49f0-8863-4c1a210f92b4-logs\") pod \"db65ec91-ca07-49f0-8863-4c1a210f92b4\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.202342 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds8xn\" (UniqueName: \"kubernetes.io/projected/db65ec91-ca07-49f0-8863-4c1a210f92b4-kube-api-access-ds8xn\") pod \"db65ec91-ca07-49f0-8863-4c1a210f92b4\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.202383 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-config-data\") pod \"db65ec91-ca07-49f0-8863-4c1a210f92b4\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " Mar 18 08:19:40 crc 
kubenswrapper[4917]: I0318 08:19:40.202422 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-scripts\") pod \"db65ec91-ca07-49f0-8863-4c1a210f92b4\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.202505 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-combined-ca-bundle\") pod \"db65ec91-ca07-49f0-8863-4c1a210f92b4\" (UID: \"db65ec91-ca07-49f0-8863-4c1a210f92b4\") " Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.203159 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db65ec91-ca07-49f0-8863-4c1a210f92b4-logs" (OuterVolumeSpecName: "logs") pod "db65ec91-ca07-49f0-8863-4c1a210f92b4" (UID: "db65ec91-ca07-49f0-8863-4c1a210f92b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.208864 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-scripts" (OuterVolumeSpecName: "scripts") pod "db65ec91-ca07-49f0-8863-4c1a210f92b4" (UID: "db65ec91-ca07-49f0-8863-4c1a210f92b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.210892 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db65ec91-ca07-49f0-8863-4c1a210f92b4-kube-api-access-ds8xn" (OuterVolumeSpecName: "kube-api-access-ds8xn") pod "db65ec91-ca07-49f0-8863-4c1a210f92b4" (UID: "db65ec91-ca07-49f0-8863-4c1a210f92b4"). InnerVolumeSpecName "kube-api-access-ds8xn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.241400 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-config-data" (OuterVolumeSpecName: "config-data") pod "db65ec91-ca07-49f0-8863-4c1a210f92b4" (UID: "db65ec91-ca07-49f0-8863-4c1a210f92b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.242374 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db65ec91-ca07-49f0-8863-4c1a210f92b4" (UID: "db65ec91-ca07-49f0-8863-4c1a210f92b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.304148 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.304183 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db65ec91-ca07-49f0-8863-4c1a210f92b4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.304194 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds8xn\" (UniqueName: \"kubernetes.io/projected/db65ec91-ca07-49f0-8863-4c1a210f92b4-kube-api-access-ds8xn\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.304203 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 
08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.304214 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db65ec91-ca07-49f0-8863-4c1a210f92b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.685558 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-22dvx" event={"ID":"db65ec91-ca07-49f0-8863-4c1a210f92b4","Type":"ContainerDied","Data":"13f48cc1d4a504b9312e8bf061bd5109b6eae3c3c28bddea5e42c4c673e0e777"} Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.686321 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13f48cc1d4a504b9312e8bf061bd5109b6eae3c3c28bddea5e42c4c673e0e777" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.685695 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-22dvx" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.919179 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d866b6d76-b4474"] Mar 18 08:19:40 crc kubenswrapper[4917]: E0318 08:19:40.920379 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db65ec91-ca07-49f0-8863-4c1a210f92b4" containerName="placement-db-sync" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.920407 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="db65ec91-ca07-49f0-8863-4c1a210f92b4" containerName="placement-db-sync" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.920676 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="db65ec91-ca07-49f0-8863-4c1a210f92b4" containerName="placement-db-sync" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.922776 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.924939 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.925660 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-bqb5x" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.926168 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.927919 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.928671 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 18 08:19:40 crc kubenswrapper[4917]: I0318 08:19:40.953846 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d866b6d76-b4474"] Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.015722 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94f6q\" (UniqueName: \"kubernetes.io/projected/907d7650-17ad-415e-b034-7545f9a0df95-kube-api-access-94f6q\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.015790 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-internal-tls-certs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.015819 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-scripts\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.016011 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907d7650-17ad-415e-b034-7545f9a0df95-logs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.016159 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-public-tls-certs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.016208 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-combined-ca-bundle\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.016412 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-config-data\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.118781 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-94f6q\" (UniqueName: \"kubernetes.io/projected/907d7650-17ad-415e-b034-7545f9a0df95-kube-api-access-94f6q\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.118867 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-internal-tls-certs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.118921 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-scripts\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.118952 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907d7650-17ad-415e-b034-7545f9a0df95-logs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.119022 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-public-tls-certs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.119050 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-combined-ca-bundle\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.119163 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-config-data\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.119939 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/907d7650-17ad-415e-b034-7545f9a0df95-logs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.123740 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-internal-tls-certs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.123999 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-public-tls-certs\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.124482 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-combined-ca-bundle\") pod \"placement-d866b6d76-b4474\" (UID: 
\"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.124624 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-config-data\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.124894 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/907d7650-17ad-415e-b034-7545f9a0df95-scripts\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.140845 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94f6q\" (UniqueName: \"kubernetes.io/projected/907d7650-17ad-415e-b034-7545f9a0df95-kube-api-access-94f6q\") pod \"placement-d866b6d76-b4474\" (UID: \"907d7650-17ad-415e-b034-7545f9a0df95\") " pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.265303 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.770174 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d866b6d76-b4474"] Mar 18 08:19:41 crc kubenswrapper[4917]: I0318 08:19:41.773532 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:19:41 crc kubenswrapper[4917]: E0318 08:19:41.800836 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.020940 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.086489 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d8db65bf-qn8c9"] Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.087112 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" podUID="3dac1607-cb12-4d64-aef7-321e74512588" containerName="dnsmasq-dns" containerID="cri-o://fbf4a49cf2ccf29f65fc784a461789ed30313cbb669fc1751c34d5b7c02a9989" gracePeriod=10 Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.713478 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d866b6d76-b4474" event={"ID":"907d7650-17ad-415e-b034-7545f9a0df95","Type":"ContainerStarted","Data":"092b281c46e6c2627c2eb3d30dfdba033d17110beab0022a6420391849e4a5a8"} Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.713821 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d866b6d76-b4474" event={"ID":"907d7650-17ad-415e-b034-7545f9a0df95","Type":"ContainerStarted","Data":"af819c57e0bf21ce54b782007d3c285f6b88a0a4ecd2719922140c59bb23cc41"} Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.713843 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.713856 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d866b6d76-b4474" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.713866 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d866b6d76-b4474" event={"ID":"907d7650-17ad-415e-b034-7545f9a0df95","Type":"ContainerStarted","Data":"58e1ed11803b105537b2e1889325415c90517ae5848b5cf1342108a0c4f8fc66"} Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.719003 4917 generic.go:334] "Generic (PLEG): container finished" podID="3dac1607-cb12-4d64-aef7-321e74512588" containerID="fbf4a49cf2ccf29f65fc784a461789ed30313cbb669fc1751c34d5b7c02a9989" exitCode=0 Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.719072 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" event={"ID":"3dac1607-cb12-4d64-aef7-321e74512588","Type":"ContainerDied","Data":"fbf4a49cf2ccf29f65fc784a461789ed30313cbb669fc1751c34d5b7c02a9989"} Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.719127 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" event={"ID":"3dac1607-cb12-4d64-aef7-321e74512588","Type":"ContainerDied","Data":"df3aad02ea7b05b1f90d0335d5bc77c7e24255208e212b6df0448a0d977bbd32"} Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.721539 4917 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="df3aad02ea7b05b1f90d0335d5bc77c7e24255208e212b6df0448a0d977bbd32" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.732105 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.735160 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d866b6d76-b4474" podStartSLOduration=2.735141075 podStartE2EDuration="2.735141075s" podCreationTimestamp="2026-03-18 08:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:19:42.731963548 +0000 UTC m=+5567.673118272" watchObservedRunningTime="2026-03-18 08:19:42.735141075 +0000 UTC m=+5567.676295799" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.852256 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnsnj\" (UniqueName: \"kubernetes.io/projected/3dac1607-cb12-4d64-aef7-321e74512588-kube-api-access-bnsnj\") pod \"3dac1607-cb12-4d64-aef7-321e74512588\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.852358 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-dns-svc\") pod \"3dac1607-cb12-4d64-aef7-321e74512588\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.852414 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-nb\") pod \"3dac1607-cb12-4d64-aef7-321e74512588\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.852476 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-sb\") pod \"3dac1607-cb12-4d64-aef7-321e74512588\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.852508 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-config\") pod \"3dac1607-cb12-4d64-aef7-321e74512588\" (UID: \"3dac1607-cb12-4d64-aef7-321e74512588\") " Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.858205 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dac1607-cb12-4d64-aef7-321e74512588-kube-api-access-bnsnj" (OuterVolumeSpecName: "kube-api-access-bnsnj") pod "3dac1607-cb12-4d64-aef7-321e74512588" (UID: "3dac1607-cb12-4d64-aef7-321e74512588"). InnerVolumeSpecName "kube-api-access-bnsnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.897180 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dac1607-cb12-4d64-aef7-321e74512588" (UID: "3dac1607-cb12-4d64-aef7-321e74512588"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.904280 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-config" (OuterVolumeSpecName: "config") pod "3dac1607-cb12-4d64-aef7-321e74512588" (UID: "3dac1607-cb12-4d64-aef7-321e74512588"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.908839 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dac1607-cb12-4d64-aef7-321e74512588" (UID: "3dac1607-cb12-4d64-aef7-321e74512588"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.923728 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3dac1607-cb12-4d64-aef7-321e74512588" (UID: "3dac1607-cb12-4d64-aef7-321e74512588"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.954017 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.954055 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.954067 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:42 crc kubenswrapper[4917]: I0318 08:19:42.954081 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dac1607-cb12-4d64-aef7-321e74512588-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:42 
crc kubenswrapper[4917]: I0318 08:19:42.954094 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnsnj\" (UniqueName: \"kubernetes.io/projected/3dac1607-cb12-4d64-aef7-321e74512588-kube-api-access-bnsnj\") on node \"crc\" DevicePath \"\"" Mar 18 08:19:43 crc kubenswrapper[4917]: I0318 08:19:43.726843 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d8db65bf-qn8c9" Mar 18 08:19:43 crc kubenswrapper[4917]: I0318 08:19:43.760659 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d8db65bf-qn8c9"] Mar 18 08:19:43 crc kubenswrapper[4917]: I0318 08:19:43.784468 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d8db65bf-qn8c9"] Mar 18 08:19:45 crc kubenswrapper[4917]: I0318 08:19:45.792161 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dac1607-cb12-4d64-aef7-321e74512588" path="/var/lib/kubelet/pods/3dac1607-cb12-4d64-aef7-321e74512588/volumes" Mar 18 08:19:56 crc kubenswrapper[4917]: I0318 08:19:56.773335 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:19:56 crc kubenswrapper[4917]: E0318 08:19:56.774609 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.136160 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563700-skttq"] Mar 18 08:20:00 crc kubenswrapper[4917]: E0318 08:20:00.137763 4917 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3dac1607-cb12-4d64-aef7-321e74512588" containerName="dnsmasq-dns" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.137785 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dac1607-cb12-4d64-aef7-321e74512588" containerName="dnsmasq-dns" Mar 18 08:20:00 crc kubenswrapper[4917]: E0318 08:20:00.137828 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dac1607-cb12-4d64-aef7-321e74512588" containerName="init" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.137841 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dac1607-cb12-4d64-aef7-321e74512588" containerName="init" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.138157 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dac1607-cb12-4d64-aef7-321e74512588" containerName="dnsmasq-dns" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.139197 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563700-skttq" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.141562 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.142013 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.144932 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.145157 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563700-skttq"] Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.276323 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z54ft\" (UniqueName: 
\"kubernetes.io/projected/2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c-kube-api-access-z54ft\") pod \"auto-csr-approver-29563700-skttq\" (UID: \"2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c\") " pod="openshift-infra/auto-csr-approver-29563700-skttq" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.378227 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z54ft\" (UniqueName: \"kubernetes.io/projected/2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c-kube-api-access-z54ft\") pod \"auto-csr-approver-29563700-skttq\" (UID: \"2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c\") " pod="openshift-infra/auto-csr-approver-29563700-skttq" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.409617 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z54ft\" (UniqueName: \"kubernetes.io/projected/2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c-kube-api-access-z54ft\") pod \"auto-csr-approver-29563700-skttq\" (UID: \"2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c\") " pod="openshift-infra/auto-csr-approver-29563700-skttq" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.461771 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563700-skttq" Mar 18 08:20:00 crc kubenswrapper[4917]: I0318 08:20:00.970149 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563700-skttq"] Mar 18 08:20:01 crc kubenswrapper[4917]: I0318 08:20:01.927280 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563700-skttq" event={"ID":"2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c","Type":"ContainerStarted","Data":"5e9ffbd164506435a3a43edea42295c977b4f2d3dff02511c1747cef509284ef"} Mar 18 08:20:02 crc kubenswrapper[4917]: I0318 08:20:02.943726 4917 generic.go:334] "Generic (PLEG): container finished" podID="2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c" containerID="060da4ba1c5057d1e59507b771cb775090141d7ea0ac5f28357ed96dbf517d9b" exitCode=0 Mar 18 08:20:02 crc kubenswrapper[4917]: I0318 08:20:02.943883 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563700-skttq" event={"ID":"2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c","Type":"ContainerDied","Data":"060da4ba1c5057d1e59507b771cb775090141d7ea0ac5f28357ed96dbf517d9b"} Mar 18 08:20:04 crc kubenswrapper[4917]: I0318 08:20:04.389599 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563700-skttq" Mar 18 08:20:04 crc kubenswrapper[4917]: I0318 08:20:04.564457 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z54ft\" (UniqueName: \"kubernetes.io/projected/2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c-kube-api-access-z54ft\") pod \"2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c\" (UID: \"2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c\") " Mar 18 08:20:04 crc kubenswrapper[4917]: I0318 08:20:04.570113 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c-kube-api-access-z54ft" (OuterVolumeSpecName: "kube-api-access-z54ft") pod "2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c" (UID: "2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c"). InnerVolumeSpecName "kube-api-access-z54ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:20:04 crc kubenswrapper[4917]: I0318 08:20:04.666550 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z54ft\" (UniqueName: \"kubernetes.io/projected/2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c-kube-api-access-z54ft\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:04 crc kubenswrapper[4917]: I0318 08:20:04.979082 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563700-skttq" event={"ID":"2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c","Type":"ContainerDied","Data":"5e9ffbd164506435a3a43edea42295c977b4f2d3dff02511c1747cef509284ef"} Mar 18 08:20:04 crc kubenswrapper[4917]: I0318 08:20:04.979444 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e9ffbd164506435a3a43edea42295c977b4f2d3dff02511c1747cef509284ef" Mar 18 08:20:04 crc kubenswrapper[4917]: I0318 08:20:04.979527 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563700-skttq" Mar 18 08:20:05 crc kubenswrapper[4917]: I0318 08:20:05.466085 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563694-dc4zx"] Mar 18 08:20:05 crc kubenswrapper[4917]: I0318 08:20:05.472992 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563694-dc4zx"] Mar 18 08:20:05 crc kubenswrapper[4917]: I0318 08:20:05.789318 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f759add-12d3-4da3-928e-f0effea94b71" path="/var/lib/kubelet/pods/5f759add-12d3-4da3-928e-f0effea94b71/volumes" Mar 18 08:20:08 crc kubenswrapper[4917]: I0318 08:20:08.302214 4917 scope.go:117] "RemoveContainer" containerID="54ef452b1e86be2b2a8d3dadd0902c59e0ec75b1480d7193a1ff2bd6338ebf08" Mar 18 08:20:08 crc kubenswrapper[4917]: I0318 08:20:08.772161 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:20:08 crc kubenswrapper[4917]: E0318 08:20:08.774164 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:20:12 crc kubenswrapper[4917]: I0318 08:20:12.326736 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d866b6d76-b4474" Mar 18 08:20:12 crc kubenswrapper[4917]: I0318 08:20:12.330325 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d866b6d76-b4474" Mar 18 08:20:21 crc kubenswrapper[4917]: I0318 08:20:21.772866 4917 scope.go:117] "RemoveContainer" 
containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:20:21 crc kubenswrapper[4917]: E0318 08:20:21.773818 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:20:33 crc kubenswrapper[4917]: I0318 08:20:33.772349 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:20:33 crc kubenswrapper[4917]: E0318 08:20:33.773125 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.298887 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-fkll6"] Mar 18 08:20:34 crc kubenswrapper[4917]: E0318 08:20:34.299363 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c" containerName="oc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.299389 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c" containerName="oc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.299643 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c" containerName="oc" Mar 18 08:20:34 crc 
kubenswrapper[4917]: I0318 08:20:34.300304 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.313830 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fkll6"] Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.369972 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2741122-0601-450f-832a-ce3d4f5f3d01-operator-scripts\") pod \"nova-api-db-create-fkll6\" (UID: \"d2741122-0601-450f-832a-ce3d4f5f3d01\") " pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.370509 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfrx9\" (UniqueName: \"kubernetes.io/projected/d2741122-0601-450f-832a-ce3d4f5f3d01-kube-api-access-tfrx9\") pod \"nova-api-db-create-fkll6\" (UID: \"d2741122-0601-450f-832a-ce3d4f5f3d01\") " pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.389565 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-xfjbc"] Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.390604 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.404920 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xfjbc"] Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.472220 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2741122-0601-450f-832a-ce3d4f5f3d01-operator-scripts\") pod \"nova-api-db-create-fkll6\" (UID: \"d2741122-0601-450f-832a-ce3d4f5f3d01\") " pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.472287 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfrx9\" (UniqueName: \"kubernetes.io/projected/d2741122-0601-450f-832a-ce3d4f5f3d01-kube-api-access-tfrx9\") pod \"nova-api-db-create-fkll6\" (UID: \"d2741122-0601-450f-832a-ce3d4f5f3d01\") " pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.472329 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6wlm\" (UniqueName: \"kubernetes.io/projected/91286214-725d-4737-b3d0-ddc623a822a5-kube-api-access-t6wlm\") pod \"nova-cell0-db-create-xfjbc\" (UID: \"91286214-725d-4737-b3d0-ddc623a822a5\") " pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.472358 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91286214-725d-4737-b3d0-ddc623a822a5-operator-scripts\") pod \"nova-cell0-db-create-xfjbc\" (UID: \"91286214-725d-4737-b3d0-ddc623a822a5\") " pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.473076 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d2741122-0601-450f-832a-ce3d4f5f3d01-operator-scripts\") pod \"nova-api-db-create-fkll6\" (UID: \"d2741122-0601-450f-832a-ce3d4f5f3d01\") " pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.500884 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfrx9\" (UniqueName: \"kubernetes.io/projected/d2741122-0601-450f-832a-ce3d4f5f3d01-kube-api-access-tfrx9\") pod \"nova-api-db-create-fkll6\" (UID: \"d2741122-0601-450f-832a-ce3d4f5f3d01\") " pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.510641 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nj8qx"] Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.512153 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.525027 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-25be-account-create-update-xsm5n"] Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.526100 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.527873 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.534210 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nj8qx"] Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.543648 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-25be-account-create-update-xsm5n"] Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.573666 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hclth\" (UniqueName: \"kubernetes.io/projected/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-kube-api-access-hclth\") pod \"nova-cell1-db-create-nj8qx\" (UID: \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\") " pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.573712 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-operator-scripts\") pod \"nova-cell1-db-create-nj8qx\" (UID: \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\") " pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.573784 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6wlm\" (UniqueName: \"kubernetes.io/projected/91286214-725d-4737-b3d0-ddc623a822a5-kube-api-access-t6wlm\") pod \"nova-cell0-db-create-xfjbc\" (UID: \"91286214-725d-4737-b3d0-ddc623a822a5\") " pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.573811 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/91286214-725d-4737-b3d0-ddc623a822a5-operator-scripts\") pod \"nova-cell0-db-create-xfjbc\" (UID: \"91286214-725d-4737-b3d0-ddc623a822a5\") " pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.574548 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91286214-725d-4737-b3d0-ddc623a822a5-operator-scripts\") pod \"nova-cell0-db-create-xfjbc\" (UID: \"91286214-725d-4737-b3d0-ddc623a822a5\") " pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.598538 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6wlm\" (UniqueName: \"kubernetes.io/projected/91286214-725d-4737-b3d0-ddc623a822a5-kube-api-access-t6wlm\") pod \"nova-cell0-db-create-xfjbc\" (UID: \"91286214-725d-4737-b3d0-ddc623a822a5\") " pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.617071 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.675928 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hclth\" (UniqueName: \"kubernetes.io/projected/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-kube-api-access-hclth\") pod \"nova-cell1-db-create-nj8qx\" (UID: \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\") " pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.675995 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-operator-scripts\") pod \"nova-cell1-db-create-nj8qx\" (UID: \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\") " pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.676075 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn2xb\" (UniqueName: \"kubernetes.io/projected/0549cd0a-fd3c-401a-8caf-9e0348faccb9-kube-api-access-hn2xb\") pod \"nova-api-25be-account-create-update-xsm5n\" (UID: \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\") " pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.676101 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0549cd0a-fd3c-401a-8caf-9e0348faccb9-operator-scripts\") pod \"nova-api-25be-account-create-update-xsm5n\" (UID: \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\") " pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.677271 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-operator-scripts\") pod 
\"nova-cell1-db-create-nj8qx\" (UID: \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\") " pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.704567 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.775249 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hclth\" (UniqueName: \"kubernetes.io/projected/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-kube-api-access-hclth\") pod \"nova-cell1-db-create-nj8qx\" (UID: \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\") " pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.782173 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn2xb\" (UniqueName: \"kubernetes.io/projected/0549cd0a-fd3c-401a-8caf-9e0348faccb9-kube-api-access-hn2xb\") pod \"nova-api-25be-account-create-update-xsm5n\" (UID: \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\") " pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.782444 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0549cd0a-fd3c-401a-8caf-9e0348faccb9-operator-scripts\") pod \"nova-api-25be-account-create-update-xsm5n\" (UID: \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\") " pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.783174 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0549cd0a-fd3c-401a-8caf-9e0348faccb9-operator-scripts\") pod \"nova-api-25be-account-create-update-xsm5n\" (UID: \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\") " pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:34 crc kubenswrapper[4917]: 
I0318 08:20:34.855219 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn2xb\" (UniqueName: \"kubernetes.io/projected/0549cd0a-fd3c-401a-8caf-9e0348faccb9-kube-api-access-hn2xb\") pod \"nova-api-25be-account-create-update-xsm5n\" (UID: \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\") " pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.871210 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.878964 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.952387 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-d6b7-account-create-update-c8b9k"] Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.953455 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.967973 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 18 08:20:34 crc kubenswrapper[4917]: I0318 08:20:34.985650 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d6b7-account-create-update-c8b9k"] Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.055873 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-fkll6"] Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.097071 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwpds\" (UniqueName: \"kubernetes.io/projected/e472482c-2d27-49bc-816c-29e6cd155b1d-kube-api-access-gwpds\") pod \"nova-cell0-d6b7-account-create-update-c8b9k\" (UID: \"e472482c-2d27-49bc-816c-29e6cd155b1d\") " pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.097145 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e472482c-2d27-49bc-816c-29e6cd155b1d-operator-scripts\") pod \"nova-cell0-d6b7-account-create-update-c8b9k\" (UID: \"e472482c-2d27-49bc-816c-29e6cd155b1d\") " pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.177492 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a0e3-account-create-update-wvbdw"] Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.182531 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.184841 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.190316 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a0e3-account-create-update-wvbdw"] Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.198252 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e472482c-2d27-49bc-816c-29e6cd155b1d-operator-scripts\") pod \"nova-cell0-d6b7-account-create-update-c8b9k\" (UID: \"e472482c-2d27-49bc-816c-29e6cd155b1d\") " pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.198711 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwpds\" (UniqueName: \"kubernetes.io/projected/e472482c-2d27-49bc-816c-29e6cd155b1d-kube-api-access-gwpds\") pod \"nova-cell0-d6b7-account-create-update-c8b9k\" (UID: \"e472482c-2d27-49bc-816c-29e6cd155b1d\") " pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.199413 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e472482c-2d27-49bc-816c-29e6cd155b1d-operator-scripts\") pod \"nova-cell0-d6b7-account-create-update-c8b9k\" (UID: \"e472482c-2d27-49bc-816c-29e6cd155b1d\") " pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.230567 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwpds\" (UniqueName: \"kubernetes.io/projected/e472482c-2d27-49bc-816c-29e6cd155b1d-kube-api-access-gwpds\") pod 
\"nova-cell0-d6b7-account-create-update-c8b9k\" (UID: \"e472482c-2d27-49bc-816c-29e6cd155b1d\") " pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.285679 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fkll6" event={"ID":"d2741122-0601-450f-832a-ce3d4f5f3d01","Type":"ContainerStarted","Data":"04b5ea474aa6c18f54426b912e3f5536300782f30a684c63d698ae170805324c"} Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.295706 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.300040 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kjwr\" (UniqueName: \"kubernetes.io/projected/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-kube-api-access-6kjwr\") pod \"nova-cell1-a0e3-account-create-update-wvbdw\" (UID: \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\") " pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:35 crc kubenswrapper[4917]: I0318 08:20:35.300083 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-operator-scripts\") pod \"nova-cell1-a0e3-account-create-update-wvbdw\" (UID: \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\") " pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:35.403111 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kjwr\" (UniqueName: \"kubernetes.io/projected/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-kube-api-access-6kjwr\") pod \"nova-cell1-a0e3-account-create-update-wvbdw\" (UID: \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\") " pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 
18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:35.403161 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-operator-scripts\") pod \"nova-cell1-a0e3-account-create-update-wvbdw\" (UID: \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\") " pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:35.404201 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-operator-scripts\") pod \"nova-cell1-a0e3-account-create-update-wvbdw\" (UID: \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\") " pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:35.422138 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kjwr\" (UniqueName: \"kubernetes.io/projected/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-kube-api-access-6kjwr\") pod \"nova-cell1-a0e3-account-create-update-wvbdw\" (UID: \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\") " pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:35.467502 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-xfjbc"] Mar 18 08:20:36 crc kubenswrapper[4917]: W0318 08:20:35.469099 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91286214_725d_4737_b3d0_ddc623a822a5.slice/crio-86f0b70b9b5adcda89a26f463eb9c1c64f009ad2c51d777d742c30de5f9a3b15 WatchSource:0}: Error finding container 86f0b70b9b5adcda89a26f463eb9c1c64f009ad2c51d777d742c30de5f9a3b15: Status 404 returned error can't find the container with id 86f0b70b9b5adcda89a26f463eb9c1c64f009ad2c51d777d742c30de5f9a3b15 Mar 18 08:20:36 crc 
kubenswrapper[4917]: I0318 08:20:35.519504 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:35.544646 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nj8qx"] Mar 18 08:20:36 crc kubenswrapper[4917]: W0318 08:20:35.573984 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod001a88e1_5ca3_42d1_a8f9_3f7ff1da3612.slice/crio-aa7504cfdffa131d76fd926a3ef4b828e71aa7a36840487233851ea0b40702ed WatchSource:0}: Error finding container aa7504cfdffa131d76fd926a3ef4b828e71aa7a36840487233851ea0b40702ed: Status 404 returned error can't find the container with id aa7504cfdffa131d76fd926a3ef4b828e71aa7a36840487233851ea0b40702ed Mar 18 08:20:36 crc kubenswrapper[4917]: W0318 08:20:35.577213 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0549cd0a_fd3c_401a_8caf_9e0348faccb9.slice/crio-8df4cbeb36b1cfb4c3d57223f253846227ca47e31f1865907a1b82a91c3c6c13 WatchSource:0}: Error finding container 8df4cbeb36b1cfb4c3d57223f253846227ca47e31f1865907a1b82a91c3c6c13: Status 404 returned error can't find the container with id 8df4cbeb36b1cfb4c3d57223f253846227ca47e31f1865907a1b82a91c3c6c13 Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:35.580789 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-25be-account-create-update-xsm5n"] Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.297773 4917 generic.go:334] "Generic (PLEG): container finished" podID="d2741122-0601-450f-832a-ce3d4f5f3d01" containerID="a4f79f03aaef63c09b94a5298ce1defc82d65f122022ab9c9ff89ed5d46f2202" exitCode=0 Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.297845 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-fkll6" event={"ID":"d2741122-0601-450f-832a-ce3d4f5f3d01","Type":"ContainerDied","Data":"a4f79f03aaef63c09b94a5298ce1defc82d65f122022ab9c9ff89ed5d46f2202"} Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.300187 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xfjbc" event={"ID":"91286214-725d-4737-b3d0-ddc623a822a5","Type":"ContainerStarted","Data":"b664062f8079bd223dc49e33078611e48a8a728cc8a32d56308a3c108cf1c6b5"} Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.300219 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xfjbc" event={"ID":"91286214-725d-4737-b3d0-ddc623a822a5","Type":"ContainerStarted","Data":"86f0b70b9b5adcda89a26f463eb9c1c64f009ad2c51d777d742c30de5f9a3b15"} Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.308079 4917 generic.go:334] "Generic (PLEG): container finished" podID="0549cd0a-fd3c-401a-8caf-9e0348faccb9" containerID="5f0678f82975df2dc116603e4b78f74d3b22b82db440e4024b19859d27f1ebf7" exitCode=0 Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.308406 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25be-account-create-update-xsm5n" event={"ID":"0549cd0a-fd3c-401a-8caf-9e0348faccb9","Type":"ContainerDied","Data":"5f0678f82975df2dc116603e4b78f74d3b22b82db440e4024b19859d27f1ebf7"} Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.308435 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25be-account-create-update-xsm5n" event={"ID":"0549cd0a-fd3c-401a-8caf-9e0348faccb9","Type":"ContainerStarted","Data":"8df4cbeb36b1cfb4c3d57223f253846227ca47e31f1865907a1b82a91c3c6c13"} Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.315366 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nj8qx" 
event={"ID":"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612","Type":"ContainerStarted","Data":"4f95bf531ae7f8eda3c1949ee691b3a05cc67a9583bc60a9e9b52cefea3d7d4a"} Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.315411 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nj8qx" event={"ID":"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612","Type":"ContainerStarted","Data":"aa7504cfdffa131d76fd926a3ef4b828e71aa7a36840487233851ea0b40702ed"} Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.343322 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-xfjbc" podStartSLOduration=2.343303902 podStartE2EDuration="2.343303902s" podCreationTimestamp="2026-03-18 08:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:20:36.338499065 +0000 UTC m=+5621.279653779" watchObservedRunningTime="2026-03-18 08:20:36.343303902 +0000 UTC m=+5621.284458616" Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.382961 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-nj8qx" podStartSLOduration=2.382942563 podStartE2EDuration="2.382942563s" podCreationTimestamp="2026-03-18 08:20:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:20:36.364199248 +0000 UTC m=+5621.305353962" watchObservedRunningTime="2026-03-18 08:20:36.382942563 +0000 UTC m=+5621.324097287" Mar 18 08:20:36 crc kubenswrapper[4917]: W0318 08:20:36.491083 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode472482c_2d27_49bc_816c_29e6cd155b1d.slice/crio-1ac8e6e13625ce4691d8a6fae2ffbfbff8297f5e7c15894418c750cba64f5f89 WatchSource:0}: Error finding container 
1ac8e6e13625ce4691d8a6fae2ffbfbff8297f5e7c15894418c750cba64f5f89: Status 404 returned error can't find the container with id 1ac8e6e13625ce4691d8a6fae2ffbfbff8297f5e7c15894418c750cba64f5f89 Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.491232 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-d6b7-account-create-update-c8b9k"] Mar 18 08:20:36 crc kubenswrapper[4917]: I0318 08:20:36.504183 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a0e3-account-create-update-wvbdw"] Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.332885 4917 generic.go:334] "Generic (PLEG): container finished" podID="91286214-725d-4737-b3d0-ddc623a822a5" containerID="b664062f8079bd223dc49e33078611e48a8a728cc8a32d56308a3c108cf1c6b5" exitCode=0 Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.332960 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xfjbc" event={"ID":"91286214-725d-4737-b3d0-ddc623a822a5","Type":"ContainerDied","Data":"b664062f8079bd223dc49e33078611e48a8a728cc8a32d56308a3c108cf1c6b5"} Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.339018 4917 generic.go:334] "Generic (PLEG): container finished" podID="001a88e1-5ca3-42d1-a8f9-3f7ff1da3612" containerID="4f95bf531ae7f8eda3c1949ee691b3a05cc67a9583bc60a9e9b52cefea3d7d4a" exitCode=0 Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.339164 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nj8qx" event={"ID":"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612","Type":"ContainerDied","Data":"4f95bf531ae7f8eda3c1949ee691b3a05cc67a9583bc60a9e9b52cefea3d7d4a"} Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.341746 4917 generic.go:334] "Generic (PLEG): container finished" podID="e472482c-2d27-49bc-816c-29e6cd155b1d" containerID="a07cba8f2bf2f194d04b9cc291b68f59a5f81b62b1efce386af26283985188a0" exitCode=0 Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.341847 
4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" event={"ID":"e472482c-2d27-49bc-816c-29e6cd155b1d","Type":"ContainerDied","Data":"a07cba8f2bf2f194d04b9cc291b68f59a5f81b62b1efce386af26283985188a0"} Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.341880 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" event={"ID":"e472482c-2d27-49bc-816c-29e6cd155b1d","Type":"ContainerStarted","Data":"1ac8e6e13625ce4691d8a6fae2ffbfbff8297f5e7c15894418c750cba64f5f89"} Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.345033 4917 generic.go:334] "Generic (PLEG): container finished" podID="702e1fec-e89d-4ff3-9e73-25e19d2d1fdd" containerID="06d880291bfbccd6b0c51cb447fbaddd8bebd8e00c143310d62f7d3c313ac133" exitCode=0 Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.345119 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" event={"ID":"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd","Type":"ContainerDied","Data":"06d880291bfbccd6b0c51cb447fbaddd8bebd8e00c143310d62f7d3c313ac133"} Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.345179 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" event={"ID":"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd","Type":"ContainerStarted","Data":"905c168d434c3c34e280f247c3b6499cf5481585b397988abbf6add55cb13ae1"} Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.824195 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.830446 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.861983 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2741122-0601-450f-832a-ce3d4f5f3d01-operator-scripts\") pod \"d2741122-0601-450f-832a-ce3d4f5f3d01\" (UID: \"d2741122-0601-450f-832a-ce3d4f5f3d01\") " Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.862074 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfrx9\" (UniqueName: \"kubernetes.io/projected/d2741122-0601-450f-832a-ce3d4f5f3d01-kube-api-access-tfrx9\") pod \"d2741122-0601-450f-832a-ce3d4f5f3d01\" (UID: \"d2741122-0601-450f-832a-ce3d4f5f3d01\") " Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.864792 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2741122-0601-450f-832a-ce3d4f5f3d01-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d2741122-0601-450f-832a-ce3d4f5f3d01" (UID: "d2741122-0601-450f-832a-ce3d4f5f3d01"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.871423 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2741122-0601-450f-832a-ce3d4f5f3d01-kube-api-access-tfrx9" (OuterVolumeSpecName: "kube-api-access-tfrx9") pod "d2741122-0601-450f-832a-ce3d4f5f3d01" (UID: "d2741122-0601-450f-832a-ce3d4f5f3d01"). InnerVolumeSpecName "kube-api-access-tfrx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.963871 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn2xb\" (UniqueName: \"kubernetes.io/projected/0549cd0a-fd3c-401a-8caf-9e0348faccb9-kube-api-access-hn2xb\") pod \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\" (UID: \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\") " Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.963978 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0549cd0a-fd3c-401a-8caf-9e0348faccb9-operator-scripts\") pod \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\" (UID: \"0549cd0a-fd3c-401a-8caf-9e0348faccb9\") " Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.964660 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0549cd0a-fd3c-401a-8caf-9e0348faccb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0549cd0a-fd3c-401a-8caf-9e0348faccb9" (UID: "0549cd0a-fd3c-401a-8caf-9e0348faccb9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.964779 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfrx9\" (UniqueName: \"kubernetes.io/projected/d2741122-0601-450f-832a-ce3d4f5f3d01-kube-api-access-tfrx9\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.964802 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d2741122-0601-450f-832a-ce3d4f5f3d01-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.964817 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0549cd0a-fd3c-401a-8caf-9e0348faccb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:37 crc kubenswrapper[4917]: I0318 08:20:37.966919 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0549cd0a-fd3c-401a-8caf-9e0348faccb9-kube-api-access-hn2xb" (OuterVolumeSpecName: "kube-api-access-hn2xb") pod "0549cd0a-fd3c-401a-8caf-9e0348faccb9" (UID: "0549cd0a-fd3c-401a-8caf-9e0348faccb9"). InnerVolumeSpecName "kube-api-access-hn2xb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.066559 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn2xb\" (UniqueName: \"kubernetes.io/projected/0549cd0a-fd3c-401a-8caf-9e0348faccb9-kube-api-access-hn2xb\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.357339 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-25be-account-create-update-xsm5n" event={"ID":"0549cd0a-fd3c-401a-8caf-9e0348faccb9","Type":"ContainerDied","Data":"8df4cbeb36b1cfb4c3d57223f253846227ca47e31f1865907a1b82a91c3c6c13"} Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.358433 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8df4cbeb36b1cfb4c3d57223f253846227ca47e31f1865907a1b82a91c3c6c13" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.358527 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-25be-account-create-update-xsm5n" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.361266 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-fkll6" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.361790 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-fkll6" event={"ID":"d2741122-0601-450f-832a-ce3d4f5f3d01","Type":"ContainerDied","Data":"04b5ea474aa6c18f54426b912e3f5536300782f30a684c63d698ae170805324c"} Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.361879 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04b5ea474aa6c18f54426b912e3f5536300782f30a684c63d698ae170805324c" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.682985 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.781110 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6wlm\" (UniqueName: \"kubernetes.io/projected/91286214-725d-4737-b3d0-ddc623a822a5-kube-api-access-t6wlm\") pod \"91286214-725d-4737-b3d0-ddc623a822a5\" (UID: \"91286214-725d-4737-b3d0-ddc623a822a5\") " Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.781519 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91286214-725d-4737-b3d0-ddc623a822a5-operator-scripts\") pod \"91286214-725d-4737-b3d0-ddc623a822a5\" (UID: \"91286214-725d-4737-b3d0-ddc623a822a5\") " Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.782982 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91286214-725d-4737-b3d0-ddc623a822a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "91286214-725d-4737-b3d0-ddc623a822a5" (UID: "91286214-725d-4737-b3d0-ddc623a822a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.788860 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91286214-725d-4737-b3d0-ddc623a822a5-kube-api-access-t6wlm" (OuterVolumeSpecName: "kube-api-access-t6wlm") pod "91286214-725d-4737-b3d0-ddc623a822a5" (UID: "91286214-725d-4737-b3d0-ddc623a822a5"). InnerVolumeSpecName "kube-api-access-t6wlm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.883944 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6wlm\" (UniqueName: \"kubernetes.io/projected/91286214-725d-4737-b3d0-ddc623a822a5-kube-api-access-t6wlm\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.884181 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/91286214-725d-4737-b3d0-ddc623a822a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.892690 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.898139 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.902356 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.985643 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hclth\" (UniqueName: \"kubernetes.io/projected/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-kube-api-access-hclth\") pod \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\" (UID: \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\") " Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.985690 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kjwr\" (UniqueName: \"kubernetes.io/projected/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-kube-api-access-6kjwr\") pod \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\" (UID: \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\") " Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.985801 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e472482c-2d27-49bc-816c-29e6cd155b1d-operator-scripts\") pod \"e472482c-2d27-49bc-816c-29e6cd155b1d\" (UID: \"e472482c-2d27-49bc-816c-29e6cd155b1d\") " Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.985844 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-operator-scripts\") pod \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\" (UID: \"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612\") " Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.985981 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-operator-scripts\") pod \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\" (UID: \"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd\") " Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.986015 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-gwpds\" (UniqueName: \"kubernetes.io/projected/e472482c-2d27-49bc-816c-29e6cd155b1d-kube-api-access-gwpds\") pod \"e472482c-2d27-49bc-816c-29e6cd155b1d\" (UID: \"e472482c-2d27-49bc-816c-29e6cd155b1d\") " Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.986384 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "001a88e1-5ca3-42d1-a8f9-3f7ff1da3612" (UID: "001a88e1-5ca3-42d1-a8f9-3f7ff1da3612"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.986438 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e472482c-2d27-49bc-816c-29e6cd155b1d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e472482c-2d27-49bc-816c-29e6cd155b1d" (UID: "e472482c-2d27-49bc-816c-29e6cd155b1d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.986495 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "702e1fec-e89d-4ff3-9e73-25e19d2d1fdd" (UID: "702e1fec-e89d-4ff3-9e73-25e19d2d1fdd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.988860 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-kube-api-access-hclth" (OuterVolumeSpecName: "kube-api-access-hclth") pod "001a88e1-5ca3-42d1-a8f9-3f7ff1da3612" (UID: "001a88e1-5ca3-42d1-a8f9-3f7ff1da3612"). 
InnerVolumeSpecName "kube-api-access-hclth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.993146 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e472482c-2d27-49bc-816c-29e6cd155b1d-kube-api-access-gwpds" (OuterVolumeSpecName: "kube-api-access-gwpds") pod "e472482c-2d27-49bc-816c-29e6cd155b1d" (UID: "e472482c-2d27-49bc-816c-29e6cd155b1d"). InnerVolumeSpecName "kube-api-access-gwpds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:20:38 crc kubenswrapper[4917]: I0318 08:20:38.993273 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-kube-api-access-6kjwr" (OuterVolumeSpecName: "kube-api-access-6kjwr") pod "702e1fec-e89d-4ff3-9e73-25e19d2d1fdd" (UID: "702e1fec-e89d-4ff3-9e73-25e19d2d1fdd"). InnerVolumeSpecName "kube-api-access-6kjwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.087729 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.087764 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwpds\" (UniqueName: \"kubernetes.io/projected/e472482c-2d27-49bc-816c-29e6cd155b1d-kube-api-access-gwpds\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.087779 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hclth\" (UniqueName: \"kubernetes.io/projected/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-kube-api-access-hclth\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.087791 4917 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6kjwr\" (UniqueName: \"kubernetes.io/projected/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd-kube-api-access-6kjwr\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.087802 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e472482c-2d27-49bc-816c-29e6cd155b1d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.087810 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.372331 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-xfjbc" event={"ID":"91286214-725d-4737-b3d0-ddc623a822a5","Type":"ContainerDied","Data":"86f0b70b9b5adcda89a26f463eb9c1c64f009ad2c51d777d742c30de5f9a3b15"} Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.372377 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-xfjbc" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.372406 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86f0b70b9b5adcda89a26f463eb9c1c64f009ad2c51d777d742c30de5f9a3b15" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.374446 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nj8qx" event={"ID":"001a88e1-5ca3-42d1-a8f9-3f7ff1da3612","Type":"ContainerDied","Data":"aa7504cfdffa131d76fd926a3ef4b828e71aa7a36840487233851ea0b40702ed"} Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.374477 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa7504cfdffa131d76fd926a3ef4b828e71aa7a36840487233851ea0b40702ed" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.374540 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nj8qx" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.383988 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" event={"ID":"e472482c-2d27-49bc-816c-29e6cd155b1d","Type":"ContainerDied","Data":"1ac8e6e13625ce4691d8a6fae2ffbfbff8297f5e7c15894418c750cba64f5f89"} Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.384060 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ac8e6e13625ce4691d8a6fae2ffbfbff8297f5e7c15894418c750cba64f5f89" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.384177 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-d6b7-account-create-update-c8b9k" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.388045 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" event={"ID":"702e1fec-e89d-4ff3-9e73-25e19d2d1fdd","Type":"ContainerDied","Data":"905c168d434c3c34e280f247c3b6499cf5481585b397988abbf6add55cb13ae1"} Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.388104 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="905c168d434c3c34e280f247c3b6499cf5481585b397988abbf6add55cb13ae1" Mar 18 08:20:39 crc kubenswrapper[4917]: I0318 08:20:39.388195 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a0e3-account-create-update-wvbdw" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.665481 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrwwh"] Mar 18 08:20:44 crc kubenswrapper[4917]: E0318 08:20:44.666480 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2741122-0601-450f-832a-ce3d4f5f3d01" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666498 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2741122-0601-450f-832a-ce3d4f5f3d01" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: E0318 08:20:44.666512 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702e1fec-e89d-4ff3-9e73-25e19d2d1fdd" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666522 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="702e1fec-e89d-4ff3-9e73-25e19d2d1fdd" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc kubenswrapper[4917]: E0318 08:20:44.666546 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0549cd0a-fd3c-401a-8caf-9e0348faccb9" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666553 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0549cd0a-fd3c-401a-8caf-9e0348faccb9" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc kubenswrapper[4917]: E0318 08:20:44.666574 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001a88e1-5ca3-42d1-a8f9-3f7ff1da3612" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666600 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="001a88e1-5ca3-42d1-a8f9-3f7ff1da3612" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: E0318 08:20:44.666615 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e472482c-2d27-49bc-816c-29e6cd155b1d" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666623 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e472482c-2d27-49bc-816c-29e6cd155b1d" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc kubenswrapper[4917]: E0318 08:20:44.666634 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91286214-725d-4737-b3d0-ddc623a822a5" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666641 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="91286214-725d-4737-b3d0-ddc623a822a5" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666839 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="001a88e1-5ca3-42d1-a8f9-3f7ff1da3612" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666859 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e472482c-2d27-49bc-816c-29e6cd155b1d" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc 
kubenswrapper[4917]: I0318 08:20:44.666876 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2741122-0601-450f-832a-ce3d4f5f3d01" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666888 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="702e1fec-e89d-4ff3-9e73-25e19d2d1fdd" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666905 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="91286214-725d-4737-b3d0-ddc623a822a5" containerName="mariadb-database-create" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.666920 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0549cd0a-fd3c-401a-8caf-9e0348faccb9" containerName="mariadb-account-create-update" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.667727 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.675175 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.684539 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.684918 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ttr6m" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.692658 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrwwh"] Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.825956 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-scripts\") pod 
\"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.826024 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-config-data\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.826065 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.826145 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74m9k\" (UniqueName: \"kubernetes.io/projected/3ed9261d-f69b-4758-97e2-156e4ad13843-kube-api-access-74m9k\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.928508 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74m9k\" (UniqueName: \"kubernetes.io/projected/3ed9261d-f69b-4758-97e2-156e4ad13843-kube-api-access-74m9k\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.928655 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-scripts\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.928724 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-config-data\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.928841 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.935163 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-scripts\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.935220 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.944270 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-config-data\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.950932 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74m9k\" (UniqueName: \"kubernetes.io/projected/3ed9261d-f69b-4758-97e2-156e4ad13843-kube-api-access-74m9k\") pod \"nova-cell0-conductor-db-sync-mrwwh\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:44 crc kubenswrapper[4917]: I0318 08:20:44.989309 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:20:45 crc kubenswrapper[4917]: I0318 08:20:45.453787 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrwwh"] Mar 18 08:20:45 crc kubenswrapper[4917]: W0318 08:20:45.458618 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ed9261d_f69b_4758_97e2_156e4ad13843.slice/crio-c0524ae48b87a94d7f9653e5a13d238e67ce10255b93a75b804e128062ed117c WatchSource:0}: Error finding container c0524ae48b87a94d7f9653e5a13d238e67ce10255b93a75b804e128062ed117c: Status 404 returned error can't find the container with id c0524ae48b87a94d7f9653e5a13d238e67ce10255b93a75b804e128062ed117c Mar 18 08:20:46 crc kubenswrapper[4917]: I0318 08:20:46.459660 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" event={"ID":"3ed9261d-f69b-4758-97e2-156e4ad13843","Type":"ContainerStarted","Data":"c0524ae48b87a94d7f9653e5a13d238e67ce10255b93a75b804e128062ed117c"} Mar 18 08:20:47 crc kubenswrapper[4917]: I0318 08:20:47.774020 4917 scope.go:117] "RemoveContainer" 
containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:20:47 crc kubenswrapper[4917]: E0318 08:20:47.774346 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:20:54 crc kubenswrapper[4917]: I0318 08:20:54.534509 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" event={"ID":"3ed9261d-f69b-4758-97e2-156e4ad13843","Type":"ContainerStarted","Data":"609ab9fb6d7e0e7ec7909a91693f05d4960d5ad597a2350644394478e2988fa5"} Mar 18 08:20:54 crc kubenswrapper[4917]: I0318 08:20:54.562292 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" podStartSLOduration=2.378878447 podStartE2EDuration="10.562267744s" podCreationTimestamp="2026-03-18 08:20:44 +0000 UTC" firstStartedPulling="2026-03-18 08:20:45.461168441 +0000 UTC m=+5630.402323155" lastFinishedPulling="2026-03-18 08:20:53.644557738 +0000 UTC m=+5638.585712452" observedRunningTime="2026-03-18 08:20:54.555978252 +0000 UTC m=+5639.497133056" watchObservedRunningTime="2026-03-18 08:20:54.562267744 +0000 UTC m=+5639.503422498" Mar 18 08:20:59 crc kubenswrapper[4917]: I0318 08:20:59.587757 4917 generic.go:334] "Generic (PLEG): container finished" podID="3ed9261d-f69b-4758-97e2-156e4ad13843" containerID="609ab9fb6d7e0e7ec7909a91693f05d4960d5ad597a2350644394478e2988fa5" exitCode=0 Mar 18 08:20:59 crc kubenswrapper[4917]: I0318 08:20:59.587862 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" 
event={"ID":"3ed9261d-f69b-4758-97e2-156e4ad13843","Type":"ContainerDied","Data":"609ab9fb6d7e0e7ec7909a91693f05d4960d5ad597a2350644394478e2988fa5"} Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.106388 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.154039 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-scripts\") pod \"3ed9261d-f69b-4758-97e2-156e4ad13843\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.154404 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74m9k\" (UniqueName: \"kubernetes.io/projected/3ed9261d-f69b-4758-97e2-156e4ad13843-kube-api-access-74m9k\") pod \"3ed9261d-f69b-4758-97e2-156e4ad13843\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.154707 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-combined-ca-bundle\") pod \"3ed9261d-f69b-4758-97e2-156e4ad13843\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.155057 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-config-data\") pod \"3ed9261d-f69b-4758-97e2-156e4ad13843\" (UID: \"3ed9261d-f69b-4758-97e2-156e4ad13843\") " Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.160667 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-scripts" (OuterVolumeSpecName: "scripts") pod 
"3ed9261d-f69b-4758-97e2-156e4ad13843" (UID: "3ed9261d-f69b-4758-97e2-156e4ad13843"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.160725 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed9261d-f69b-4758-97e2-156e4ad13843-kube-api-access-74m9k" (OuterVolumeSpecName: "kube-api-access-74m9k") pod "3ed9261d-f69b-4758-97e2-156e4ad13843" (UID: "3ed9261d-f69b-4758-97e2-156e4ad13843"). InnerVolumeSpecName "kube-api-access-74m9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.180401 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ed9261d-f69b-4758-97e2-156e4ad13843" (UID: "3ed9261d-f69b-4758-97e2-156e4ad13843"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.191384 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-config-data" (OuterVolumeSpecName: "config-data") pod "3ed9261d-f69b-4758-97e2-156e4ad13843" (UID: "3ed9261d-f69b-4758-97e2-156e4ad13843"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.258010 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.258054 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74m9k\" (UniqueName: \"kubernetes.io/projected/3ed9261d-f69b-4758-97e2-156e4ad13843-kube-api-access-74m9k\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.258070 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.258080 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ed9261d-f69b-4758-97e2-156e4ad13843-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.615696 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" event={"ID":"3ed9261d-f69b-4758-97e2-156e4ad13843","Type":"ContainerDied","Data":"c0524ae48b87a94d7f9653e5a13d238e67ce10255b93a75b804e128062ed117c"} Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.616006 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0524ae48b87a94d7f9653e5a13d238e67ce10255b93a75b804e128062ed117c" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.615755 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mrwwh" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.696085 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 08:21:01 crc kubenswrapper[4917]: E0318 08:21:01.696561 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed9261d-f69b-4758-97e2-156e4ad13843" containerName="nova-cell0-conductor-db-sync" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.696600 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed9261d-f69b-4758-97e2-156e4ad13843" containerName="nova-cell0-conductor-db-sync" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.696865 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed9261d-f69b-4758-97e2-156e4ad13843" containerName="nova-cell0-conductor-db-sync" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.697651 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.701265 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-ttr6m" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.701643 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.707757 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.767242 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 
08:21:01.767343 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgtrt\" (UniqueName: \"kubernetes.io/projected/fb39efb5-af75-4cea-a6c7-68f10cf1732c-kube-api-access-pgtrt\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.767385 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.773882 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:21:01 crc kubenswrapper[4917]: E0318 08:21:01.774128 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.869151 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.869249 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgtrt\" (UniqueName: 
\"kubernetes.io/projected/fb39efb5-af75-4cea-a6c7-68f10cf1732c-kube-api-access-pgtrt\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.869286 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.875224 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.880279 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:01 crc kubenswrapper[4917]: I0318 08:21:01.886540 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgtrt\" (UniqueName: \"kubernetes.io/projected/fb39efb5-af75-4cea-a6c7-68f10cf1732c-kube-api-access-pgtrt\") pod \"nova-cell0-conductor-0\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") " pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:02 crc kubenswrapper[4917]: I0318 08:21:02.026724 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:02 crc kubenswrapper[4917]: I0318 08:21:02.514968 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 08:21:02 crc kubenswrapper[4917]: I0318 08:21:02.634375 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fb39efb5-af75-4cea-a6c7-68f10cf1732c","Type":"ContainerStarted","Data":"617e4b560411148dab62f735a40f03ea9fcbbd4213545cfc85a843a1c0b9766c"} Mar 18 08:21:03 crc kubenswrapper[4917]: I0318 08:21:03.651803 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fb39efb5-af75-4cea-a6c7-68f10cf1732c","Type":"ContainerStarted","Data":"57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4"} Mar 18 08:21:03 crc kubenswrapper[4917]: I0318 08:21:03.652691 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:03 crc kubenswrapper[4917]: I0318 08:21:03.686820 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.686787405 podStartE2EDuration="2.686787405s" podCreationTimestamp="2026-03-18 08:21:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:03.679802976 +0000 UTC m=+5648.620957750" watchObservedRunningTime="2026-03-18 08:21:03.686787405 +0000 UTC m=+5648.627942159" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.054897 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.671465 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-r4467"] Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.672735 4917 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.676436 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.676908 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.688745 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-r4467"] Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.694081 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.694132 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-scripts\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.694308 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-config-data\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.694762 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xtrcw\" (UniqueName: \"kubernetes.io/projected/884cd94d-843d-4872-b61a-caa5a9f39d3c-kube-api-access-xtrcw\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.804074 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtrcw\" (UniqueName: \"kubernetes.io/projected/884cd94d-843d-4872-b61a-caa5a9f39d3c-kube-api-access-xtrcw\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.804187 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.804233 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-scripts\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.804363 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-config-data\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.812622 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:07 crc 
kubenswrapper[4917]: I0318 08:21:07.813927 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.818570 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.838685 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-scripts\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.848228 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.852751 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.854324 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.854442 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtrcw\" (UniqueName: \"kubernetes.io/projected/884cd94d-843d-4872-b61a-caa5a9f39d3c-kube-api-access-xtrcw\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.859505 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.866483 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-config-data\") pod \"nova-cell0-cell-mapping-r4467\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.876864 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.903748 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.905718 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.905784 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.905844 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x66fz\" (UniqueName: \"kubernetes.io/projected/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-kube-api-access-x66fz\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.905895 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfcqt\" (UniqueName: \"kubernetes.io/projected/d2048ba1-f4fc-49b6-93e6-69782915e13f-kube-api-access-wfcqt\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.905918 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-config-data\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.905939 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-config-data\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:07 crc kubenswrapper[4917]: I0318 08:21:07.905964 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2048ba1-f4fc-49b6-93e6-69782915e13f-logs\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 
08:21:07.997380 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.011316 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x66fz\" (UniqueName: \"kubernetes.io/projected/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-kube-api-access-x66fz\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.011419 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfcqt\" (UniqueName: \"kubernetes.io/projected/d2048ba1-f4fc-49b6-93e6-69782915e13f-kube-api-access-wfcqt\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.011455 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-config-data\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.011483 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-config-data\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.011522 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2048ba1-f4fc-49b6-93e6-69782915e13f-logs\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.011567 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.011641 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.024191 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2048ba1-f4fc-49b6-93e6-69782915e13f-logs\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.026837 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-config-data\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.029995 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.031553 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.055651 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x66fz\" (UniqueName: \"kubernetes.io/projected/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-kube-api-access-x66fz\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.057305 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.060405 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.062641 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfcqt\" (UniqueName: \"kubernetes.io/projected/d2048ba1-f4fc-49b6-93e6-69782915e13f-kube-api-access-wfcqt\") pod \"nova-api-0\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.074449 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.085275 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.130595 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-config-data\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.130656 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.130686 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-logs\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.130727 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98cdx\" (UniqueName: \"kubernetes.io/projected/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-kube-api-access-98cdx\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.136221 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.138400 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-config-data\") pod \"nova-scheduler-0\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.142319 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.160216 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.185131 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64b44d78fc-dcmbb"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.186872 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.222672 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232370 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5w56\" (UniqueName: \"kubernetes.io/projected/f4f42d03-8055-41d2-a094-2795879fac0f-kube-api-access-x5w56\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232425 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232465 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-config-data\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232505 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232534 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-dns-svc\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232558 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq9sm\" (UniqueName: \"kubernetes.io/projected/8b02e527-0708-4e8f-869b-49991c87958c-kube-api-access-gq9sm\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232579 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-logs\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232638 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98cdx\" (UniqueName: \"kubernetes.io/projected/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-kube-api-access-98cdx\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232694 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-config\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232711 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-sb\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232767 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.232797 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-nb\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.233499 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-logs\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.239244 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-config-data\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.241854 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.243032 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b44d78fc-dcmbb"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.251657 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.253910 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98cdx\" (UniqueName: \"kubernetes.io/projected/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-kube-api-access-98cdx\") pod \"nova-metadata-0\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.266654 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.334617 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.334679 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-nb\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.334732 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5w56\" (UniqueName: \"kubernetes.io/projected/f4f42d03-8055-41d2-a094-2795879fac0f-kube-api-access-x5w56\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.334758 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.334808 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-dns-svc\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc 
kubenswrapper[4917]: I0318 08:21:08.334837 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq9sm\" (UniqueName: \"kubernetes.io/projected/8b02e527-0708-4e8f-869b-49991c87958c-kube-api-access-gq9sm\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.334914 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-config\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.334937 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-sb\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.335887 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-sb\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.339077 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-nb\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.340127 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-dns-svc\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.340521 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.341636 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-config\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.342411 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.360132 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5w56\" (UniqueName: \"kubernetes.io/projected/f4f42d03-8055-41d2-a094-2795879fac0f-kube-api-access-x5w56\") pod \"dnsmasq-dns-64b44d78fc-dcmbb\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.360540 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq9sm\" (UniqueName: 
\"kubernetes.io/projected/8b02e527-0708-4e8f-869b-49991c87958c-kube-api-access-gq9sm\") pod \"nova-cell1-novncproxy-0\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.505678 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.522000 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.531812 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.617742 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-r4467"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.716459 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-r4467" event={"ID":"884cd94d-843d-4872-b61a-caa5a9f39d3c","Type":"ContainerStarted","Data":"0cfd3c21a309fb81383e8b8a4127b5d83ff72a33242a65380dc3d12af8c7e37c"} Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.820910 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n6nkf"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.822211 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.824818 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.825061 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.830005 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.837855 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n6nkf"] Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.854228 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-scripts\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.854419 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.854466 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4wh8\" (UniqueName: \"kubernetes.io/projected/4d282ff2-1c74-458b-a879-c01075f8f136-kube-api-access-q4wh8\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " 
pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.854504 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-config-data\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.887034 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.955850 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-scripts\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.956225 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.956258 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4wh8\" (UniqueName: \"kubernetes.io/projected/4d282ff2-1c74-458b-a879-c01075f8f136-kube-api-access-q4wh8\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.956284 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-config-data\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.961765 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-scripts\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.969940 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-config-data\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.970161 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:08 crc kubenswrapper[4917]: I0318 08:21:08.981826 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4wh8\" (UniqueName: \"kubernetes.io/projected/4d282ff2-1c74-458b-a879-c01075f8f136-kube-api-access-q4wh8\") pod \"nova-cell1-conductor-db-sync-n6nkf\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.081518 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.145092 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.262675 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 08:21:09 crc kubenswrapper[4917]: W0318 08:21:09.266188 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b02e527_0708_4e8f_869b_49991c87958c.slice/crio-2e71d8ac323595c3e4a5270daa09d87c97277f97534062c1d92b5fb52a5c3ae7 WatchSource:0}: Error finding container 2e71d8ac323595c3e4a5270daa09d87c97277f97534062c1d92b5fb52a5c3ae7: Status 404 returned error can't find the container with id 2e71d8ac323595c3e4a5270daa09d87c97277f97534062c1d92b5fb52a5c3ae7 Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.288548 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.410861 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64b44d78fc-dcmbb"] Mar 18 08:21:09 crc kubenswrapper[4917]: W0318 08:21:09.429016 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f42d03_8055_41d2_a094_2795879fac0f.slice/crio-6b98a91601d7b644bffce5f74b6269dbfcc2bce91260f7d54c7a4684870712a2 WatchSource:0}: Error finding container 6b98a91601d7b644bffce5f74b6269dbfcc2bce91260f7d54c7a4684870712a2: Status 404 returned error can't find the container with id 6b98a91601d7b644bffce5f74b6269dbfcc2bce91260f7d54c7a4684870712a2 Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.682834 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n6nkf"] Mar 18 08:21:09 crc kubenswrapper[4917]: 
I0318 08:21:09.816059 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01a395c0-4c37-46e4-9d1a-cad0fe7b4885","Type":"ContainerStarted","Data":"1364c6e5b139c413bd8e97ca54963ec11c5e60d1a499e6ce031cc77524764c42"} Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.817652 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d96c1cb-dd21-4d62-9256-4c21bcbfca32","Type":"ContainerStarted","Data":"fd606d1e2745ed8b28897c818bbc55eda84a77e75204193638c329465ca8b4e6"} Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.817743 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" event={"ID":"4d282ff2-1c74-458b-a879-c01075f8f136","Type":"ContainerStarted","Data":"1ad2986f3f431e3cb2937177c15f8844e8176dc5bd0ebb384bd5523db233fcef"} Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.830530 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b02e527-0708-4e8f-869b-49991c87958c","Type":"ContainerStarted","Data":"2e71d8ac323595c3e4a5270daa09d87c97277f97534062c1d92b5fb52a5c3ae7"} Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.832659 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" event={"ID":"f4f42d03-8055-41d2-a094-2795879fac0f","Type":"ContainerStarted","Data":"6b98a91601d7b644bffce5f74b6269dbfcc2bce91260f7d54c7a4684870712a2"} Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.836142 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-r4467" event={"ID":"884cd94d-843d-4872-b61a-caa5a9f39d3c","Type":"ContainerStarted","Data":"bae5b3dfbf128b43befe51d9cd05ba1f2ba369e66724cc6a761a17534c62a93f"} Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.838354 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d2048ba1-f4fc-49b6-93e6-69782915e13f","Type":"ContainerStarted","Data":"e7e5adca22baa890482f4347b3c37a6839d8c4088addebd3b3f2467fe0336aad"} Mar 18 08:21:09 crc kubenswrapper[4917]: I0318 08:21:09.867033 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-r4467" podStartSLOduration=2.8670171509999998 podStartE2EDuration="2.867017151s" podCreationTimestamp="2026-03-18 08:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:09.865737719 +0000 UTC m=+5654.806892433" watchObservedRunningTime="2026-03-18 08:21:09.867017151 +0000 UTC m=+5654.808171855" Mar 18 08:21:10 crc kubenswrapper[4917]: I0318 08:21:10.852733 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4f42d03-8055-41d2-a094-2795879fac0f" containerID="8f4b0c6cbcbe8cb448c0e814455a023e956292e0a6331cf93393314cdaf04871" exitCode=0 Mar 18 08:21:10 crc kubenswrapper[4917]: I0318 08:21:10.853343 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" event={"ID":"f4f42d03-8055-41d2-a094-2795879fac0f","Type":"ContainerDied","Data":"8f4b0c6cbcbe8cb448c0e814455a023e956292e0a6331cf93393314cdaf04871"} Mar 18 08:21:10 crc kubenswrapper[4917]: I0318 08:21:10.860988 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" event={"ID":"4d282ff2-1c74-458b-a879-c01075f8f136","Type":"ContainerStarted","Data":"aa4c8355032f08349a9fb5c70ed5784082d5e7beae1e22d888e720ec008cd596"} Mar 18 08:21:10 crc kubenswrapper[4917]: I0318 08:21:10.905820 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" podStartSLOduration=2.905789494 podStartE2EDuration="2.905789494s" podCreationTimestamp="2026-03-18 08:21:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:10.897424891 +0000 UTC m=+5655.838579635" watchObservedRunningTime="2026-03-18 08:21:10.905789494 +0000 UTC m=+5655.846944208" Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.259418 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.269810 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.878311 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" event={"ID":"f4f42d03-8055-41d2-a094-2795879fac0f","Type":"ContainerStarted","Data":"1f83e74f415788165f2245ebe9f12b53e625bace09100edd3e46077225459a5f"} Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.878831 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.880481 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2048ba1-f4fc-49b6-93e6-69782915e13f","Type":"ContainerStarted","Data":"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687"} Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.880542 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2048ba1-f4fc-49b6-93e6-69782915e13f","Type":"ContainerStarted","Data":"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713"} Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.882388 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01a395c0-4c37-46e4-9d1a-cad0fe7b4885","Type":"ContainerStarted","Data":"b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873"} Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.882420 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01a395c0-4c37-46e4-9d1a-cad0fe7b4885","Type":"ContainerStarted","Data":"ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d"} Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.882471 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerName="nova-metadata-log" containerID="cri-o://ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d" gracePeriod=30 Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.882491 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerName="nova-metadata-metadata" containerID="cri-o://b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873" gracePeriod=30 Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.884143 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d96c1cb-dd21-4d62-9256-4c21bcbfca32","Type":"ContainerStarted","Data":"50ecefe888f53dc6dc9088cae95be5067f50b75050b8cd982beb97f14d9d1aba"} Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.885609 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b02e527-0708-4e8f-869b-49991c87958c","Type":"ContainerStarted","Data":"a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b"} Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.885698 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="8b02e527-0708-4e8f-869b-49991c87958c" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b" gracePeriod=30 Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.918464 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.726876733 podStartE2EDuration="5.918448475s" podCreationTimestamp="2026-03-18 08:21:07 +0000 UTC" firstStartedPulling="2026-03-18 08:21:08.886822129 +0000 UTC m=+5653.827976843" lastFinishedPulling="2026-03-18 08:21:12.078393871 +0000 UTC m=+5657.019548585" observedRunningTime="2026-03-18 08:21:12.914215702 +0000 UTC m=+5657.855370436" watchObservedRunningTime="2026-03-18 08:21:12.918448475 +0000 UTC m=+5657.859603189" Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.921959 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" podStartSLOduration=5.9219484300000005 podStartE2EDuration="5.92194843s" podCreationTimestamp="2026-03-18 08:21:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:12.897713313 +0000 UTC m=+5657.838868017" watchObservedRunningTime="2026-03-18 08:21:12.92194843 +0000 UTC m=+5657.863103144" Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.946781 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.165019908 podStartE2EDuration="5.946756722s" podCreationTimestamp="2026-03-18 08:21:07 +0000 UTC" firstStartedPulling="2026-03-18 08:21:09.294393563 +0000 UTC m=+5654.235548277" lastFinishedPulling="2026-03-18 08:21:12.076130377 +0000 UTC m=+5657.017285091" observedRunningTime="2026-03-18 08:21:12.933140152 +0000 UTC m=+5657.874294866" watchObservedRunningTime="2026-03-18 08:21:12.946756722 +0000 UTC m=+5657.887911436" Mar 18 08:21:12 crc kubenswrapper[4917]: I0318 08:21:12.991706 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.069552163 podStartE2EDuration="5.991688192s" podCreationTimestamp="2026-03-18 
08:21:07 +0000 UTC" firstStartedPulling="2026-03-18 08:21:09.155676959 +0000 UTC m=+5654.096831673" lastFinishedPulling="2026-03-18 08:21:12.077812988 +0000 UTC m=+5657.018967702" observedRunningTime="2026-03-18 08:21:12.973560552 +0000 UTC m=+5657.914715266" watchObservedRunningTime="2026-03-18 08:21:12.991688192 +0000 UTC m=+5657.932842906" Mar 18 08:21:13 crc kubenswrapper[4917]: I0318 08:21:13.038673 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.230422063 podStartE2EDuration="6.03865057s" podCreationTimestamp="2026-03-18 08:21:07 +0000 UTC" firstStartedPulling="2026-03-18 08:21:09.269673763 +0000 UTC m=+5654.210828477" lastFinishedPulling="2026-03-18 08:21:12.07790226 +0000 UTC m=+5657.019056984" observedRunningTime="2026-03-18 08:21:13.01720129 +0000 UTC m=+5657.958356004" watchObservedRunningTime="2026-03-18 08:21:13.03865057 +0000 UTC m=+5657.979805294" Mar 18 08:21:13 crc kubenswrapper[4917]: I0318 08:21:13.253739 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 08:21:13 crc kubenswrapper[4917]: I0318 08:21:13.523323 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:13 crc kubenswrapper[4917]: I0318 08:21:13.775090 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:21:13 crc kubenswrapper[4917]: E0318 08:21:13.775519 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:21:13 crc 
kubenswrapper[4917]: I0318 08:21:13.900105 4917 generic.go:334] "Generic (PLEG): container finished" podID="4d282ff2-1c74-458b-a879-c01075f8f136" containerID="aa4c8355032f08349a9fb5c70ed5784082d5e7beae1e22d888e720ec008cd596" exitCode=0 Mar 18 08:21:13 crc kubenswrapper[4917]: I0318 08:21:13.900207 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" event={"ID":"4d282ff2-1c74-458b-a879-c01075f8f136","Type":"ContainerDied","Data":"aa4c8355032f08349a9fb5c70ed5784082d5e7beae1e22d888e720ec008cd596"} Mar 18 08:21:13 crc kubenswrapper[4917]: I0318 08:21:13.907880 4917 generic.go:334] "Generic (PLEG): container finished" podID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerID="ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d" exitCode=143 Mar 18 08:21:13 crc kubenswrapper[4917]: I0318 08:21:13.907953 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01a395c0-4c37-46e4-9d1a-cad0fe7b4885","Type":"ContainerDied","Data":"ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d"} Mar 18 08:21:14 crc kubenswrapper[4917]: I0318 08:21:14.923057 4917 generic.go:334] "Generic (PLEG): container finished" podID="884cd94d-843d-4872-b61a-caa5a9f39d3c" containerID="bae5b3dfbf128b43befe51d9cd05ba1f2ba369e66724cc6a761a17534c62a93f" exitCode=0 Mar 18 08:21:14 crc kubenswrapper[4917]: I0318 08:21:14.923157 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-r4467" event={"ID":"884cd94d-843d-4872-b61a-caa5a9f39d3c","Type":"ContainerDied","Data":"bae5b3dfbf128b43befe51d9cd05ba1f2ba369e66724cc6a761a17534c62a93f"} Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.344379 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.502573 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4wh8\" (UniqueName: \"kubernetes.io/projected/4d282ff2-1c74-458b-a879-c01075f8f136-kube-api-access-q4wh8\") pod \"4d282ff2-1c74-458b-a879-c01075f8f136\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.502692 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-config-data\") pod \"4d282ff2-1c74-458b-a879-c01075f8f136\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.502736 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-scripts\") pod \"4d282ff2-1c74-458b-a879-c01075f8f136\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.502882 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-combined-ca-bundle\") pod \"4d282ff2-1c74-458b-a879-c01075f8f136\" (UID: \"4d282ff2-1c74-458b-a879-c01075f8f136\") " Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.511792 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d282ff2-1c74-458b-a879-c01075f8f136-kube-api-access-q4wh8" (OuterVolumeSpecName: "kube-api-access-q4wh8") pod "4d282ff2-1c74-458b-a879-c01075f8f136" (UID: "4d282ff2-1c74-458b-a879-c01075f8f136"). InnerVolumeSpecName "kube-api-access-q4wh8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.512118 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-scripts" (OuterVolumeSpecName: "scripts") pod "4d282ff2-1c74-458b-a879-c01075f8f136" (UID: "4d282ff2-1c74-458b-a879-c01075f8f136"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.556793 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-config-data" (OuterVolumeSpecName: "config-data") pod "4d282ff2-1c74-458b-a879-c01075f8f136" (UID: "4d282ff2-1c74-458b-a879-c01075f8f136"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.559800 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d282ff2-1c74-458b-a879-c01075f8f136" (UID: "4d282ff2-1c74-458b-a879-c01075f8f136"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.605738 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.605840 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4wh8\" (UniqueName: \"kubernetes.io/projected/4d282ff2-1c74-458b-a879-c01075f8f136-kube-api-access-q4wh8\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.605862 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.605878 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d282ff2-1c74-458b-a879-c01075f8f136-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.932983 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" event={"ID":"4d282ff2-1c74-458b-a879-c01075f8f136","Type":"ContainerDied","Data":"1ad2986f3f431e3cb2937177c15f8844e8176dc5bd0ebb384bd5523db233fcef"} Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.933017 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-n6nkf" Mar 18 08:21:15 crc kubenswrapper[4917]: I0318 08:21:15.933039 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ad2986f3f431e3cb2937177c15f8844e8176dc5bd0ebb384bd5523db233fcef" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.002418 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 08:21:16 crc kubenswrapper[4917]: E0318 08:21:16.003155 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d282ff2-1c74-458b-a879-c01075f8f136" containerName="nova-cell1-conductor-db-sync" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.003173 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d282ff2-1c74-458b-a879-c01075f8f136" containerName="nova-cell1-conductor-db-sync" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.003398 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d282ff2-1c74-458b-a879-c01075f8f136" containerName="nova-cell1-conductor-db-sync" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.004059 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.006464 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.010969 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.063814 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7strl"] Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.076924 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7strl"] Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.116050 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c88tg\" (UniqueName: \"kubernetes.io/projected/4b871c34-192d-4fe7-84ca-d2728b5f3900-kube-api-access-c88tg\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.116102 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.116132 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.217804 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c88tg\" (UniqueName: \"kubernetes.io/projected/4b871c34-192d-4fe7-84ca-d2728b5f3900-kube-api-access-c88tg\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.217859 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.217887 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.225065 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.225989 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.233457 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c88tg\" (UniqueName: 
\"kubernetes.io/projected/4b871c34-192d-4fe7-84ca-d2728b5f3900-kube-api-access-c88tg\") pod \"nova-cell1-conductor-0\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") " pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.325725 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.337728 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.522853 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-config-data\") pod \"884cd94d-843d-4872-b61a-caa5a9f39d3c\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.522937 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtrcw\" (UniqueName: \"kubernetes.io/projected/884cd94d-843d-4872-b61a-caa5a9f39d3c-kube-api-access-xtrcw\") pod \"884cd94d-843d-4872-b61a-caa5a9f39d3c\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.523046 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-scripts\") pod \"884cd94d-843d-4872-b61a-caa5a9f39d3c\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.523169 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-combined-ca-bundle\") pod \"884cd94d-843d-4872-b61a-caa5a9f39d3c\" (UID: \"884cd94d-843d-4872-b61a-caa5a9f39d3c\") " Mar 18 
08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.529469 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-scripts" (OuterVolumeSpecName: "scripts") pod "884cd94d-843d-4872-b61a-caa5a9f39d3c" (UID: "884cd94d-843d-4872-b61a-caa5a9f39d3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.531448 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/884cd94d-843d-4872-b61a-caa5a9f39d3c-kube-api-access-xtrcw" (OuterVolumeSpecName: "kube-api-access-xtrcw") pod "884cd94d-843d-4872-b61a-caa5a9f39d3c" (UID: "884cd94d-843d-4872-b61a-caa5a9f39d3c"). InnerVolumeSpecName "kube-api-access-xtrcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.556612 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "884cd94d-843d-4872-b61a-caa5a9f39d3c" (UID: "884cd94d-843d-4872-b61a-caa5a9f39d3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.563754 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-config-data" (OuterVolumeSpecName: "config-data") pod "884cd94d-843d-4872-b61a-caa5a9f39d3c" (UID: "884cd94d-843d-4872-b61a-caa5a9f39d3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.625278 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.625659 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.625672 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtrcw\" (UniqueName: \"kubernetes.io/projected/884cd94d-843d-4872-b61a-caa5a9f39d3c-kube-api-access-xtrcw\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.625683 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/884cd94d-843d-4872-b61a-caa5a9f39d3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.794097 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 08:21:16 crc kubenswrapper[4917]: W0318 08:21:16.801902 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b871c34_192d_4fe7_84ca_d2728b5f3900.slice/crio-8e87d7dfd210988cb5e80527460bb96a736e2242d9b6629d3049f8878d9b9692 WatchSource:0}: Error finding container 8e87d7dfd210988cb5e80527460bb96a736e2242d9b6629d3049f8878d9b9692: Status 404 returned error can't find the container with id 8e87d7dfd210988cb5e80527460bb96a736e2242d9b6629d3049f8878d9b9692 Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.951807 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-r4467" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.952071 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-r4467" event={"ID":"884cd94d-843d-4872-b61a-caa5a9f39d3c","Type":"ContainerDied","Data":"0cfd3c21a309fb81383e8b8a4127b5d83ff72a33242a65380dc3d12af8c7e37c"} Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.952185 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cfd3c21a309fb81383e8b8a4127b5d83ff72a33242a65380dc3d12af8c7e37c" Mar 18 08:21:16 crc kubenswrapper[4917]: I0318 08:21:16.953812 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4b871c34-192d-4fe7-84ca-d2728b5f3900","Type":"ContainerStarted","Data":"8e87d7dfd210988cb5e80527460bb96a736e2242d9b6629d3049f8878d9b9692"} Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.026241 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-d2b6-account-create-update-8ntxl"] Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.037003 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-d2b6-account-create-update-8ntxl"] Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.166821 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.167384 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1d96c1cb-dd21-4d62-9256-4c21bcbfca32" containerName="nova-scheduler-scheduler" containerID="cri-o://50ecefe888f53dc6dc9088cae95be5067f50b75050b8cd982beb97f14d9d1aba" gracePeriod=30 Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.180061 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.180418 4917 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerName="nova-api-api" containerID="cri-o://7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687" gracePeriod=30 Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.180646 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerName="nova-api-log" containerID="cri-o://0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713" gracePeriod=30 Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.708922 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.785829 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45aa8dcb-e441-4437-a0ec-948efd209a2d" path="/var/lib/kubelet/pods/45aa8dcb-e441-4437-a0ec-948efd209a2d/volumes" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.786915 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c" path="/var/lib/kubelet/pods/fc36f0db-bfd4-4d6a-a5b0-13adcf72e33c/volumes" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.854021 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2048ba1-f4fc-49b6-93e6-69782915e13f-logs\") pod \"d2048ba1-f4fc-49b6-93e6-69782915e13f\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.854662 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2048ba1-f4fc-49b6-93e6-69782915e13f-logs" (OuterVolumeSpecName: "logs") pod "d2048ba1-f4fc-49b6-93e6-69782915e13f" (UID: "d2048ba1-f4fc-49b6-93e6-69782915e13f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.854811 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfcqt\" (UniqueName: \"kubernetes.io/projected/d2048ba1-f4fc-49b6-93e6-69782915e13f-kube-api-access-wfcqt\") pod \"d2048ba1-f4fc-49b6-93e6-69782915e13f\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.854963 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-config-data\") pod \"d2048ba1-f4fc-49b6-93e6-69782915e13f\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.855014 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-combined-ca-bundle\") pod \"d2048ba1-f4fc-49b6-93e6-69782915e13f\" (UID: \"d2048ba1-f4fc-49b6-93e6-69782915e13f\") " Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.856178 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2048ba1-f4fc-49b6-93e6-69782915e13f-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.861048 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2048ba1-f4fc-49b6-93e6-69782915e13f-kube-api-access-wfcqt" (OuterVolumeSpecName: "kube-api-access-wfcqt") pod "d2048ba1-f4fc-49b6-93e6-69782915e13f" (UID: "d2048ba1-f4fc-49b6-93e6-69782915e13f"). InnerVolumeSpecName "kube-api-access-wfcqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.880482 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2048ba1-f4fc-49b6-93e6-69782915e13f" (UID: "d2048ba1-f4fc-49b6-93e6-69782915e13f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.880900 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-config-data" (OuterVolumeSpecName: "config-data") pod "d2048ba1-f4fc-49b6-93e6-69782915e13f" (UID: "d2048ba1-f4fc-49b6-93e6-69782915e13f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.961472 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.961514 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfcqt\" (UniqueName: \"kubernetes.io/projected/d2048ba1-f4fc-49b6-93e6-69782915e13f-kube-api-access-wfcqt\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.961529 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2048ba1-f4fc-49b6-93e6-69782915e13f-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.966054 4917 generic.go:334] "Generic (PLEG): container finished" podID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerID="7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687" 
exitCode=0 Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.966084 4917 generic.go:334] "Generic (PLEG): container finished" podID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerID="0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713" exitCode=143 Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.966114 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.966143 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2048ba1-f4fc-49b6-93e6-69782915e13f","Type":"ContainerDied","Data":"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687"} Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.966172 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2048ba1-f4fc-49b6-93e6-69782915e13f","Type":"ContainerDied","Data":"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713"} Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.966186 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d2048ba1-f4fc-49b6-93e6-69782915e13f","Type":"ContainerDied","Data":"e7e5adca22baa890482f4347b3c37a6839d8c4088addebd3b3f2467fe0336aad"} Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.966201 4917 scope.go:117] "RemoveContainer" containerID="7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687" Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.970994 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4b871c34-192d-4fe7-84ca-d2728b5f3900","Type":"ContainerStarted","Data":"d6958e39261689a5e8457d30ee0c9a46f632a159b5914fabb8a8b980ae467217"} Mar 18 08:21:17 crc kubenswrapper[4917]: I0318 08:21:17.971205 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:18 
crc kubenswrapper[4917]: I0318 08:21:18.002969 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.002951627 podStartE2EDuration="3.002951627s" podCreationTimestamp="2026-03-18 08:21:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:18.001268866 +0000 UTC m=+5662.942423590" watchObservedRunningTime="2026-03-18 08:21:18.002951627 +0000 UTC m=+5662.944106371" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.004173 4917 scope.go:117] "RemoveContainer" containerID="0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.027159 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.043076 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.054700 4917 scope.go:117] "RemoveContainer" containerID="7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687" Mar 18 08:21:18 crc kubenswrapper[4917]: E0318 08:21:18.055223 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687\": container with ID starting with 7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687 not found: ID does not exist" containerID="7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.055261 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687"} err="failed to get container status \"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687\": 
rpc error: code = NotFound desc = could not find container \"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687\": container with ID starting with 7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687 not found: ID does not exist" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.055288 4917 scope.go:117] "RemoveContainer" containerID="0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713" Mar 18 08:21:18 crc kubenswrapper[4917]: E0318 08:21:18.055703 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713\": container with ID starting with 0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713 not found: ID does not exist" containerID="0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.055736 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713"} err="failed to get container status \"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713\": rpc error: code = NotFound desc = could not find container \"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713\": container with ID starting with 0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713 not found: ID does not exist" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.055758 4917 scope.go:117] "RemoveContainer" containerID="7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.056012 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687"} err="failed to get container status 
\"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687\": rpc error: code = NotFound desc = could not find container \"7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687\": container with ID starting with 7dffd1e6a6f5d0ee8d14f08a0054061c82503874f83bb5e04d1581bb1ca6b687 not found: ID does not exist" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.056038 4917 scope.go:117] "RemoveContainer" containerID="0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.056256 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713"} err="failed to get container status \"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713\": rpc error: code = NotFound desc = could not find container \"0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713\": container with ID starting with 0f09ae72de65e2ee36bfd8daf1936448d7f1a3262ce0bd1095a70e959876b713 not found: ID does not exist" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.056808 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:18 crc kubenswrapper[4917]: E0318 08:21:18.057309 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerName="nova-api-log" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.057335 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerName="nova-api-log" Mar 18 08:21:18 crc kubenswrapper[4917]: E0318 08:21:18.057370 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="884cd94d-843d-4872-b61a-caa5a9f39d3c" containerName="nova-manage" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.057379 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="884cd94d-843d-4872-b61a-caa5a9f39d3c" 
containerName="nova-manage" Mar 18 08:21:18 crc kubenswrapper[4917]: E0318 08:21:18.057394 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerName="nova-api-api" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.057401 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerName="nova-api-api" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.057649 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerName="nova-api-log" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.057682 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" containerName="nova-api-api" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.057702 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="884cd94d-843d-4872-b61a-caa5a9f39d3c" containerName="nova-manage" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.058916 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.065127 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbgm\" (UniqueName: \"kubernetes.io/projected/38288673-b4e2-4816-8d0b-70b06458f8b9-kube-api-access-wsbgm\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.065223 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-config-data\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.065325 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38288673-b4e2-4816-8d0b-70b06458f8b9-logs\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.065451 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.066507 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.070884 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.167483 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.167764 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsbgm\" (UniqueName: \"kubernetes.io/projected/38288673-b4e2-4816-8d0b-70b06458f8b9-kube-api-access-wsbgm\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.167805 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-config-data\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.167880 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38288673-b4e2-4816-8d0b-70b06458f8b9-logs\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.168816 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38288673-b4e2-4816-8d0b-70b06458f8b9-logs\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.172659 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.172717 
4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-config-data\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.185059 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsbgm\" (UniqueName: \"kubernetes.io/projected/38288673-b4e2-4816-8d0b-70b06458f8b9-kube-api-access-wsbgm\") pod \"nova-api-0\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.382981 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.534552 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.606729 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-656f87c797-qg9dj"] Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.607001 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" podUID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" containerName="dnsmasq-dns" containerID="cri-o://d27d4f5ff6fdcc566e05d08dd48d289282686789257ceab5375bb4e4e4b0f9bc" gracePeriod=10 Mar 18 08:21:18 crc kubenswrapper[4917]: I0318 08:21:18.877069 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.005753 4917 generic.go:334] "Generic (PLEG): container finished" podID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" containerID="d27d4f5ff6fdcc566e05d08dd48d289282686789257ceab5375bb4e4e4b0f9bc" exitCode=0 Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.006049 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" event={"ID":"55cdbe8e-2d42-49f7-aedc-e6874b0d7971","Type":"ContainerDied","Data":"d27d4f5ff6fdcc566e05d08dd48d289282686789257ceab5375bb4e4e4b0f9bc"} Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.014072 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38288673-b4e2-4816-8d0b-70b06458f8b9","Type":"ContainerStarted","Data":"4899d928bafa09032ca5a5876581baebc6f069b4f453826b7e5e10f2c1573b8e"} Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.066949 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.184699 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-dns-svc\") pod \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.184730 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-nb\") pod \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.184750 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-sb\") pod \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.184836 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-config\") pod 
\"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.184874 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht26d\" (UniqueName: \"kubernetes.io/projected/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-kube-api-access-ht26d\") pod \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\" (UID: \"55cdbe8e-2d42-49f7-aedc-e6874b0d7971\") " Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.190501 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-kube-api-access-ht26d" (OuterVolumeSpecName: "kube-api-access-ht26d") pod "55cdbe8e-2d42-49f7-aedc-e6874b0d7971" (UID: "55cdbe8e-2d42-49f7-aedc-e6874b0d7971"). InnerVolumeSpecName "kube-api-access-ht26d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.236239 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-config" (OuterVolumeSpecName: "config") pod "55cdbe8e-2d42-49f7-aedc-e6874b0d7971" (UID: "55cdbe8e-2d42-49f7-aedc-e6874b0d7971"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.236864 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "55cdbe8e-2d42-49f7-aedc-e6874b0d7971" (UID: "55cdbe8e-2d42-49f7-aedc-e6874b0d7971"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.237117 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "55cdbe8e-2d42-49f7-aedc-e6874b0d7971" (UID: "55cdbe8e-2d42-49f7-aedc-e6874b0d7971"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.240095 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "55cdbe8e-2d42-49f7-aedc-e6874b0d7971" (UID: "55cdbe8e-2d42-49f7-aedc-e6874b0d7971"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.288265 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.288692 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.288752 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.288824 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:19 crc 
kubenswrapper[4917]: I0318 08:21:19.288884 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht26d\" (UniqueName: \"kubernetes.io/projected/55cdbe8e-2d42-49f7-aedc-e6874b0d7971-kube-api-access-ht26d\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:19 crc kubenswrapper[4917]: I0318 08:21:19.791758 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2048ba1-f4fc-49b6-93e6-69782915e13f" path="/var/lib/kubelet/pods/d2048ba1-f4fc-49b6-93e6-69782915e13f/volumes" Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 08:21:20.041207 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" event={"ID":"55cdbe8e-2d42-49f7-aedc-e6874b0d7971","Type":"ContainerDied","Data":"fc0738fbcb1cc1d6b8352b4c0553b59a1d9a961b814827504ba4236d31013ddb"} Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 08:21:20.041272 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-656f87c797-qg9dj" Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 08:21:20.041313 4917 scope.go:117] "RemoveContainer" containerID="d27d4f5ff6fdcc566e05d08dd48d289282686789257ceab5375bb4e4e4b0f9bc" Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 08:21:20.049049 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38288673-b4e2-4816-8d0b-70b06458f8b9","Type":"ContainerStarted","Data":"7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc"} Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 08:21:20.049135 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38288673-b4e2-4816-8d0b-70b06458f8b9","Type":"ContainerStarted","Data":"f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031"} Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 08:21:20.083394 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-656f87c797-qg9dj"] Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 
08:21:20.086786 4917 scope.go:117] "RemoveContainer" containerID="b73ed3f71dbcf66e2fa8472bccd53e3ee537e769418029d3c8095a2d76a23412" Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 08:21:20.103866 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-656f87c797-qg9dj"] Mar 18 08:21:20 crc kubenswrapper[4917]: I0318 08:21:20.123948 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.123925895 podStartE2EDuration="2.123925895s" podCreationTimestamp="2026-03-18 08:21:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:20.088825734 +0000 UTC m=+5665.029980468" watchObservedRunningTime="2026-03-18 08:21:20.123925895 +0000 UTC m=+5665.065080619" Mar 18 08:21:21 crc kubenswrapper[4917]: I0318 08:21:21.793859 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" path="/var/lib/kubelet/pods/55cdbe8e-2d42-49f7-aedc-e6874b0d7971/volumes" Mar 18 08:21:26 crc kubenswrapper[4917]: I0318 08:21:26.377083 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 08:21:26 crc kubenswrapper[4917]: I0318 08:21:26.507009 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 08:21:26 crc kubenswrapper[4917]: I0318 08:21:26.507071 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 08:21:26 crc kubenswrapper[4917]: I0318 08:21:26.773093 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:21:26 crc kubenswrapper[4917]: E0318 08:21:26.773493 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:21:27 crc kubenswrapper[4917]: I0318 08:21:27.057892 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-zjqck"] Mar 18 08:21:27 crc kubenswrapper[4917]: I0318 08:21:27.074940 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-zjqck"] Mar 18 08:21:27 crc kubenswrapper[4917]: I0318 08:21:27.793250 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25b11861-9ee4-4b04-a9b4-8736aeac1e15" path="/var/lib/kubelet/pods/25b11861-9ee4-4b04-a9b4-8736aeac1e15/volumes" Mar 18 08:21:28 crc kubenswrapper[4917]: I0318 08:21:28.383340 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 08:21:28 crc kubenswrapper[4917]: I0318 08:21:28.383418 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 08:21:29 crc kubenswrapper[4917]: I0318 08:21:29.426422 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:29 crc kubenswrapper[4917]: I0318 08:21:29.466946 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:36 crc kubenswrapper[4917]: I0318 08:21:36.383911 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 08:21:36 crc kubenswrapper[4917]: I0318 08:21:36.384490 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 08:21:39 crc kubenswrapper[4917]: I0318 08:21:39.468801 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:39 crc kubenswrapper[4917]: I0318 08:21:39.469040 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:40 crc kubenswrapper[4917]: I0318 08:21:40.055239 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-5fnxk"] Mar 18 08:21:40 crc kubenswrapper[4917]: I0318 08:21:40.074833 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-5fnxk"] Mar 18 08:21:40 crc kubenswrapper[4917]: I0318 08:21:40.773364 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:21:41 crc kubenswrapper[4917]: I0318 08:21:41.318540 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"2d945d8f3fb288f4abf13f7d56da1feb3070f1b5588474b04c510533002bdbef"} Mar 18 08:21:41 crc kubenswrapper[4917]: I0318 08:21:41.793618 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f189520b-8456-4b79-ad3d-72c93b09a4d1" 
path="/var/lib/kubelet/pods/f189520b-8456-4b79-ad3d-72c93b09a4d1/volumes" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.286001 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.352737 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.363238 4917 generic.go:334] "Generic (PLEG): container finished" podID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerID="b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873" exitCode=137 Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.363309 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01a395c0-4c37-46e4-9d1a-cad0fe7b4885","Type":"ContainerDied","Data":"b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873"} Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.363338 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"01a395c0-4c37-46e4-9d1a-cad0fe7b4885","Type":"ContainerDied","Data":"1364c6e5b139c413bd8e97ca54963ec11c5e60d1a499e6ce031cc77524764c42"} Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.363356 4917 scope.go:117] "RemoveContainer" containerID="b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.363489 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.370980 4917 generic.go:334] "Generic (PLEG): container finished" podID="8b02e527-0708-4e8f-869b-49991c87958c" containerID="a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b" exitCode=137 Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.371059 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b02e527-0708-4e8f-869b-49991c87958c","Type":"ContainerDied","Data":"a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b"} Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.371117 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8b02e527-0708-4e8f-869b-49991c87958c","Type":"ContainerDied","Data":"2e71d8ac323595c3e4a5270daa09d87c97277f97534062c1d92b5fb52a5c3ae7"} Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.371401 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.394498 4917 scope.go:117] "RemoveContainer" containerID="ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.415535 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-config-data\") pod \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.415714 4917 scope.go:117] "RemoveContainer" containerID="b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.415727 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-config-data\") pod \"8b02e527-0708-4e8f-869b-49991c87958c\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.415889 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-combined-ca-bundle\") pod \"8b02e527-0708-4e8f-869b-49991c87958c\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.415933 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-logs\") pod \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.415967 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq9sm\" (UniqueName: 
\"kubernetes.io/projected/8b02e527-0708-4e8f-869b-49991c87958c-kube-api-access-gq9sm\") pod \"8b02e527-0708-4e8f-869b-49991c87958c\" (UID: \"8b02e527-0708-4e8f-869b-49991c87958c\") " Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.416013 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-combined-ca-bundle\") pod \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.416086 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98cdx\" (UniqueName: \"kubernetes.io/projected/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-kube-api-access-98cdx\") pod \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\" (UID: \"01a395c0-4c37-46e4-9d1a-cad0fe7b4885\") " Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.416376 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-logs" (OuterVolumeSpecName: "logs") pod "01a395c0-4c37-46e4-9d1a-cad0fe7b4885" (UID: "01a395c0-4c37-46e4-9d1a-cad0fe7b4885"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.416648 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:43 crc kubenswrapper[4917]: E0318 08:21:43.416718 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873\": container with ID starting with b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873 not found: ID does not exist" containerID="b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.416773 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873"} err="failed to get container status \"b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873\": rpc error: code = NotFound desc = could not find container \"b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873\": container with ID starting with b2aedf98888a3919edaa36b492659325a899ea561dea6a8d48bb688c562d2873 not found: ID does not exist" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.416795 4917 scope.go:117] "RemoveContainer" containerID="ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d" Mar 18 08:21:43 crc kubenswrapper[4917]: E0318 08:21:43.417217 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d\": container with ID starting with ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d not found: ID does not exist" 
containerID="ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.417251 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d"} err="failed to get container status \"ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d\": rpc error: code = NotFound desc = could not find container \"ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d\": container with ID starting with ed6292ccf734b8cebcffc2d5b8280cfe60d4321b0e1845db949bd2c5105be02d not found: ID does not exist" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.417275 4917 scope.go:117] "RemoveContainer" containerID="a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.421086 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-kube-api-access-98cdx" (OuterVolumeSpecName: "kube-api-access-98cdx") pod "01a395c0-4c37-46e4-9d1a-cad0fe7b4885" (UID: "01a395c0-4c37-46e4-9d1a-cad0fe7b4885"). InnerVolumeSpecName "kube-api-access-98cdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.421163 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b02e527-0708-4e8f-869b-49991c87958c-kube-api-access-gq9sm" (OuterVolumeSpecName: "kube-api-access-gq9sm") pod "8b02e527-0708-4e8f-869b-49991c87958c" (UID: "8b02e527-0708-4e8f-869b-49991c87958c"). InnerVolumeSpecName "kube-api-access-gq9sm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.435501 4917 scope.go:117] "RemoveContainer" containerID="a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b" Mar 18 08:21:43 crc kubenswrapper[4917]: E0318 08:21:43.436038 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b\": container with ID starting with a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b not found: ID does not exist" containerID="a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.436088 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b"} err="failed to get container status \"a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b\": rpc error: code = NotFound desc = could not find container \"a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b\": container with ID starting with a68abf60b36f301486c7d313aadec9be7a508400c67771d5111a1718dd7f0d6b not found: ID does not exist" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.440148 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b02e527-0708-4e8f-869b-49991c87958c" (UID: "8b02e527-0708-4e8f-869b-49991c87958c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.445994 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-config-data" (OuterVolumeSpecName: "config-data") pod "01a395c0-4c37-46e4-9d1a-cad0fe7b4885" (UID: "01a395c0-4c37-46e4-9d1a-cad0fe7b4885"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.446573 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-config-data" (OuterVolumeSpecName: "config-data") pod "8b02e527-0708-4e8f-869b-49991c87958c" (UID: "8b02e527-0708-4e8f-869b-49991c87958c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.447081 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01a395c0-4c37-46e4-9d1a-cad0fe7b4885" (UID: "01a395c0-4c37-46e4-9d1a-cad0fe7b4885"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.518373 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98cdx\" (UniqueName: \"kubernetes.io/projected/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-kube-api-access-98cdx\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.518427 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.518440 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.518450 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b02e527-0708-4e8f-869b-49991c87958c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.518460 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq9sm\" (UniqueName: \"kubernetes.io/projected/8b02e527-0708-4e8f-869b-49991c87958c-kube-api-access-gq9sm\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.518470 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a395c0-4c37-46e4-9d1a-cad0fe7b4885-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.722956 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.744251 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] 
Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.762096 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.785215 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b02e527-0708-4e8f-869b-49991c87958c" path="/var/lib/kubelet/pods/8b02e527-0708-4e8f-869b-49991c87958c/volumes" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.785781 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.785810 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 08:21:43 crc kubenswrapper[4917]: E0318 08:21:43.786043 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerName="nova-metadata-log" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786057 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerName="nova-metadata-log" Mar 18 08:21:43 crc kubenswrapper[4917]: E0318 08:21:43.786077 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" containerName="init" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786082 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" containerName="init" Mar 18 08:21:43 crc kubenswrapper[4917]: E0318 08:21:43.786091 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" containerName="dnsmasq-dns" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786096 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" containerName="dnsmasq-dns" Mar 18 08:21:43 crc kubenswrapper[4917]: E0318 08:21:43.786121 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerName="nova-metadata-metadata" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786127 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerName="nova-metadata-metadata" Mar 18 08:21:43 crc kubenswrapper[4917]: E0318 08:21:43.786136 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b02e527-0708-4e8f-869b-49991c87958c" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786141 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b02e527-0708-4e8f-869b-49991c87958c" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786288 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerName="nova-metadata-log" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786308 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="55cdbe8e-2d42-49f7-aedc-e6874b0d7971" containerName="dnsmasq-dns" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786319 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b02e527-0708-4e8f-869b-49991c87958c" containerName="nova-cell1-novncproxy-novncproxy" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786328 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" containerName="nova-metadata-metadata" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.786934 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.788807 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.789864 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.790178 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.798533 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.799949 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.800027 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.802171 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.802358 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.864919 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.932928 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.932969 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.933000 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.933027 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-logs\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.933059 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-config-data\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.933084 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.933128 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.933151 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.933166 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5m8n\" (UniqueName: 
\"kubernetes.io/projected/944ee733-175d-44d6-bd03-1f55c6282343-kube-api-access-j5m8n\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:43 crc kubenswrapper[4917]: I0318 08:21:43.933196 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhkl\" (UniqueName: \"kubernetes.io/projected/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-kube-api-access-kdhkl\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.034983 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035038 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035064 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5m8n\" (UniqueName: \"kubernetes.io/projected/944ee733-175d-44d6-bd03-1f55c6282343-kube-api-access-j5m8n\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035107 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdhkl\" (UniqueName: 
\"kubernetes.io/projected/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-kube-api-access-kdhkl\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035196 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035223 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035246 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035271 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-logs\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035320 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-config-data\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " 
pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.035358 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.039781 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-logs\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.041638 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.042250 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.042521 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.042769 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.043106 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/944ee733-175d-44d6-bd03-1f55c6282343-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.046096 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.047933 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-config-data\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.055361 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdhkl\" (UniqueName: \"kubernetes.io/projected/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-kube-api-access-kdhkl\") pod \"nova-metadata-0\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.059760 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5m8n\" (UniqueName: \"kubernetes.io/projected/944ee733-175d-44d6-bd03-1f55c6282343-kube-api-access-j5m8n\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"944ee733-175d-44d6-bd03-1f55c6282343\") " pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.151111 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.168887 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.466891 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 18 08:21:44 crc kubenswrapper[4917]: W0318 08:21:44.470447 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod944ee733_175d_44d6_bd03_1f55c6282343.slice/crio-0673fc7bc07efbec7fcd9aaca8995ed0ad04e6c66d3d0ed3ed9f7d359250ef6a WatchSource:0}: Error finding container 0673fc7bc07efbec7fcd9aaca8995ed0ad04e6c66d3d0ed3ed9f7d359250ef6a: Status 404 returned error can't find the container with id 0673fc7bc07efbec7fcd9aaca8995ed0ad04e6c66d3d0ed3ed9f7d359250ef6a Mar 18 08:21:44 crc kubenswrapper[4917]: W0318 08:21:44.522353 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdf2a778_c56d_4648_bb9d_3749abcc5ab4.slice/crio-5c4177eedde97149d90228a19c483e668220398368a11ce7665e0c485b14cbff WatchSource:0}: Error finding container 5c4177eedde97149d90228a19c483e668220398368a11ce7665e0c485b14cbff: Status 404 returned error can't find the container with id 5c4177eedde97149d90228a19c483e668220398368a11ce7665e0c485b14cbff Mar 18 08:21:44 crc kubenswrapper[4917]: I0318 08:21:44.524355 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:21:45 crc kubenswrapper[4917]: I0318 08:21:45.399256 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"944ee733-175d-44d6-bd03-1f55c6282343","Type":"ContainerStarted","Data":"9140490038b541d97e47cc9dcc2480eff3a55d243d46148550786989c0eb25c4"} Mar 18 08:21:45 crc kubenswrapper[4917]: I0318 08:21:45.399863 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"944ee733-175d-44d6-bd03-1f55c6282343","Type":"ContainerStarted","Data":"0673fc7bc07efbec7fcd9aaca8995ed0ad04e6c66d3d0ed3ed9f7d359250ef6a"} Mar 18 08:21:45 crc kubenswrapper[4917]: I0318 08:21:45.403951 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bdf2a778-c56d-4648-bb9d-3749abcc5ab4","Type":"ContainerStarted","Data":"a26e01fa5bef84357e1705e752a1986eb3c10aa325805749bd6763853cb7e6f3"} Mar 18 08:21:45 crc kubenswrapper[4917]: I0318 08:21:45.404023 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bdf2a778-c56d-4648-bb9d-3749abcc5ab4","Type":"ContainerStarted","Data":"7b08ad1249c59b5a7e65fc8ea1aec2131606293b33d09167c0e08b56af3b99a5"} Mar 18 08:21:45 crc kubenswrapper[4917]: I0318 08:21:45.404044 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bdf2a778-c56d-4648-bb9d-3749abcc5ab4","Type":"ContainerStarted","Data":"5c4177eedde97149d90228a19c483e668220398368a11ce7665e0c485b14cbff"} Mar 18 08:21:45 crc kubenswrapper[4917]: I0318 08:21:45.458434 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.458397016 podStartE2EDuration="2.458397016s" podCreationTimestamp="2026-03-18 08:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:45.426984933 +0000 UTC m=+5690.368139717" watchObservedRunningTime="2026-03-18 08:21:45.458397016 +0000 UTC m=+5690.399551800" Mar 18 08:21:45 crc kubenswrapper[4917]: I0318 08:21:45.480708 
4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.4806878660000002 podStartE2EDuration="2.480687866s" podCreationTimestamp="2026-03-18 08:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:45.449192292 +0000 UTC m=+5690.390347006" watchObservedRunningTime="2026-03-18 08:21:45.480687866 +0000 UTC m=+5690.421842590" Mar 18 08:21:45 crc kubenswrapper[4917]: I0318 08:21:45.784564 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a395c0-4c37-46e4-9d1a-cad0fe7b4885" path="/var/lib/kubelet/pods/01a395c0-4c37-46e4-9d1a-cad0fe7b4885/volumes" Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.447698 4917 generic.go:334] "Generic (PLEG): container finished" podID="1d96c1cb-dd21-4d62-9256-4c21bcbfca32" containerID="50ecefe888f53dc6dc9088cae95be5067f50b75050b8cd982beb97f14d9d1aba" exitCode=137 Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.447798 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d96c1cb-dd21-4d62-9256-4c21bcbfca32","Type":"ContainerDied","Data":"50ecefe888f53dc6dc9088cae95be5067f50b75050b8cd982beb97f14d9d1aba"} Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.625141 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.746964 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x66fz\" (UniqueName: \"kubernetes.io/projected/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-kube-api-access-x66fz\") pod \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.747027 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-config-data\") pod \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.747074 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-combined-ca-bundle\") pod \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\" (UID: \"1d96c1cb-dd21-4d62-9256-4c21bcbfca32\") " Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.754568 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-kube-api-access-x66fz" (OuterVolumeSpecName: "kube-api-access-x66fz") pod "1d96c1cb-dd21-4d62-9256-4c21bcbfca32" (UID: "1d96c1cb-dd21-4d62-9256-4c21bcbfca32"). InnerVolumeSpecName "kube-api-access-x66fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.784817 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-config-data" (OuterVolumeSpecName: "config-data") pod "1d96c1cb-dd21-4d62-9256-4c21bcbfca32" (UID: "1d96c1cb-dd21-4d62-9256-4c21bcbfca32"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.803827 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d96c1cb-dd21-4d62-9256-4c21bcbfca32" (UID: "1d96c1cb-dd21-4d62-9256-4c21bcbfca32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.849346 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x66fz\" (UniqueName: \"kubernetes.io/projected/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-kube-api-access-x66fz\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.849378 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:47 crc kubenswrapper[4917]: I0318 08:21:47.849388 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d96c1cb-dd21-4d62-9256-4c21bcbfca32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.460104 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1d96c1cb-dd21-4d62-9256-4c21bcbfca32","Type":"ContainerDied","Data":"fd606d1e2745ed8b28897c818bbc55eda84a77e75204193638c329465ca8b4e6"} Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.460182 4917 scope.go:117] "RemoveContainer" containerID="50ecefe888f53dc6dc9088cae95be5067f50b75050b8cd982beb97f14d9d1aba" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.460390 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.505963 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.524476 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.540364 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:48 crc kubenswrapper[4917]: E0318 08:21:48.541043 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d96c1cb-dd21-4d62-9256-4c21bcbfca32" containerName="nova-scheduler-scheduler" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.541075 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d96c1cb-dd21-4d62-9256-4c21bcbfca32" containerName="nova-scheduler-scheduler" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.541411 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d96c1cb-dd21-4d62-9256-4c21bcbfca32" containerName="nova-scheduler-scheduler" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.542510 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.548068 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.551771 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.669097 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f6rq\" (UniqueName: \"kubernetes.io/projected/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-kube-api-access-4f6rq\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.669390 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.669486 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.771436 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f6rq\" (UniqueName: \"kubernetes.io/projected/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-kube-api-access-4f6rq\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.771527 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.771672 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.789554 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.789829 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-config-data\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.791709 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f6rq\" (UniqueName: \"kubernetes.io/projected/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-kube-api-access-4f6rq\") pod \"nova-scheduler-0\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " pod="openstack/nova-scheduler-0" Mar 18 08:21:48 crc kubenswrapper[4917]: I0318 08:21:48.876044 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:21:49 crc kubenswrapper[4917]: I0318 08:21:49.151910 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:49 crc kubenswrapper[4917]: I0318 08:21:49.339925 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:21:49 crc kubenswrapper[4917]: I0318 08:21:49.466765 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:49 crc kubenswrapper[4917]: I0318 08:21:49.466794 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:49 crc kubenswrapper[4917]: I0318 08:21:49.492127 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd4b1f8-f994-49aa-906e-ea6b53b9af40","Type":"ContainerStarted","Data":"32d56e513dc35fd23ed93a76140b51c64c95f05f9a19741ea775853643541530"} Mar 18 08:21:49 crc kubenswrapper[4917]: I0318 08:21:49.795408 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d96c1cb-dd21-4d62-9256-4c21bcbfca32" path="/var/lib/kubelet/pods/1d96c1cb-dd21-4d62-9256-4c21bcbfca32/volumes" Mar 18 08:21:50 crc kubenswrapper[4917]: I0318 08:21:50.507454 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd4b1f8-f994-49aa-906e-ea6b53b9af40","Type":"ContainerStarted","Data":"8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b"} Mar 18 08:21:50 crc 
kubenswrapper[4917]: I0318 08:21:50.537474 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.537446604 podStartE2EDuration="2.537446604s" podCreationTimestamp="2026-03-18 08:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:50.529987134 +0000 UTC m=+5695.471141858" watchObservedRunningTime="2026-03-18 08:21:50.537446604 +0000 UTC m=+5695.478601348" Mar 18 08:21:53 crc kubenswrapper[4917]: I0318 08:21:53.876618 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.152795 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.170083 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.170144 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.170209 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.582526 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.753142 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5r96w"] Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.758365 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.760486 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.760720 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.768963 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5r96w"] Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.920010 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.920070 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-scripts\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.920102 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-config-data\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:54 crc kubenswrapper[4917]: I0318 08:21:54.920177 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hkjx\" (UniqueName: 
\"kubernetes.io/projected/7150ade9-aa2b-4bed-89b7-eee73cc12aec-kube-api-access-5hkjx\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.021576 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hkjx\" (UniqueName: \"kubernetes.io/projected/7150ade9-aa2b-4bed-89b7-eee73cc12aec-kube-api-access-5hkjx\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.021712 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.021746 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-scripts\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.021777 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-config-data\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.027527 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-scripts\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.028227 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.035490 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-config-data\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.039346 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hkjx\" (UniqueName: \"kubernetes.io/projected/7150ade9-aa2b-4bed-89b7-eee73cc12aec-kube-api-access-5hkjx\") pod \"nova-cell1-cell-mapping-5r96w\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.080926 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.184766 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.125:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.185339 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.125:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.546680 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5r96w"] Mar 18 08:21:55 crc kubenswrapper[4917]: I0318 08:21:55.581732 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5r96w" event={"ID":"7150ade9-aa2b-4bed-89b7-eee73cc12aec","Type":"ContainerStarted","Data":"f084d1507ceff1ece203448463912c3b797817cadcc19e7b2d427927eedc2b59"} Mar 18 08:21:56 crc kubenswrapper[4917]: I0318 08:21:56.596057 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5r96w" event={"ID":"7150ade9-aa2b-4bed-89b7-eee73cc12aec","Type":"ContainerStarted","Data":"b02c5707f7015d86527767982899b91f560279436ab036838438cb5612a49bab"} Mar 18 08:21:56 crc kubenswrapper[4917]: I0318 08:21:56.619225 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5r96w" podStartSLOduration=2.619206932 podStartE2EDuration="2.619206932s" podCreationTimestamp="2026-03-18 08:21:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:21:56.609719381 +0000 UTC m=+5701.550874095" watchObservedRunningTime="2026-03-18 08:21:56.619206932 +0000 UTC m=+5701.560361656" Mar 18 08:21:58 crc kubenswrapper[4917]: I0318 08:21:58.876660 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 08:21:58 crc kubenswrapper[4917]: I0318 08:21:58.920470 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 08:21:59 crc kubenswrapper[4917]: I0318 08:21:59.466397 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:59 crc kubenswrapper[4917]: I0318 08:21:59.466414 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.123:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:21:59 crc kubenswrapper[4917]: I0318 08:21:59.682712 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.139049 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563702-ff84h"] Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.140976 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563702-ff84h" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.142985 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.143385 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.143497 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.159163 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563702-ff84h"] Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.161818 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28mm\" (UniqueName: \"kubernetes.io/projected/9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee-kube-api-access-f28mm\") pod \"auto-csr-approver-29563702-ff84h\" (UID: \"9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee\") " pod="openshift-infra/auto-csr-approver-29563702-ff84h" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.263309 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28mm\" (UniqueName: \"kubernetes.io/projected/9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee-kube-api-access-f28mm\") pod \"auto-csr-approver-29563702-ff84h\" (UID: \"9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee\") " pod="openshift-infra/auto-csr-approver-29563702-ff84h" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.281878 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28mm\" (UniqueName: \"kubernetes.io/projected/9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee-kube-api-access-f28mm\") pod \"auto-csr-approver-29563702-ff84h\" (UID: \"9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee\") " 
pod="openshift-infra/auto-csr-approver-29563702-ff84h" Mar 18 08:22:00 crc kubenswrapper[4917]: E0318 08:22:00.405596 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7150ade9_aa2b_4bed_89b7_eee73cc12aec.slice/crio-b02c5707f7015d86527767982899b91f560279436ab036838438cb5612a49bab.scope\": RecentStats: unable to find data in memory cache]" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.470501 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563702-ff84h" Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.655484 4917 generic.go:334] "Generic (PLEG): container finished" podID="7150ade9-aa2b-4bed-89b7-eee73cc12aec" containerID="b02c5707f7015d86527767982899b91f560279436ab036838438cb5612a49bab" exitCode=0 Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.655607 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5r96w" event={"ID":"7150ade9-aa2b-4bed-89b7-eee73cc12aec","Type":"ContainerDied","Data":"b02c5707f7015d86527767982899b91f560279436ab036838438cb5612a49bab"} Mar 18 08:22:00 crc kubenswrapper[4917]: I0318 08:22:00.949192 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563702-ff84h"] Mar 18 08:22:00 crc kubenswrapper[4917]: W0318 08:22:00.955016 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d1371e1_80c1_4eb2_b37a_6f59d5cb56ee.slice/crio-6f24f0a9c7ed2bdf31a3e04fda42f4b4761e0971495ddae47cd85e9983fc4920 WatchSource:0}: Error finding container 6f24f0a9c7ed2bdf31a3e04fda42f4b4761e0971495ddae47cd85e9983fc4920: Status 404 returned error can't find the container with id 6f24f0a9c7ed2bdf31a3e04fda42f4b4761e0971495ddae47cd85e9983fc4920 Mar 18 08:22:01 crc kubenswrapper[4917]: I0318 
08:22:01.679706 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563702-ff84h" event={"ID":"9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee","Type":"ContainerStarted","Data":"6f24f0a9c7ed2bdf31a3e04fda42f4b4761e0971495ddae47cd85e9983fc4920"} Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.092050 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.100277 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-scripts\") pod \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.100447 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-config-data\") pod \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.100495 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-combined-ca-bundle\") pod \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.100553 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hkjx\" (UniqueName: \"kubernetes.io/projected/7150ade9-aa2b-4bed-89b7-eee73cc12aec-kube-api-access-5hkjx\") pod \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\" (UID: \"7150ade9-aa2b-4bed-89b7-eee73cc12aec\") " Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.111986 4917 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-scripts" (OuterVolumeSpecName: "scripts") pod "7150ade9-aa2b-4bed-89b7-eee73cc12aec" (UID: "7150ade9-aa2b-4bed-89b7-eee73cc12aec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.112008 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7150ade9-aa2b-4bed-89b7-eee73cc12aec-kube-api-access-5hkjx" (OuterVolumeSpecName: "kube-api-access-5hkjx") pod "7150ade9-aa2b-4bed-89b7-eee73cc12aec" (UID: "7150ade9-aa2b-4bed-89b7-eee73cc12aec"). InnerVolumeSpecName "kube-api-access-5hkjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.132346 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-config-data" (OuterVolumeSpecName: "config-data") pod "7150ade9-aa2b-4bed-89b7-eee73cc12aec" (UID: "7150ade9-aa2b-4bed-89b7-eee73cc12aec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.145198 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7150ade9-aa2b-4bed-89b7-eee73cc12aec" (UID: "7150ade9-aa2b-4bed-89b7-eee73cc12aec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.169463 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.169555 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.206885 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.206919 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.206932 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7150ade9-aa2b-4bed-89b7-eee73cc12aec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.206943 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hkjx\" (UniqueName: \"kubernetes.io/projected/7150ade9-aa2b-4bed-89b7-eee73cc12aec-kube-api-access-5hkjx\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.699899 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5r96w" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.699879 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5r96w" event={"ID":"7150ade9-aa2b-4bed-89b7-eee73cc12aec","Type":"ContainerDied","Data":"f084d1507ceff1ece203448463912c3b797817cadcc19e7b2d427927eedc2b59"} Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.701073 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f084d1507ceff1ece203448463912c3b797817cadcc19e7b2d427927eedc2b59" Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.702480 4917 generic.go:334] "Generic (PLEG): container finished" podID="9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee" containerID="e0eda9d1182ddb76052050161544ff533285454c7d324252a80e38b3f21a460d" exitCode=0 Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.702697 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563702-ff84h" event={"ID":"9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee","Type":"ContainerDied","Data":"e0eda9d1182ddb76052050161544ff533285454c7d324252a80e38b3f21a460d"} Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.878197 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.878615 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-log" containerID="cri-o://f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031" gracePeriod=30 Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.879254 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-api" containerID="cri-o://7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc" 
gracePeriod=30 Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.888736 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.888955 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" containerID="cri-o://8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" gracePeriod=30 Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.978680 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.978878 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-log" containerID="cri-o://7b08ad1249c59b5a7e65fc8ea1aec2131606293b33d09167c0e08b56af3b99a5" gracePeriod=30 Mar 18 08:22:02 crc kubenswrapper[4917]: I0318 08:22:02.978997 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-metadata" containerID="cri-o://a26e01fa5bef84357e1705e752a1986eb3c10aa325805749bd6763853cb7e6f3" gracePeriod=30 Mar 18 08:22:03 crc kubenswrapper[4917]: I0318 08:22:03.714899 4917 generic.go:334] "Generic (PLEG): container finished" podID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerID="f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031" exitCode=143 Mar 18 08:22:03 crc kubenswrapper[4917]: I0318 08:22:03.715072 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38288673-b4e2-4816-8d0b-70b06458f8b9","Type":"ContainerDied","Data":"f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031"} Mar 18 08:22:03 crc kubenswrapper[4917]: I0318 
08:22:03.721646 4917 generic.go:334] "Generic (PLEG): container finished" podID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerID="7b08ad1249c59b5a7e65fc8ea1aec2131606293b33d09167c0e08b56af3b99a5" exitCode=143 Mar 18 08:22:03 crc kubenswrapper[4917]: I0318 08:22:03.721668 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bdf2a778-c56d-4648-bb9d-3749abcc5ab4","Type":"ContainerDied","Data":"7b08ad1249c59b5a7e65fc8ea1aec2131606293b33d09167c0e08b56af3b99a5"} Mar 18 08:22:03 crc kubenswrapper[4917]: E0318 08:22:03.879992 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:03 crc kubenswrapper[4917]: E0318 08:22:03.882329 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:03 crc kubenswrapper[4917]: E0318 08:22:03.883382 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:03 crc kubenswrapper[4917]: E0318 08:22:03.883418 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:04 crc kubenswrapper[4917]: I0318 08:22:04.049301 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563702-ff84h" Mar 18 08:22:04 crc kubenswrapper[4917]: I0318 08:22:04.162833 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28mm\" (UniqueName: \"kubernetes.io/projected/9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee-kube-api-access-f28mm\") pod \"9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee\" (UID: \"9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee\") " Mar 18 08:22:04 crc kubenswrapper[4917]: I0318 08:22:04.168914 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee-kube-api-access-f28mm" (OuterVolumeSpecName: "kube-api-access-f28mm") pod "9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee" (UID: "9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee"). InnerVolumeSpecName "kube-api-access-f28mm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:22:04 crc kubenswrapper[4917]: I0318 08:22:04.266168 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28mm\" (UniqueName: \"kubernetes.io/projected/9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee-kube-api-access-f28mm\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:04 crc kubenswrapper[4917]: I0318 08:22:04.735848 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563702-ff84h" event={"ID":"9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee","Type":"ContainerDied","Data":"6f24f0a9c7ed2bdf31a3e04fda42f4b4761e0971495ddae47cd85e9983fc4920"} Mar 18 08:22:04 crc kubenswrapper[4917]: I0318 08:22:04.737107 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f24f0a9c7ed2bdf31a3e04fda42f4b4761e0971495ddae47cd85e9983fc4920" Mar 18 08:22:04 crc kubenswrapper[4917]: I0318 08:22:04.736026 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563702-ff84h" Mar 18 08:22:05 crc kubenswrapper[4917]: I0318 08:22:05.131519 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563696-xqzn4"] Mar 18 08:22:05 crc kubenswrapper[4917]: I0318 08:22:05.142477 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563696-xqzn4"] Mar 18 08:22:05 crc kubenswrapper[4917]: I0318 08:22:05.797257 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2133003-1d0c-4244-9516-e3e55cf9491f" path="/var/lib/kubelet/pods/e2133003-1d0c-4244-9516-e3e55cf9491f/volumes" Mar 18 08:22:08 crc kubenswrapper[4917]: I0318 08:22:08.433425 4917 scope.go:117] "RemoveContainer" containerID="5dfaeb703555123358909bc01552eefaf89db4d72d91255704679e066111fa31" Mar 18 08:22:08 crc kubenswrapper[4917]: I0318 08:22:08.479134 4917 scope.go:117] "RemoveContainer" 
containerID="8c428e6693deab537b168147994caa690dd7a782e495fa5a41b0ab97962aea4b" Mar 18 08:22:08 crc kubenswrapper[4917]: I0318 08:22:08.562894 4917 scope.go:117] "RemoveContainer" containerID="9ba15e42b4f651241178cffa01129b1597ae47958d127e4e9ef3a95a61449001" Mar 18 08:22:08 crc kubenswrapper[4917]: I0318 08:22:08.596509 4917 scope.go:117] "RemoveContainer" containerID="a5d0067c66f7e11569b5cc505da1a8d3342fc11ba0785dd4c2f68c765355accf" Mar 18 08:22:08 crc kubenswrapper[4917]: I0318 08:22:08.641804 4917 scope.go:117] "RemoveContainer" containerID="81add7f27779c98728d36361f3d6ec42c0115e7a6edae7ac5e88059e5e59917c" Mar 18 08:22:08 crc kubenswrapper[4917]: E0318 08:22:08.889553 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:08 crc kubenswrapper[4917]: E0318 08:22:08.893671 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:08 crc kubenswrapper[4917]: E0318 08:22:08.898803 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:08 crc kubenswrapper[4917]: E0318 08:22:08.898881 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, 
stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:13 crc kubenswrapper[4917]: E0318 08:22:13.878929 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:13 crc kubenswrapper[4917]: E0318 08:22:13.881904 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:13 crc kubenswrapper[4917]: E0318 08:22:13.885061 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:13 crc kubenswrapper[4917]: E0318 08:22:13.885111 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.817606 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.887130 4917 generic.go:334] "Generic (PLEG): container finished" podID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerID="7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc" exitCode=0 Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.887230 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38288673-b4e2-4816-8d0b-70b06458f8b9","Type":"ContainerDied","Data":"7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc"} Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.887267 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"38288673-b4e2-4816-8d0b-70b06458f8b9","Type":"ContainerDied","Data":"4899d928bafa09032ca5a5876581baebc6f069b4f453826b7e5e10f2c1573b8e"} Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.887263 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.887294 4917 scope.go:117] "RemoveContainer" containerID="7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.891198 4917 generic.go:334] "Generic (PLEG): container finished" podID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerID="a26e01fa5bef84357e1705e752a1986eb3c10aa325805749bd6763853cb7e6f3" exitCode=0 Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.891241 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bdf2a778-c56d-4648-bb9d-3749abcc5ab4","Type":"ContainerDied","Data":"a26e01fa5bef84357e1705e752a1986eb3c10aa325805749bd6763853cb7e6f3"} Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.891266 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bdf2a778-c56d-4648-bb9d-3749abcc5ab4","Type":"ContainerDied","Data":"5c4177eedde97149d90228a19c483e668220398368a11ce7665e0c485b14cbff"} Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.891277 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c4177eedde97149d90228a19c483e668220398368a11ce7665e0c485b14cbff" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.905172 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.913099 4917 scope.go:117] "RemoveContainer" containerID="f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.937558 4917 scope.go:117] "RemoveContainer" containerID="7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc" Mar 18 08:22:16 crc kubenswrapper[4917]: E0318 08:22:16.938043 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc\": container with ID starting with 7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc not found: ID does not exist" containerID="7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938102 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc"} err="failed to get container status \"7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc\": rpc error: code = NotFound desc = could not find container \"7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc\": container with ID starting with 7329d47dd98f79b5572fa674355d370caec31ba1e9c88ae16ad2acc9e50a9ccc not found: ID does not exist" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938136 4917 scope.go:117] "RemoveContainer" containerID="f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938141 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdhkl\" (UniqueName: \"kubernetes.io/projected/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-kube-api-access-kdhkl\") pod \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\" (UID: 
\"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938238 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsbgm\" (UniqueName: \"kubernetes.io/projected/38288673-b4e2-4816-8d0b-70b06458f8b9-kube-api-access-wsbgm\") pod \"38288673-b4e2-4816-8d0b-70b06458f8b9\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938278 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-logs\") pod \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938372 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-config-data\") pod \"38288673-b4e2-4816-8d0b-70b06458f8b9\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938425 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-combined-ca-bundle\") pod \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938456 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-nova-metadata-tls-certs\") pod \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938489 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-config-data\") pod \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\" (UID: \"bdf2a778-c56d-4648-bb9d-3749abcc5ab4\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938655 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-combined-ca-bundle\") pod \"38288673-b4e2-4816-8d0b-70b06458f8b9\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.938752 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38288673-b4e2-4816-8d0b-70b06458f8b9-logs\") pod \"38288673-b4e2-4816-8d0b-70b06458f8b9\" (UID: \"38288673-b4e2-4816-8d0b-70b06458f8b9\") " Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.939035 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-logs" (OuterVolumeSpecName: "logs") pod "bdf2a778-c56d-4648-bb9d-3749abcc5ab4" (UID: "bdf2a778-c56d-4648-bb9d-3749abcc5ab4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.940023 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:16 crc kubenswrapper[4917]: E0318 08:22:16.940123 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031\": container with ID starting with f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031 not found: ID does not exist" containerID="f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.940176 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031"} err="failed to get container status \"f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031\": rpc error: code = NotFound desc = could not find container \"f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031\": container with ID starting with f0a2d29c80a8b2aac271b263d953083076f3aec74018897fa4aa61792492b031 not found: ID does not exist" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.940617 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38288673-b4e2-4816-8d0b-70b06458f8b9-logs" (OuterVolumeSpecName: "logs") pod "38288673-b4e2-4816-8d0b-70b06458f8b9" (UID: "38288673-b4e2-4816-8d0b-70b06458f8b9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.944535 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38288673-b4e2-4816-8d0b-70b06458f8b9-kube-api-access-wsbgm" (OuterVolumeSpecName: "kube-api-access-wsbgm") pod "38288673-b4e2-4816-8d0b-70b06458f8b9" (UID: "38288673-b4e2-4816-8d0b-70b06458f8b9"). InnerVolumeSpecName "kube-api-access-wsbgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.944627 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-kube-api-access-kdhkl" (OuterVolumeSpecName: "kube-api-access-kdhkl") pod "bdf2a778-c56d-4648-bb9d-3749abcc5ab4" (UID: "bdf2a778-c56d-4648-bb9d-3749abcc5ab4"). InnerVolumeSpecName "kube-api-access-kdhkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.966423 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-config-data" (OuterVolumeSpecName: "config-data") pod "38288673-b4e2-4816-8d0b-70b06458f8b9" (UID: "38288673-b4e2-4816-8d0b-70b06458f8b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.971681 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-config-data" (OuterVolumeSpecName: "config-data") pod "bdf2a778-c56d-4648-bb9d-3749abcc5ab4" (UID: "bdf2a778-c56d-4648-bb9d-3749abcc5ab4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.971757 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38288673-b4e2-4816-8d0b-70b06458f8b9" (UID: "38288673-b4e2-4816-8d0b-70b06458f8b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.973395 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdf2a778-c56d-4648-bb9d-3749abcc5ab4" (UID: "bdf2a778-c56d-4648-bb9d-3749abcc5ab4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:16 crc kubenswrapper[4917]: I0318 08:22:16.993386 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bdf2a778-c56d-4648-bb9d-3749abcc5ab4" (UID: "bdf2a778-c56d-4648-bb9d-3749abcc5ab4"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.042125 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.042184 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/38288673-b4e2-4816-8d0b-70b06458f8b9-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.042205 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdhkl\" (UniqueName: \"kubernetes.io/projected/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-kube-api-access-kdhkl\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.042224 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsbgm\" (UniqueName: \"kubernetes.io/projected/38288673-b4e2-4816-8d0b-70b06458f8b9-kube-api-access-wsbgm\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.042243 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38288673-b4e2-4816-8d0b-70b06458f8b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.042259 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.042279 4917 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 
08:22:17.042297 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf2a778-c56d-4648-bb9d-3749abcc5ab4-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.236298 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.256221 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.267769 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:17 crc kubenswrapper[4917]: E0318 08:22:17.268325 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7150ade9-aa2b-4bed-89b7-eee73cc12aec" containerName="nova-manage" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268357 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7150ade9-aa2b-4bed-89b7-eee73cc12aec" containerName="nova-manage" Mar 18 08:22:17 crc kubenswrapper[4917]: E0318 08:22:17.268381 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-metadata" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268390 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-metadata" Mar 18 08:22:17 crc kubenswrapper[4917]: E0318 08:22:17.268407 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-log" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268416 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-log" Mar 18 08:22:17 crc kubenswrapper[4917]: E0318 08:22:17.268432 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-log" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268440 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-log" Mar 18 08:22:17 crc kubenswrapper[4917]: E0318 08:22:17.268464 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee" containerName="oc" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268472 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee" containerName="oc" Mar 18 08:22:17 crc kubenswrapper[4917]: E0318 08:22:17.268485 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-api" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268492 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-api" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268746 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee" containerName="oc" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268776 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-log" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268787 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-log" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268808 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" containerName="nova-metadata-metadata" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268819 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" containerName="nova-api-api" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.268833 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7150ade9-aa2b-4bed-89b7-eee73cc12aec" containerName="nova-manage" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.270154 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.272201 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.280731 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.347673 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.347760 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-logs\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.347803 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-config-data\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.347969 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-zmcx4\" (UniqueName: \"kubernetes.io/projected/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-kube-api-access-zmcx4\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.449934 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.450381 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-logs\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.450426 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-config-data\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.450488 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmcx4\" (UniqueName: \"kubernetes.io/projected/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-kube-api-access-zmcx4\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.451031 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-logs\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: 
I0318 08:22:17.460197 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.460390 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-config-data\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.469199 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmcx4\" (UniqueName: \"kubernetes.io/projected/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-kube-api-access-zmcx4\") pod \"nova-api-0\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.627900 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.787784 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38288673-b4e2-4816-8d0b-70b06458f8b9" path="/var/lib/kubelet/pods/38288673-b4e2-4816-8d0b-70b06458f8b9/volumes" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.898763 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.923141 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.930084 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.938697 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.940508 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.943337 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.943433 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 08:22:17 crc kubenswrapper[4917]: I0318 08:22:17.961431 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.061650 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.061967 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: 
I0318 08:22:18.062078 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c74b75-1b23-437f-bdbf-6888f4087bbb-logs\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.062150 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdr56\" (UniqueName: \"kubernetes.io/projected/e5c74b75-1b23-437f-bdbf-6888f4087bbb-kube-api-access-hdr56\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.062280 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-config-data\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.087052 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.163837 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.163914 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c74b75-1b23-437f-bdbf-6888f4087bbb-logs\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 
08:22:18.163939 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdr56\" (UniqueName: \"kubernetes.io/projected/e5c74b75-1b23-437f-bdbf-6888f4087bbb-kube-api-access-hdr56\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.164013 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-config-data\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.164094 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.165223 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c74b75-1b23-437f-bdbf-6888f4087bbb-logs\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.168972 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-config-data\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.169070 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.171067 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.182133 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdr56\" (UniqueName: \"kubernetes.io/projected/e5c74b75-1b23-437f-bdbf-6888f4087bbb-kube-api-access-hdr56\") pod \"nova-metadata-0\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") " pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.262753 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.815159 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 08:22:18 crc kubenswrapper[4917]: E0318 08:22:18.882732 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:18 crc kubenswrapper[4917]: E0318 08:22:18.884751 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:18 crc kubenswrapper[4917]: E0318 08:22:18.886048 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:18 crc kubenswrapper[4917]: E0318 08:22:18.886094 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.915737 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"dc7c5cb1-7ad2-4435-a02a-b42b4b762942","Type":"ContainerStarted","Data":"b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4"} Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.915778 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7c5cb1-7ad2-4435-a02a-b42b4b762942","Type":"ContainerStarted","Data":"fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a"} Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.915787 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7c5cb1-7ad2-4435-a02a-b42b4b762942","Type":"ContainerStarted","Data":"7e60bb3ae1c1a40311e86cc4935b141e12384475e7273a9ef92394d90df7a6d8"} Mar 18 08:22:18 crc kubenswrapper[4917]: I0318 08:22:18.918460 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c74b75-1b23-437f-bdbf-6888f4087bbb","Type":"ContainerStarted","Data":"b070cc99e96f703a2548a0ab380d8a8b20c85ba95cbd6dd9e4698d8b03b0dc29"} Mar 18 08:22:19 crc kubenswrapper[4917]: I0318 08:22:19.792199 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf2a778-c56d-4648-bb9d-3749abcc5ab4" path="/var/lib/kubelet/pods/bdf2a778-c56d-4648-bb9d-3749abcc5ab4/volumes" Mar 18 08:22:19 crc kubenswrapper[4917]: I0318 08:22:19.949474 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c74b75-1b23-437f-bdbf-6888f4087bbb","Type":"ContainerStarted","Data":"0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a"} Mar 18 08:22:19 crc kubenswrapper[4917]: I0318 08:22:19.949563 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c74b75-1b23-437f-bdbf-6888f4087bbb","Type":"ContainerStarted","Data":"af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd"} Mar 18 08:22:19 crc kubenswrapper[4917]: I0318 08:22:19.986122 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.986086114 podStartE2EDuration="2.986086114s" podCreationTimestamp="2026-03-18 08:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:22:18.940228279 +0000 UTC m=+5723.881383003" watchObservedRunningTime="2026-03-18 08:22:19.986086114 +0000 UTC m=+5724.927240888" Mar 18 08:22:19 crc kubenswrapper[4917]: I0318 08:22:19.999759 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.999728405 podStartE2EDuration="2.999728405s" podCreationTimestamp="2026-03-18 08:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:22:19.982682822 +0000 UTC m=+5724.923837606" watchObservedRunningTime="2026-03-18 08:22:19.999728405 +0000 UTC m=+5724.940883159" Mar 18 08:22:23 crc kubenswrapper[4917]: E0318 08:22:23.879718 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:23 crc kubenswrapper[4917]: E0318 08:22:23.881640 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:23 crc kubenswrapper[4917]: E0318 08:22:23.882943 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: 
cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:23 crc kubenswrapper[4917]: E0318 08:22:23.882998 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:27 crc kubenswrapper[4917]: I0318 08:22:27.629394 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 08:22:27 crc kubenswrapper[4917]: I0318 08:22:27.631330 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 08:22:28 crc kubenswrapper[4917]: I0318 08:22:28.263450 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 08:22:28 crc kubenswrapper[4917]: I0318 08:22:28.263493 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 08:22:28 crc kubenswrapper[4917]: I0318 08:22:28.711933 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.129:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 08:22:28 crc kubenswrapper[4917]: I0318 08:22:28.712511 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.129:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 18 
08:22:28 crc kubenswrapper[4917]: E0318 08:22:28.878864 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:28 crc kubenswrapper[4917]: E0318 08:22:28.880881 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:28 crc kubenswrapper[4917]: E0318 08:22:28.882548 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 18 08:22:28 crc kubenswrapper[4917]: E0318 08:22:28.882630 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:29 crc kubenswrapper[4917]: I0318 08:22:29.276732 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.130:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 08:22:29 crc kubenswrapper[4917]: I0318 08:22:29.276732 4917 
prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.130:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.105704 4917 generic.go:334] "Generic (PLEG): container finished" podID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" exitCode=137 Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.105832 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd4b1f8-f994-49aa-906e-ea6b53b9af40","Type":"ContainerDied","Data":"8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b"} Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.424704 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.558049 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-config-data\") pod \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.558152 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-combined-ca-bundle\") pod \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.558385 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f6rq\" (UniqueName: 
\"kubernetes.io/projected/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-kube-api-access-4f6rq\") pod \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\" (UID: \"dbd4b1f8-f994-49aa-906e-ea6b53b9af40\") " Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.564763 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-kube-api-access-4f6rq" (OuterVolumeSpecName: "kube-api-access-4f6rq") pod "dbd4b1f8-f994-49aa-906e-ea6b53b9af40" (UID: "dbd4b1f8-f994-49aa-906e-ea6b53b9af40"). InnerVolumeSpecName "kube-api-access-4f6rq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.592243 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-config-data" (OuterVolumeSpecName: "config-data") pod "dbd4b1f8-f994-49aa-906e-ea6b53b9af40" (UID: "dbd4b1f8-f994-49aa-906e-ea6b53b9af40"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.613321 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbd4b1f8-f994-49aa-906e-ea6b53b9af40" (UID: "dbd4b1f8-f994-49aa-906e-ea6b53b9af40"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.660581 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.660644 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:33 crc kubenswrapper[4917]: I0318 08:22:33.660665 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f6rq\" (UniqueName: \"kubernetes.io/projected/dbd4b1f8-f994-49aa-906e-ea6b53b9af40-kube-api-access-4f6rq\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.115269 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dbd4b1f8-f994-49aa-906e-ea6b53b9af40","Type":"ContainerDied","Data":"32d56e513dc35fd23ed93a76140b51c64c95f05f9a19741ea775853643541530"} Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.115302 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.115519 4917 scope.go:117] "RemoveContainer" containerID="8a69054e22344218cd129bbe92ab331248f20c7cb5c0062ab9825207988b7f9b" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.136155 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.145872 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.157779 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:22:34 crc kubenswrapper[4917]: E0318 08:22:34.158232 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.158252 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.158453 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" containerName="nova-scheduler-scheduler" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.159117 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.161378 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.166192 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.272344 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szsqs\" (UniqueName: \"kubernetes.io/projected/4c71fad2-8395-413b-8028-b10578eb45e5-kube-api-access-szsqs\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.272426 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-config-data\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.272539 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.373611 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.373981 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szsqs\" (UniqueName: \"kubernetes.io/projected/4c71fad2-8395-413b-8028-b10578eb45e5-kube-api-access-szsqs\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.374102 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-config-data\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.380646 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-config-data\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.380838 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.394791 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szsqs\" (UniqueName: \"kubernetes.io/projected/4c71fad2-8395-413b-8028-b10578eb45e5-kube-api-access-szsqs\") pod \"nova-scheduler-0\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.477024 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 08:22:34 crc kubenswrapper[4917]: I0318 08:22:34.976967 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 08:22:35 crc kubenswrapper[4917]: I0318 08:22:35.148257 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c71fad2-8395-413b-8028-b10578eb45e5","Type":"ContainerStarted","Data":"79eee33b579434d1e3c51a87416451de3d11bf8e0abd9cc4d612b942978d53cb"} Mar 18 08:22:35 crc kubenswrapper[4917]: I0318 08:22:35.629013 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 08:22:35 crc kubenswrapper[4917]: I0318 08:22:35.629130 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 08:22:35 crc kubenswrapper[4917]: I0318 08:22:35.803905 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbd4b1f8-f994-49aa-906e-ea6b53b9af40" path="/var/lib/kubelet/pods/dbd4b1f8-f994-49aa-906e-ea6b53b9af40/volumes" Mar 18 08:22:36 crc kubenswrapper[4917]: I0318 08:22:36.161992 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c71fad2-8395-413b-8028-b10578eb45e5","Type":"ContainerStarted","Data":"56729c5d3ea1726d96d4077b5709e46532c09fde7bfad3870041bb8fde4dd330"} Mar 18 08:22:36 crc kubenswrapper[4917]: I0318 08:22:36.182716 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.182697971 podStartE2EDuration="2.182697971s" podCreationTimestamp="2026-03-18 08:22:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:22:36.176792168 +0000 UTC m=+5741.117946922" watchObservedRunningTime="2026-03-18 08:22:36.182697971 +0000 UTC m=+5741.123852685" Mar 18 08:22:36 crc kubenswrapper[4917]: I0318 
08:22:36.263838 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 08:22:36 crc kubenswrapper[4917]: I0318 08:22:36.263911 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 08:22:37 crc kubenswrapper[4917]: I0318 08:22:37.632095 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 08:22:37 crc kubenswrapper[4917]: I0318 08:22:37.634455 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 08:22:37 crc kubenswrapper[4917]: I0318 08:22:37.637317 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.183425 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.308177 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.309805 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.317640 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.396470 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5645c74dc7-9k229"] Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.398317 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.406121 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5645c74dc7-9k229"] Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.464421 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-nb\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.464499 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-config\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.464555 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2mc\" (UniqueName: \"kubernetes.io/projected/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-kube-api-access-cz2mc\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.464649 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-dns-svc\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.464671 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-sb\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.566251 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-nb\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.566346 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-config\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.566399 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2mc\" (UniqueName: \"kubernetes.io/projected/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-kube-api-access-cz2mc\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.566483 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-dns-svc\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.566507 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-sb\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.567331 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-nb\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.567335 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-sb\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.567416 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-dns-svc\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.567455 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-config\") pod \"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.597902 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2mc\" (UniqueName: \"kubernetes.io/projected/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-kube-api-access-cz2mc\") pod 
\"dnsmasq-dns-5645c74dc7-9k229\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:38 crc kubenswrapper[4917]: I0318 08:22:38.739005 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:39 crc kubenswrapper[4917]: I0318 08:22:39.213022 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 08:22:39 crc kubenswrapper[4917]: I0318 08:22:39.277220 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5645c74dc7-9k229"] Mar 18 08:22:39 crc kubenswrapper[4917]: I0318 08:22:39.477983 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 08:22:40 crc kubenswrapper[4917]: I0318 08:22:40.201891 4917 generic.go:334] "Generic (PLEG): container finished" podID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" containerID="daa4227352c09a3b53e7569f0ab7e8e74ce833b89b2ccf1c13c729d4285c5b8c" exitCode=0 Mar 18 08:22:40 crc kubenswrapper[4917]: I0318 08:22:40.202018 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" event={"ID":"a87fcd6c-0fa1-46c8-b3e4-460366737a7e","Type":"ContainerDied","Data":"daa4227352c09a3b53e7569f0ab7e8e74ce833b89b2ccf1c13c729d4285c5b8c"} Mar 18 08:22:40 crc kubenswrapper[4917]: I0318 08:22:40.202824 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" event={"ID":"a87fcd6c-0fa1-46c8-b3e4-460366737a7e","Type":"ContainerStarted","Data":"a8d7ac0c39c08add89fc47cac6a1f0f3582313f1c07bbcc905b8acd8d14c30ac"} Mar 18 08:22:41 crc kubenswrapper[4917]: I0318 08:22:41.212815 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" 
event={"ID":"a87fcd6c-0fa1-46c8-b3e4-460366737a7e","Type":"ContainerStarted","Data":"451583e8f44362961f425c6156cd9956e9f550da9110cb946c3301a38ed2924b"} Mar 18 08:22:41 crc kubenswrapper[4917]: I0318 08:22:41.241365 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" podStartSLOduration=3.241342385 podStartE2EDuration="3.241342385s" podCreationTimestamp="2026-03-18 08:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:22:41.227901729 +0000 UTC m=+5746.169056453" watchObservedRunningTime="2026-03-18 08:22:41.241342385 +0000 UTC m=+5746.182497119" Mar 18 08:22:41 crc kubenswrapper[4917]: I0318 08:22:41.272461 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:41 crc kubenswrapper[4917]: I0318 08:22:41.272675 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-log" containerID="cri-o://fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a" gracePeriod=30 Mar 18 08:22:41 crc kubenswrapper[4917]: I0318 08:22:41.272775 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-api" containerID="cri-o://b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4" gracePeriod=30 Mar 18 08:22:42 crc kubenswrapper[4917]: I0318 08:22:42.228216 4917 generic.go:334] "Generic (PLEG): container finished" podID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerID="fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a" exitCode=143 Mar 18 08:22:42 crc kubenswrapper[4917]: I0318 08:22:42.229876 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"dc7c5cb1-7ad2-4435-a02a-b42b4b762942","Type":"ContainerDied","Data":"fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a"} Mar 18 08:22:42 crc kubenswrapper[4917]: I0318 08:22:42.229939 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:44 crc kubenswrapper[4917]: I0318 08:22:44.477529 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 08:22:44 crc kubenswrapper[4917]: I0318 08:22:44.522296 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:44.999710 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.099821 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-combined-ca-bundle\") pod \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.099965 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-config-data\") pod \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.100019 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmcx4\" (UniqueName: \"kubernetes.io/projected/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-kube-api-access-zmcx4\") pod \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.100068 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-logs\") pod \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\" (UID: \"dc7c5cb1-7ad2-4435-a02a-b42b4b762942\") " Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.100976 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-logs" (OuterVolumeSpecName: "logs") pod "dc7c5cb1-7ad2-4435-a02a-b42b4b762942" (UID: "dc7c5cb1-7ad2-4435-a02a-b42b4b762942"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.106121 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-kube-api-access-zmcx4" (OuterVolumeSpecName: "kube-api-access-zmcx4") pod "dc7c5cb1-7ad2-4435-a02a-b42b4b762942" (UID: "dc7c5cb1-7ad2-4435-a02a-b42b4b762942"). InnerVolumeSpecName "kube-api-access-zmcx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.136613 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-config-data" (OuterVolumeSpecName: "config-data") pod "dc7c5cb1-7ad2-4435-a02a-b42b4b762942" (UID: "dc7c5cb1-7ad2-4435-a02a-b42b4b762942"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.143878 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc7c5cb1-7ad2-4435-a02a-b42b4b762942" (UID: "dc7c5cb1-7ad2-4435-a02a-b42b4b762942"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.203040 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.203091 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.203101 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmcx4\" (UniqueName: \"kubernetes.io/projected/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-kube-api-access-zmcx4\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.203110 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7c5cb1-7ad2-4435-a02a-b42b4b762942-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.266362 4917 generic.go:334] "Generic (PLEG): container finished" podID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerID="b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4" exitCode=0 Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.266467 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7c5cb1-7ad2-4435-a02a-b42b4b762942","Type":"ContainerDied","Data":"b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4"} Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.266486 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.266518 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"dc7c5cb1-7ad2-4435-a02a-b42b4b762942","Type":"ContainerDied","Data":"7e60bb3ae1c1a40311e86cc4935b141e12384475e7273a9ef92394d90df7a6d8"} Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.266536 4917 scope.go:117] "RemoveContainer" containerID="b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.295130 4917 scope.go:117] "RemoveContainer" containerID="fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.300313 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.300683 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.320746 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.329751 4917 scope.go:117] "RemoveContainer" containerID="b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4" Mar 18 08:22:45 crc kubenswrapper[4917]: E0318 08:22:45.330702 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4\": container with ID starting with b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4 not found: ID does not exist" containerID="b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.330772 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4"} err="failed to get container status \"b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4\": rpc error: code = NotFound desc = could not find container \"b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4\": container with ID starting with b8e9f1a6e6a511a6ce81b56d2efddeec29650790c22a76459eb93a587a3893d4 not found: ID does not exist" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.330808 4917 scope.go:117] "RemoveContainer" containerID="fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a" Mar 18 08:22:45 crc kubenswrapper[4917]: E0318 08:22:45.331260 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a\": container with ID starting with fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a not found: ID does not exist" containerID="fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.331303 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a"} err="failed to get container status \"fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a\": rpc error: code = NotFound desc = could not find container \"fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a\": container with ID starting with fdf87a89bcb9ddec5320373b8e37b1c74b4cc592632aae974c11df6501f45f9a not found: ID does not exist" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.339502 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:45 crc kubenswrapper[4917]: E0318 08:22:45.340034 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-log" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.340053 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-log" Mar 18 08:22:45 crc kubenswrapper[4917]: E0318 08:22:45.340069 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-api" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.340077 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-api" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.340247 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-api" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.340265 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" containerName="nova-api-log" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.341380 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.343373 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.343432 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.345194 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.357275 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.409034 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.409339 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qfq8\" (UniqueName: \"kubernetes.io/projected/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-kube-api-access-5qfq8\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.409435 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.409536 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-public-tls-certs\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.409699 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-logs\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.409799 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-config-data\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.511972 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-logs\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.512943 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-config-data\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.512718 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-logs\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.513235 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.513336 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qfq8\" (UniqueName: \"kubernetes.io/projected/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-kube-api-access-5qfq8\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.513415 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.513512 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-public-tls-certs\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.518074 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.518690 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.519223 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-config-data\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.528827 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-public-tls-certs\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.528994 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qfq8\" (UniqueName: \"kubernetes.io/projected/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-kube-api-access-5qfq8\") pod \"nova-api-0\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") " pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.663199 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 08:22:45 crc kubenswrapper[4917]: I0318 08:22:45.806248 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7c5cb1-7ad2-4435-a02a-b42b4b762942" path="/var/lib/kubelet/pods/dc7c5cb1-7ad2-4435-a02a-b42b4b762942/volumes" Mar 18 08:22:46 crc kubenswrapper[4917]: I0318 08:22:46.153552 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 08:22:46 crc kubenswrapper[4917]: I0318 08:22:46.276858 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af","Type":"ContainerStarted","Data":"52a2c202febb5a69b862b1c09325f15d64703bb73f7869f33db702a20467a465"} Mar 18 08:22:47 crc kubenswrapper[4917]: I0318 08:22:47.287651 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af","Type":"ContainerStarted","Data":"7fc041ac2395b2c811e5ffcfa31fb4c9ae542eadaba3a16769fb5c397fc14505"} Mar 18 08:22:47 crc kubenswrapper[4917]: I0318 08:22:47.288004 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af","Type":"ContainerStarted","Data":"438f26f42f32838aafb20712ca0eac3c5b39291fc19fc6bc965d2d5fb9b1359d"} Mar 18 08:22:47 crc kubenswrapper[4917]: I0318 08:22:47.330832 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.330803839 podStartE2EDuration="2.330803839s" podCreationTimestamp="2026-03-18 08:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:22:47.312422473 +0000 UTC m=+5752.253577287" watchObservedRunningTime="2026-03-18 08:22:47.330803839 +0000 UTC m=+5752.271958583" Mar 18 08:22:48 crc kubenswrapper[4917]: I0318 08:22:48.741797 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:22:48 crc kubenswrapper[4917]: I0318 08:22:48.800847 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b44d78fc-dcmbb"] Mar 18 08:22:48 crc kubenswrapper[4917]: I0318 08:22:48.801208 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" podUID="f4f42d03-8055-41d2-a094-2795879fac0f" containerName="dnsmasq-dns" containerID="cri-o://1f83e74f415788165f2245ebe9f12b53e625bace09100edd3e46077225459a5f" gracePeriod=10 Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.313212 4917 generic.go:334] "Generic (PLEG): container finished" podID="f4f42d03-8055-41d2-a094-2795879fac0f" containerID="1f83e74f415788165f2245ebe9f12b53e625bace09100edd3e46077225459a5f" exitCode=0 Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.313407 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" event={"ID":"f4f42d03-8055-41d2-a094-2795879fac0f","Type":"ContainerDied","Data":"1f83e74f415788165f2245ebe9f12b53e625bace09100edd3e46077225459a5f"} Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.313946 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" event={"ID":"f4f42d03-8055-41d2-a094-2795879fac0f","Type":"ContainerDied","Data":"6b98a91601d7b644bffce5f74b6269dbfcc2bce91260f7d54c7a4684870712a2"} Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.314038 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b98a91601d7b644bffce5f74b6269dbfcc2bce91260f7d54c7a4684870712a2" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.391973 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.495010 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-nb\") pod \"f4f42d03-8055-41d2-a094-2795879fac0f\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.495080 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-dns-svc\") pod \"f4f42d03-8055-41d2-a094-2795879fac0f\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.495203 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5w56\" (UniqueName: \"kubernetes.io/projected/f4f42d03-8055-41d2-a094-2795879fac0f-kube-api-access-x5w56\") pod \"f4f42d03-8055-41d2-a094-2795879fac0f\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.495224 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-config\") pod \"f4f42d03-8055-41d2-a094-2795879fac0f\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.495294 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-sb\") pod \"f4f42d03-8055-41d2-a094-2795879fac0f\" (UID: \"f4f42d03-8055-41d2-a094-2795879fac0f\") " Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.501861 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f4f42d03-8055-41d2-a094-2795879fac0f-kube-api-access-x5w56" (OuterVolumeSpecName: "kube-api-access-x5w56") pod "f4f42d03-8055-41d2-a094-2795879fac0f" (UID: "f4f42d03-8055-41d2-a094-2795879fac0f"). InnerVolumeSpecName "kube-api-access-x5w56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.550339 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f4f42d03-8055-41d2-a094-2795879fac0f" (UID: "f4f42d03-8055-41d2-a094-2795879fac0f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.551960 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4f42d03-8055-41d2-a094-2795879fac0f" (UID: "f4f42d03-8055-41d2-a094-2795879fac0f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.552439 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f4f42d03-8055-41d2-a094-2795879fac0f" (UID: "f4f42d03-8055-41d2-a094-2795879fac0f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.555456 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-config" (OuterVolumeSpecName: "config") pod "f4f42d03-8055-41d2-a094-2795879fac0f" (UID: "f4f42d03-8055-41d2-a094-2795879fac0f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.597719 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.597760 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.597771 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5w56\" (UniqueName: \"kubernetes.io/projected/f4f42d03-8055-41d2-a094-2795879fac0f-kube-api-access-x5w56\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.597779 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:49 crc kubenswrapper[4917]: I0318 08:22:49.597791 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4f42d03-8055-41d2-a094-2795879fac0f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:22:50 crc kubenswrapper[4917]: I0318 08:22:50.322314 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64b44d78fc-dcmbb" Mar 18 08:22:50 crc kubenswrapper[4917]: I0318 08:22:50.349753 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64b44d78fc-dcmbb"] Mar 18 08:22:50 crc kubenswrapper[4917]: I0318 08:22:50.356619 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64b44d78fc-dcmbb"] Mar 18 08:22:51 crc kubenswrapper[4917]: I0318 08:22:51.781952 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f42d03-8055-41d2-a094-2795879fac0f" path="/var/lib/kubelet/pods/f4f42d03-8055-41d2-a094-2795879fac0f/volumes" Mar 18 08:22:55 crc kubenswrapper[4917]: I0318 08:22:55.663714 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 08:22:55 crc kubenswrapper[4917]: I0318 08:22:55.664541 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 08:22:56 crc kubenswrapper[4917]: I0318 08:22:56.669788 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.133:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 08:22:56 crc kubenswrapper[4917]: I0318 08:22:56.675831 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.133:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 08:23:03 crc kubenswrapper[4917]: I0318 08:23:03.664285 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 08:23:03 crc kubenswrapper[4917]: I0318 08:23:03.664878 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 18 08:23:05 crc kubenswrapper[4917]: I0318 08:23:05.671993 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 08:23:05 crc kubenswrapper[4917]: I0318 08:23:05.676225 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 08:23:05 crc kubenswrapper[4917]: I0318 08:23:05.681674 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 08:23:06 crc kubenswrapper[4917]: I0318 08:23:06.529678 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.205503 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fbdf89979-bkddk"] Mar 18 08:23:18 crc kubenswrapper[4917]: E0318 08:23:18.214117 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f42d03-8055-41d2-a094-2795879fac0f" containerName="dnsmasq-dns" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.214146 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f42d03-8055-41d2-a094-2795879fac0f" containerName="dnsmasq-dns" Mar 18 08:23:18 crc kubenswrapper[4917]: E0318 08:23:18.214202 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f42d03-8055-41d2-a094-2795879fac0f" containerName="init" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.214212 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f42d03-8055-41d2-a094-2795879fac0f" containerName="init" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.214623 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f42d03-8055-41d2-a094-2795879fac0f" containerName="dnsmasq-dns" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.216268 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.235511 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.235730 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-w7nr9" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.236039 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.236176 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.236991 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fbdf89979-bkddk"] Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.273159 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.273386 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerName="glance-log" containerID="cri-o://dbfa0f81af6e37352345c1c0ef2ff8f37746bca03d8aed509c46c4bd330e4e42" gracePeriod=30 Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.273520 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerName="glance-httpd" containerID="cri-o://6473b0f3aac70b88f9abd869786c1a73a81aeff81ef68e26ab317d1f4cc5443d" gracePeriod=30 Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.330874 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-566f99d8b7-s264b"] Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.332617 4917 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.338823 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.339065 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerName="glance-log" containerID="cri-o://324fcb1238a9c314e1fd7edcc6d35401edd8a86c3db980db17d27318db0e4249" gracePeriod=30 Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.339114 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerName="glance-httpd" containerID="cri-o://569cb979ff152b171b9c7fd88c371de8e2d06dea0eae265c59852e9973bfc260" gracePeriod=30 Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.364115 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-566f99d8b7-s264b"] Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.415727 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-config-data\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.415796 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-scripts\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.415846 
4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035a076d-3027-4867-b402-a051746ff481-logs\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.415927 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw4ck\" (UniqueName: \"kubernetes.io/projected/035a076d-3027-4867-b402-a051746ff481-kube-api-access-lw4ck\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.415991 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/035a076d-3027-4867-b402-a051746ff481-horizon-secret-key\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.517844 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fb0d5e-9b30-4796-927a-976f8c03f791-logs\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.517905 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/035a076d-3027-4867-b402-a051746ff481-horizon-secret-key\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.517934 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-config-data\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.518068 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32fb0d5e-9b30-4796-927a-976f8c03f791-horizon-secret-key\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.518142 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-scripts\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.518249 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-config-data\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.518291 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035a076d-3027-4867-b402-a051746ff481-logs\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.518335 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw84z\" 
(UniqueName: \"kubernetes.io/projected/32fb0d5e-9b30-4796-927a-976f8c03f791-kube-api-access-mw84z\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.518382 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-scripts\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.518416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw4ck\" (UniqueName: \"kubernetes.io/projected/035a076d-3027-4867-b402-a051746ff481-kube-api-access-lw4ck\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.518687 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-scripts\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.519192 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-config-data\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.519881 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035a076d-3027-4867-b402-a051746ff481-logs\") pod \"horizon-6fbdf89979-bkddk\" (UID: 
\"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.529149 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/035a076d-3027-4867-b402-a051746ff481-horizon-secret-key\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.532735 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw4ck\" (UniqueName: \"kubernetes.io/projected/035a076d-3027-4867-b402-a051746ff481-kube-api-access-lw4ck\") pod \"horizon-6fbdf89979-bkddk\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.546049 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.620228 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fb0d5e-9b30-4796-927a-976f8c03f791-logs\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.620955 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32fb0d5e-9b30-4796-927a-976f8c03f791-horizon-secret-key\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.621156 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-config-data\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.621284 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw84z\" (UniqueName: \"kubernetes.io/projected/32fb0d5e-9b30-4796-927a-976f8c03f791-kube-api-access-mw84z\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.621441 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-scripts\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.621819 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fb0d5e-9b30-4796-927a-976f8c03f791-logs\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.622868 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-scripts\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.624925 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32fb0d5e-9b30-4796-927a-976f8c03f791-horizon-secret-key\") pod \"horizon-566f99d8b7-s264b\" (UID: 
\"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.625707 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-config-data\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.649257 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw84z\" (UniqueName: \"kubernetes.io/projected/32fb0d5e-9b30-4796-927a-976f8c03f791-kube-api-access-mw84z\") pod \"horizon-566f99d8b7-s264b\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.651705 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.665492 4917 generic.go:334] "Generic (PLEG): container finished" podID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerID="324fcb1238a9c314e1fd7edcc6d35401edd8a86c3db980db17d27318db0e4249" exitCode=143 Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.665594 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"265abbaf-11cf-4b9d-a72f-5265a57d02e1","Type":"ContainerDied","Data":"324fcb1238a9c314e1fd7edcc6d35401edd8a86c3db980db17d27318db0e4249"} Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.671413 4917 generic.go:334] "Generic (PLEG): container finished" podID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerID="dbfa0f81af6e37352345c1c0ef2ff8f37746bca03d8aed509c46c4bd330e4e42" exitCode=143 Mar 18 08:23:18 crc kubenswrapper[4917]: I0318 08:23:18.671457 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"047f2cb4-a55b-41cf-88b0-34a29702ec38","Type":"ContainerDied","Data":"dbfa0f81af6e37352345c1c0ef2ff8f37746bca03d8aed509c46c4bd330e4e42"} Mar 18 08:23:19 crc kubenswrapper[4917]: I0318 08:23:19.080485 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fbdf89979-bkddk"] Mar 18 08:23:19 crc kubenswrapper[4917]: I0318 08:23:19.175683 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-566f99d8b7-s264b"] Mar 18 08:23:19 crc kubenswrapper[4917]: W0318 08:23:19.177570 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fb0d5e_9b30_4796_927a_976f8c03f791.slice/crio-addf054e973e98748a9f2b688085f777e736c91115a91d8c6e5d6befa9979550 WatchSource:0}: Error finding container addf054e973e98748a9f2b688085f777e736c91115a91d8c6e5d6befa9979550: Status 404 returned error can't find the container with id addf054e973e98748a9f2b688085f777e736c91115a91d8c6e5d6befa9979550 Mar 18 08:23:19 crc kubenswrapper[4917]: I0318 08:23:19.684620 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbdf89979-bkddk" event={"ID":"035a076d-3027-4867-b402-a051746ff481","Type":"ContainerStarted","Data":"82af7f6f4342561ec662339be2e6d98ee8f879098444532d1240c2475a1db53a"} Mar 18 08:23:19 crc kubenswrapper[4917]: I0318 08:23:19.687795 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566f99d8b7-s264b" event={"ID":"32fb0d5e-9b30-4796-927a-976f8c03f791","Type":"ContainerStarted","Data":"addf054e973e98748a9f2b688085f777e736c91115a91d8c6e5d6befa9979550"} Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.670437 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-566f99d8b7-s264b"] Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.706354 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7d88459b6-s7mzt"] Mar 
18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.712576 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.716518 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.739736 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d88459b6-s7mzt"] Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.761201 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rn4s\" (UniqueName: \"kubernetes.io/projected/1600979d-13bf-4de3-8a5a-a3211880d6e6-kube-api-access-8rn4s\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.761252 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-combined-ca-bundle\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.761275 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1600979d-13bf-4de3-8a5a-a3211880d6e6-logs\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.761308 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-tls-certs\") pod \"horizon-7d88459b6-s7mzt\" 
(UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.761344 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-secret-key\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.761362 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-scripts\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.761406 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-config-data\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.791224 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fbdf89979-bkddk"] Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.825615 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58cff7766d-4ggqr"] Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.828147 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.860863 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58cff7766d-4ggqr"] Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.862802 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-config-data\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.862885 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-secret-key\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.862932 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-tls-certs\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.862955 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mg6s\" (UniqueName: \"kubernetes.io/projected/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-kube-api-access-7mg6s\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.862984 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rn4s\" 
(UniqueName: \"kubernetes.io/projected/1600979d-13bf-4de3-8a5a-a3211880d6e6-kube-api-access-8rn4s\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.863014 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-combined-ca-bundle\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.863030 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-config-data\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.863050 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1600979d-13bf-4de3-8a5a-a3211880d6e6-logs\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.863074 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-tls-certs\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.863105 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-logs\") 
pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.863132 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-secret-key\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.865674 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-combined-ca-bundle\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.865727 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-scripts\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.866129 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1600979d-13bf-4de3-8a5a-a3211880d6e6-logs\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.866111 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-scripts\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " 
pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.867088 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-scripts\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.867494 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-config-data\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.871995 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-secret-key\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.872420 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-combined-ca-bundle\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.877211 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-tls-certs\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.883442 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8rn4s\" (UniqueName: \"kubernetes.io/projected/1600979d-13bf-4de3-8a5a-a3211880d6e6-kube-api-access-8rn4s\") pod \"horizon-7d88459b6-s7mzt\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") " pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.969273 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mg6s\" (UniqueName: \"kubernetes.io/projected/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-kube-api-access-7mg6s\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.969339 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-config-data\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.969397 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-logs\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.969424 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-combined-ca-bundle\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.969449 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-scripts\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.969500 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-secret-key\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.969537 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-tls-certs\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.970296 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-logs\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.970766 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-config-data\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.970995 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-scripts\") pod \"horizon-58cff7766d-4ggqr\" (UID: 
\"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.972516 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-tls-certs\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.973244 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-combined-ca-bundle\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.974191 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-secret-key\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:20 crc kubenswrapper[4917]: I0318 08:23:20.984423 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mg6s\" (UniqueName: \"kubernetes.io/projected/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-kube-api-access-7mg6s\") pod \"horizon-58cff7766d-4ggqr\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.049729 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.164113 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.514377 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7d88459b6-s7mzt"] Mar 18 08:23:21 crc kubenswrapper[4917]: W0318 08:23:21.557704 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1600979d_13bf_4de3_8a5a_a3211880d6e6.slice/crio-3a047492f2fedc313a46220a3031d72b2824500759b1f11e402343f2b17f2c87 WatchSource:0}: Error finding container 3a047492f2fedc313a46220a3031d72b2824500759b1f11e402343f2b17f2c87: Status 404 returned error can't find the container with id 3a047492f2fedc313a46220a3031d72b2824500759b1f11e402343f2b17f2c87 Mar 18 08:23:21 crc kubenswrapper[4917]: W0318 08:23:21.598196 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d857256_d8fa_4b44_b66a_5238f6e7ec1d.slice/crio-6f5650f9c1670bd8c745625301e16768aa93c4cbfc5ba301d6052b8019164aeb WatchSource:0}: Error finding container 6f5650f9c1670bd8c745625301e16768aa93c4cbfc5ba301d6052b8019164aeb: Status 404 returned error can't find the container with id 6f5650f9c1670bd8c745625301e16768aa93c4cbfc5ba301d6052b8019164aeb Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.599251 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58cff7766d-4ggqr"] Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.720255 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cff7766d-4ggqr" event={"ID":"6d857256-d8fa-4b44-b66a-5238f6e7ec1d","Type":"ContainerStarted","Data":"6f5650f9c1670bd8c745625301e16768aa93c4cbfc5ba301d6052b8019164aeb"} Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.725271 4917 generic.go:334] "Generic (PLEG): container finished" podID="047f2cb4-a55b-41cf-88b0-34a29702ec38" 
containerID="6473b0f3aac70b88f9abd869786c1a73a81aeff81ef68e26ab317d1f4cc5443d" exitCode=0 Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.725336 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"047f2cb4-a55b-41cf-88b0-34a29702ec38","Type":"ContainerDied","Data":"6473b0f3aac70b88f9abd869786c1a73a81aeff81ef68e26ab317d1f4cc5443d"} Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.727891 4917 generic.go:334] "Generic (PLEG): container finished" podID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerID="569cb979ff152b171b9c7fd88c371de8e2d06dea0eae265c59852e9973bfc260" exitCode=0 Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.727948 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"265abbaf-11cf-4b9d-a72f-5265a57d02e1","Type":"ContainerDied","Data":"569cb979ff152b171b9c7fd88c371de8e2d06dea0eae265c59852e9973bfc260"} Mar 18 08:23:21 crc kubenswrapper[4917]: I0318 08:23:21.729374 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d88459b6-s7mzt" event={"ID":"1600979d-13bf-4de3-8a5a-a3211880d6e6","Type":"ContainerStarted","Data":"3a047492f2fedc313a46220a3031d72b2824500759b1f11e402343f2b17f2c87"} Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.156241 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.194360 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-httpd-run\") pod \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.194401 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-scripts\") pod \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.194439 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prkpr\" (UniqueName: \"kubernetes.io/projected/265abbaf-11cf-4b9d-a72f-5265a57d02e1-kube-api-access-prkpr\") pod \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.194520 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-logs\") pod \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.194540 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-internal-tls-certs\") pod \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.194557 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-combined-ca-bundle\") pod \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.194592 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-config-data\") pod \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\" (UID: \"265abbaf-11cf-4b9d-a72f-5265a57d02e1\") " Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.195900 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "265abbaf-11cf-4b9d-a72f-5265a57d02e1" (UID: "265abbaf-11cf-4b9d-a72f-5265a57d02e1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.195917 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-logs" (OuterVolumeSpecName: "logs") pod "265abbaf-11cf-4b9d-a72f-5265a57d02e1" (UID: "265abbaf-11cf-4b9d-a72f-5265a57d02e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.204202 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/265abbaf-11cf-4b9d-a72f-5265a57d02e1-kube-api-access-prkpr" (OuterVolumeSpecName: "kube-api-access-prkpr") pod "265abbaf-11cf-4b9d-a72f-5265a57d02e1" (UID: "265abbaf-11cf-4b9d-a72f-5265a57d02e1"). InnerVolumeSpecName "kube-api-access-prkpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.206619 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-scripts" (OuterVolumeSpecName: "scripts") pod "265abbaf-11cf-4b9d-a72f-5265a57d02e1" (UID: "265abbaf-11cf-4b9d-a72f-5265a57d02e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.234790 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "265abbaf-11cf-4b9d-a72f-5265a57d02e1" (UID: "265abbaf-11cf-4b9d-a72f-5265a57d02e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.262109 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "265abbaf-11cf-4b9d-a72f-5265a57d02e1" (UID: "265abbaf-11cf-4b9d-a72f-5265a57d02e1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.293978 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-config-data" (OuterVolumeSpecName: "config-data") pod "265abbaf-11cf-4b9d-a72f-5265a57d02e1" (UID: "265abbaf-11cf-4b9d-a72f-5265a57d02e1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.296744 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.296785 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.296795 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.296804 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prkpr\" (UniqueName: \"kubernetes.io/projected/265abbaf-11cf-4b9d-a72f-5265a57d02e1-kube-api-access-prkpr\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.296815 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/265abbaf-11cf-4b9d-a72f-5265a57d02e1-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.296822 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.296832 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/265abbaf-11cf-4b9d-a72f-5265a57d02e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.747695 4917 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"265abbaf-11cf-4b9d-a72f-5265a57d02e1","Type":"ContainerDied","Data":"42738e941d36fba83f410ca2c60b873c919eab08fe3424846281353ca6a7d155"} Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.748108 4917 scope.go:117] "RemoveContainer" containerID="569cb979ff152b171b9c7fd88c371de8e2d06dea0eae265c59852e9973bfc260" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.747741 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.811828 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.825229 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.837868 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:23:22 crc kubenswrapper[4917]: E0318 08:23:22.838748 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerName="glance-httpd" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.838762 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerName="glance-httpd" Mar 18 08:23:22 crc kubenswrapper[4917]: E0318 08:23:22.838782 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerName="glance-log" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.838789 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerName="glance-log" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.839086 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerName="glance-httpd" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.839106 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" containerName="glance-log" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.840163 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.851513 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.851733 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 18 08:23:22 crc kubenswrapper[4917]: I0318 08:23:22.872483 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.009859 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.009970 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mp6k\" (UniqueName: \"kubernetes.io/projected/1bc2d62e-408e-48bb-9fce-1901055ecbd0-kube-api-access-7mp6k\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.010073 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.010122 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.010200 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1bc2d62e-408e-48bb-9fce-1901055ecbd0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.010261 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bc2d62e-408e-48bb-9fce-1901055ecbd0-logs\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.010330 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.112617 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1bc2d62e-408e-48bb-9fce-1901055ecbd0-logs\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.112679 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.112769 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.112804 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mp6k\" (UniqueName: \"kubernetes.io/projected/1bc2d62e-408e-48bb-9fce-1901055ecbd0-kube-api-access-7mp6k\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.112891 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.112920 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.112941 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1bc2d62e-408e-48bb-9fce-1901055ecbd0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.113494 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1bc2d62e-408e-48bb-9fce-1901055ecbd0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.114089 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1bc2d62e-408e-48bb-9fce-1901055ecbd0-logs\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.121461 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.123935 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.128708 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.131985 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mp6k\" (UniqueName: \"kubernetes.io/projected/1bc2d62e-408e-48bb-9fce-1901055ecbd0-kube-api-access-7mp6k\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.137452 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1bc2d62e-408e-48bb-9fce-1901055ecbd0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1bc2d62e-408e-48bb-9fce-1901055ecbd0\") " pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.178085 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:23 crc kubenswrapper[4917]: I0318 08:23:23.784825 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="265abbaf-11cf-4b9d-a72f-5265a57d02e1" path="/var/lib/kubelet/pods/265abbaf-11cf-4b9d-a72f-5265a57d02e1/volumes" Mar 18 08:23:27 crc kubenswrapper[4917]: I0318 08:23:27.689076 4917 scope.go:117] "RemoveContainer" containerID="324fcb1238a9c314e1fd7edcc6d35401edd8a86c3db980db17d27318db0e4249" Mar 18 08:23:27 crc kubenswrapper[4917]: I0318 08:23:27.819374 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:23:27 crc kubenswrapper[4917]: I0318 08:23:27.819962 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"047f2cb4-a55b-41cf-88b0-34a29702ec38","Type":"ContainerDied","Data":"05fa3c47a0985037f4a4d39835b76c02c47f9ec8d76a533076b03f8155eef3c3"} Mar 18 08:23:27 crc kubenswrapper[4917]: E0318 08:23:27.826056 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324fcb1238a9c314e1fd7edcc6d35401edd8a86c3db980db17d27318db0e4249\": container with ID starting with 324fcb1238a9c314e1fd7edcc6d35401edd8a86c3db980db17d27318db0e4249 not found: ID does not exist" containerID="324fcb1238a9c314e1fd7edcc6d35401edd8a86c3db980db17d27318db0e4249" Mar 18 08:23:27 crc kubenswrapper[4917]: I0318 08:23:27.826072 4917 scope.go:117] "RemoveContainer" containerID="6473b0f3aac70b88f9abd869786c1a73a81aeff81ef68e26ab317d1f4cc5443d" Mar 18 08:23:27 crc kubenswrapper[4917]: I0318 08:23:27.939049 4917 scope.go:117] "RemoveContainer" containerID="dbfa0f81af6e37352345c1c0ef2ff8f37746bca03d8aed509c46c4bd330e4e42" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.013349 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-scripts\") pod \"047f2cb4-a55b-41cf-88b0-34a29702ec38\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.013410 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-config-data\") pod \"047f2cb4-a55b-41cf-88b0-34a29702ec38\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.013485 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-logs\") pod \"047f2cb4-a55b-41cf-88b0-34a29702ec38\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.013536 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-combined-ca-bundle\") pod \"047f2cb4-a55b-41cf-88b0-34a29702ec38\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.013618 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbcg9\" (UniqueName: \"kubernetes.io/projected/047f2cb4-a55b-41cf-88b0-34a29702ec38-kube-api-access-sbcg9\") pod \"047f2cb4-a55b-41cf-88b0-34a29702ec38\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.013685 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-public-tls-certs\") pod \"047f2cb4-a55b-41cf-88b0-34a29702ec38\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.013706 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-httpd-run\") pod \"047f2cb4-a55b-41cf-88b0-34a29702ec38\" (UID: \"047f2cb4-a55b-41cf-88b0-34a29702ec38\") " Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.014513 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "047f2cb4-a55b-41cf-88b0-34a29702ec38" (UID: 
"047f2cb4-a55b-41cf-88b0-34a29702ec38"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.017597 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-scripts" (OuterVolumeSpecName: "scripts") pod "047f2cb4-a55b-41cf-88b0-34a29702ec38" (UID: "047f2cb4-a55b-41cf-88b0-34a29702ec38"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.017833 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-logs" (OuterVolumeSpecName: "logs") pod "047f2cb4-a55b-41cf-88b0-34a29702ec38" (UID: "047f2cb4-a55b-41cf-88b0-34a29702ec38"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.020733 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047f2cb4-a55b-41cf-88b0-34a29702ec38-kube-api-access-sbcg9" (OuterVolumeSpecName: "kube-api-access-sbcg9") pod "047f2cb4-a55b-41cf-88b0-34a29702ec38" (UID: "047f2cb4-a55b-41cf-88b0-34a29702ec38"). InnerVolumeSpecName "kube-api-access-sbcg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.063669 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "047f2cb4-a55b-41cf-88b0-34a29702ec38" (UID: "047f2cb4-a55b-41cf-88b0-34a29702ec38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.093606 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "047f2cb4-a55b-41cf-88b0-34a29702ec38" (UID: "047f2cb4-a55b-41cf-88b0-34a29702ec38"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.118738 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.118770 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.118781 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbcg9\" (UniqueName: \"kubernetes.io/projected/047f2cb4-a55b-41cf-88b0-34a29702ec38-kube-api-access-sbcg9\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.118796 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.118808 4917 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/047f2cb4-a55b-41cf-88b0-34a29702ec38-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.118816 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.226935 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-config-data" (OuterVolumeSpecName: "config-data") pod "047f2cb4-a55b-41cf-88b0-34a29702ec38" (UID: "047f2cb4-a55b-41cf-88b0-34a29702ec38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.321800 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/047f2cb4-a55b-41cf-88b0-34a29702ec38-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.328457 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.847037 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d88459b6-s7mzt" event={"ID":"1600979d-13bf-4de3-8a5a-a3211880d6e6","Type":"ContainerStarted","Data":"5276f59e075a312fd67bf5c8f268c16c3fcb0d8c69ce1a60dd467a6b724bcdca"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.847558 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d88459b6-s7mzt" event={"ID":"1600979d-13bf-4de3-8a5a-a3211880d6e6","Type":"ContainerStarted","Data":"9f28d231735215eac1634c4ad3ab7a5611eff9e512a158a8921a02a0dda0838a"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.850973 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbdf89979-bkddk" event={"ID":"035a076d-3027-4867-b402-a051746ff481","Type":"ContainerStarted","Data":"5a5e2d6de9601a4b348e325efe8c643983d152225f4d268ae22fc57374091857"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.851016 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/horizon-6fbdf89979-bkddk" event={"ID":"035a076d-3027-4867-b402-a051746ff481","Type":"ContainerStarted","Data":"c87c6fbdf41abeeb527bd0ee80a5037f1b0ea28c1e70a89a4e2e716e92cf1154"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.851085 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fbdf89979-bkddk" podUID="035a076d-3027-4867-b402-a051746ff481" containerName="horizon" containerID="cri-o://5a5e2d6de9601a4b348e325efe8c643983d152225f4d268ae22fc57374091857" gracePeriod=30 Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.851070 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fbdf89979-bkddk" podUID="035a076d-3027-4867-b402-a051746ff481" containerName="horizon-log" containerID="cri-o://c87c6fbdf41abeeb527bd0ee80a5037f1b0ea28c1e70a89a4e2e716e92cf1154" gracePeriod=30 Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.857781 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cff7766d-4ggqr" event={"ID":"6d857256-d8fa-4b44-b66a-5238f6e7ec1d","Type":"ContainerStarted","Data":"84fa1262b86e5239f7efe9e5ab2a4f8dbc3014619d0e46f0c362e5529aa4647b"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.857834 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cff7766d-4ggqr" event={"ID":"6d857256-d8fa-4b44-b66a-5238f6e7ec1d","Type":"ContainerStarted","Data":"cd0cf440100899f05139ecacf79add51e9cea5ceed002787e7ade1b6767f5dc5"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.860875 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.886156 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566f99d8b7-s264b" event={"ID":"32fb0d5e-9b30-4796-927a-976f8c03f791","Type":"ContainerStarted","Data":"4913d0c114f64b7a44b0bab4dcf26a3b661d01e163ea9500f1210c171f574fe8"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.886209 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566f99d8b7-s264b" event={"ID":"32fb0d5e-9b30-4796-927a-976f8c03f791","Type":"ContainerStarted","Data":"1d7af1896796da8759893152f077b8ff2d83c6823919864ffcf4e2c53cde5e2d"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.886331 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-566f99d8b7-s264b" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerName="horizon-log" containerID="cri-o://1d7af1896796da8759893152f077b8ff2d83c6823919864ffcf4e2c53cde5e2d" gracePeriod=30 Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.886436 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-566f99d8b7-s264b" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerName="horizon" containerID="cri-o://4913d0c114f64b7a44b0bab4dcf26a3b661d01e163ea9500f1210c171f574fe8" gracePeriod=30 Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.888511 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1bc2d62e-408e-48bb-9fce-1901055ecbd0","Type":"ContainerStarted","Data":"a439f19041833fc85ef49b87f170987499be723b225fe74077a790c0f2a69585"} Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.888628 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7d88459b6-s7mzt" podStartSLOduration=2.67545829 podStartE2EDuration="8.888612174s" podCreationTimestamp="2026-03-18 08:23:20 
+0000 UTC" firstStartedPulling="2026-03-18 08:23:21.560568252 +0000 UTC m=+5786.501722966" lastFinishedPulling="2026-03-18 08:23:27.773722136 +0000 UTC m=+5792.714876850" observedRunningTime="2026-03-18 08:23:28.874058942 +0000 UTC m=+5793.815213666" watchObservedRunningTime="2026-03-18 08:23:28.888612174 +0000 UTC m=+5793.829766898" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.916603 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fbdf89979-bkddk" podStartSLOduration=2.097319885 podStartE2EDuration="10.916560842s" podCreationTimestamp="2026-03-18 08:23:18 +0000 UTC" firstStartedPulling="2026-03-18 08:23:19.094486554 +0000 UTC m=+5784.035641278" lastFinishedPulling="2026-03-18 08:23:27.913727521 +0000 UTC m=+5792.854882235" observedRunningTime="2026-03-18 08:23:28.908370543 +0000 UTC m=+5793.849525257" watchObservedRunningTime="2026-03-18 08:23:28.916560842 +0000 UTC m=+5793.857715556" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.944729 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58cff7766d-4ggqr" podStartSLOduration=2.606376776 podStartE2EDuration="8.944705205s" podCreationTimestamp="2026-03-18 08:23:20 +0000 UTC" firstStartedPulling="2026-03-18 08:23:21.600673675 +0000 UTC m=+5786.541828389" lastFinishedPulling="2026-03-18 08:23:27.939002104 +0000 UTC m=+5792.880156818" observedRunningTime="2026-03-18 08:23:28.933407881 +0000 UTC m=+5793.874562595" watchObservedRunningTime="2026-03-18 08:23:28.944705205 +0000 UTC m=+5793.885859919" Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.985189 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:23:28 crc kubenswrapper[4917]: I0318 08:23:28.995677 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.009756 4917 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:23:29 crc kubenswrapper[4917]: E0318 08:23:29.010274 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerName="glance-httpd" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.010298 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerName="glance-httpd" Mar 18 08:23:29 crc kubenswrapper[4917]: E0318 08:23:29.010322 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerName="glance-log" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.010331 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerName="glance-log" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.010619 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerName="glance-log" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.010644 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" containerName="glance-httpd" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.011091 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-566f99d8b7-s264b" podStartSLOduration=2.417716496 podStartE2EDuration="11.011077445s" podCreationTimestamp="2026-03-18 08:23:18 +0000 UTC" firstStartedPulling="2026-03-18 08:23:19.180461209 +0000 UTC m=+5784.121615913" lastFinishedPulling="2026-03-18 08:23:27.773822128 +0000 UTC m=+5792.714976862" observedRunningTime="2026-03-18 08:23:28.97297783 +0000 UTC m=+5793.914132534" watchObservedRunningTime="2026-03-18 08:23:29.011077445 +0000 UTC m=+5793.952232159" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.011985 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.014462 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.014490 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.026592 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.137103 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-scripts\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.137188 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/944c92dd-53dd-4c82-9dd1-b96bfaff4130-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.137215 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6z8t\" (UniqueName: \"kubernetes.io/projected/944c92dd-53dd-4c82-9dd1-b96bfaff4130-kube-api-access-w6z8t\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.137243 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-config-data\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.137267 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.137292 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.137311 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/944c92dd-53dd-4c82-9dd1-b96bfaff4130-logs\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.239843 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-scripts\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.240932 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/944c92dd-53dd-4c82-9dd1-b96bfaff4130-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.240977 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6z8t\" (UniqueName: \"kubernetes.io/projected/944c92dd-53dd-4c82-9dd1-b96bfaff4130-kube-api-access-w6z8t\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.241038 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-config-data\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.241112 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.241164 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.241185 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/944c92dd-53dd-4c82-9dd1-b96bfaff4130-logs\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.242373 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/944c92dd-53dd-4c82-9dd1-b96bfaff4130-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.245683 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/944c92dd-53dd-4c82-9dd1-b96bfaff4130-logs\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.253434 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.253542 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.254786 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.259692 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/944c92dd-53dd-4c82-9dd1-b96bfaff4130-scripts\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.261572 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6z8t\" (UniqueName: \"kubernetes.io/projected/944c92dd-53dd-4c82-9dd1-b96bfaff4130-kube-api-access-w6z8t\") pod \"glance-default-external-api-0\" (UID: \"944c92dd-53dd-4c82-9dd1-b96bfaff4130\") " pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.340548 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.791215 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047f2cb4-a55b-41cf-88b0-34a29702ec38" path="/var/lib/kubelet/pods/047f2cb4-a55b-41cf-88b0-34a29702ec38/volumes" Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.839273 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.912533 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1bc2d62e-408e-48bb-9fce-1901055ecbd0","Type":"ContainerStarted","Data":"bcc023e5594623e4d84c85623d9119a5ba6977eff0ae4d8315bf7086eae275bd"} Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.912576 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"1bc2d62e-408e-48bb-9fce-1901055ecbd0","Type":"ContainerStarted","Data":"1d0d4ea417971087006b5127deec5cce631b1f6777f8081bd3c088f128c14eac"} Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.920307 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"944c92dd-53dd-4c82-9dd1-b96bfaff4130","Type":"ContainerStarted","Data":"2e0556453ae8d583a10ce0d3e76e8442a498f728e08db9fa1baeca81d86e4afb"} Mar 18 08:23:29 crc kubenswrapper[4917]: I0318 08:23:29.947527 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.947507605 podStartE2EDuration="7.947507605s" podCreationTimestamp="2026-03-18 08:23:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:23:29.93700515 +0000 UTC m=+5794.878159864" watchObservedRunningTime="2026-03-18 08:23:29.947507605 +0000 UTC m=+5794.888662319" Mar 18 08:23:30 crc kubenswrapper[4917]: I0318 08:23:30.933019 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"944c92dd-53dd-4c82-9dd1-b96bfaff4130","Type":"ContainerStarted","Data":"e7c0c3f4c37190682c4d9427ae396bab84694133a5777bbc62861791aede5f1f"} Mar 18 08:23:31 crc kubenswrapper[4917]: I0318 08:23:31.050652 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:31 crc kubenswrapper[4917]: I0318 08:23:31.050931 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:31 crc kubenswrapper[4917]: I0318 08:23:31.165704 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:31 crc kubenswrapper[4917]: I0318 08:23:31.165765 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:31 crc kubenswrapper[4917]: I0318 08:23:31.960115 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"944c92dd-53dd-4c82-9dd1-b96bfaff4130","Type":"ContainerStarted","Data":"fdd208f306850106239de99e175bb8b9980b9c310ff88a0c376e840dd57bf710"} Mar 18 08:23:32 crc kubenswrapper[4917]: I0318 08:23:32.016842 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.01680884 podStartE2EDuration="4.01680884s" podCreationTimestamp="2026-03-18 08:23:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:23:32.001230323 +0000 UTC m=+5796.942385127" watchObservedRunningTime="2026-03-18 08:23:32.01680884 +0000 UTC m=+5796.957963624" Mar 18 08:23:33 crc kubenswrapper[4917]: I0318 08:23:33.178757 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:33 crc kubenswrapper[4917]: I0318 08:23:33.179125 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:33 crc kubenswrapper[4917]: I0318 08:23:33.218225 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:33 crc kubenswrapper[4917]: I0318 08:23:33.221822 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:33 crc kubenswrapper[4917]: I0318 08:23:33.981623 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:33 crc kubenswrapper[4917]: I0318 08:23:33.982175 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Mar 18 08:23:36 crc kubenswrapper[4917]: I0318 08:23:36.855644 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:36 crc kubenswrapper[4917]: I0318 08:23:36.910964 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 18 08:23:38 crc kubenswrapper[4917]: I0318 08:23:38.547061 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:38 crc kubenswrapper[4917]: I0318 08:23:38.652468 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:39 crc kubenswrapper[4917]: I0318 08:23:39.341244 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 08:23:39 crc kubenswrapper[4917]: I0318 08:23:39.341288 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 18 08:23:39 crc kubenswrapper[4917]: I0318 08:23:39.380867 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 08:23:39 crc kubenswrapper[4917]: I0318 08:23:39.401699 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 18 08:23:40 crc kubenswrapper[4917]: I0318 08:23:40.047702 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 08:23:40 crc kubenswrapper[4917]: I0318 08:23:40.048016 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 18 08:23:41 crc kubenswrapper[4917]: I0318 08:23:41.053814 4917 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/horizon-7d88459b6-s7mzt" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.136:8443: connect: connection refused" Mar 18 08:23:41 crc kubenswrapper[4917]: I0318 08:23:41.168231 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-58cff7766d-4ggqr" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.137:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.137:8443: connect: connection refused" Mar 18 08:23:42 crc kubenswrapper[4917]: I0318 08:23:42.003096 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 08:23:42 crc kubenswrapper[4917]: I0318 08:23:42.007163 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 18 08:23:52 crc kubenswrapper[4917]: I0318 08:23:52.883887 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:53 crc kubenswrapper[4917]: I0318 08:23:53.035492 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:54 crc kubenswrapper[4917]: I0318 08:23:54.607887 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:23:54 crc kubenswrapper[4917]: I0318 08:23:54.619047 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7d88459b6-s7mzt" Mar 18 08:23:54 crc kubenswrapper[4917]: I0318 08:23:54.744533 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d88459b6-s7mzt"] Mar 18 08:23:55 crc kubenswrapper[4917]: I0318 08:23:55.246004 4917 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/horizon-7d88459b6-s7mzt" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon-log" containerID="cri-o://9f28d231735215eac1634c4ad3ab7a5611eff9e512a158a8921a02a0dda0838a" gracePeriod=30 Mar 18 08:23:55 crc kubenswrapper[4917]: I0318 08:23:55.246068 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7d88459b6-s7mzt" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon" containerID="cri-o://5276f59e075a312fd67bf5c8f268c16c3fcb0d8c69ce1a60dd467a6b724bcdca" gracePeriod=30 Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.048861 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2baa-account-create-update-ctf24"] Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.061173 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2baa-account-create-update-ctf24"] Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.069197 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-d5rpk"] Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.076425 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-d5rpk"] Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.285731 4917 generic.go:334] "Generic (PLEG): container finished" podID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerID="4913d0c114f64b7a44b0bab4dcf26a3b661d01e163ea9500f1210c171f574fe8" exitCode=137 Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.285754 4917 generic.go:334] "Generic (PLEG): container finished" podID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerID="1d7af1896796da8759893152f077b8ff2d83c6823919864ffcf4e2c53cde5e2d" exitCode=137 Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.285825 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566f99d8b7-s264b" 
event={"ID":"32fb0d5e-9b30-4796-927a-976f8c03f791","Type":"ContainerDied","Data":"4913d0c114f64b7a44b0bab4dcf26a3b661d01e163ea9500f1210c171f574fe8"} Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.285850 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566f99d8b7-s264b" event={"ID":"32fb0d5e-9b30-4796-927a-976f8c03f791","Type":"ContainerDied","Data":"1d7af1896796da8759893152f077b8ff2d83c6823919864ffcf4e2c53cde5e2d"} Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.287900 4917 generic.go:334] "Generic (PLEG): container finished" podID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerID="5276f59e075a312fd67bf5c8f268c16c3fcb0d8c69ce1a60dd467a6b724bcdca" exitCode=0 Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.287933 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d88459b6-s7mzt" event={"ID":"1600979d-13bf-4de3-8a5a-a3211880d6e6","Type":"ContainerDied","Data":"5276f59e075a312fd67bf5c8f268c16c3fcb0d8c69ce1a60dd467a6b724bcdca"} Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.289869 4917 generic.go:334] "Generic (PLEG): container finished" podID="035a076d-3027-4867-b402-a051746ff481" containerID="5a5e2d6de9601a4b348e325efe8c643983d152225f4d268ae22fc57374091857" exitCode=137 Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.289881 4917 generic.go:334] "Generic (PLEG): container finished" podID="035a076d-3027-4867-b402-a051746ff481" containerID="c87c6fbdf41abeeb527bd0ee80a5037f1b0ea28c1e70a89a4e2e716e92cf1154" exitCode=137 Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.289896 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbdf89979-bkddk" event={"ID":"035a076d-3027-4867-b402-a051746ff481","Type":"ContainerDied","Data":"5a5e2d6de9601a4b348e325efe8c643983d152225f4d268ae22fc57374091857"} Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.289910 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-6fbdf89979-bkddk" event={"ID":"035a076d-3027-4867-b402-a051746ff481","Type":"ContainerDied","Data":"c87c6fbdf41abeeb527bd0ee80a5037f1b0ea28c1e70a89a4e2e716e92cf1154"} Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.289920 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fbdf89979-bkddk" event={"ID":"035a076d-3027-4867-b402-a051746ff481","Type":"ContainerDied","Data":"82af7f6f4342561ec662339be2e6d98ee8f879098444532d1240c2475a1db53a"} Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.289930 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82af7f6f4342561ec662339be2e6d98ee8f879098444532d1240c2475a1db53a" Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.367121 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbdf89979-bkddk" Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.370168 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-566f99d8b7-s264b" Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523343 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-scripts\") pod \"035a076d-3027-4867-b402-a051746ff481\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523456 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-config-data\") pod \"035a076d-3027-4867-b402-a051746ff481\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523501 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035a076d-3027-4867-b402-a051746ff481-logs\") pod \"035a076d-3027-4867-b402-a051746ff481\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523528 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw4ck\" (UniqueName: \"kubernetes.io/projected/035a076d-3027-4867-b402-a051746ff481-kube-api-access-lw4ck\") pod \"035a076d-3027-4867-b402-a051746ff481\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523619 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw84z\" (UniqueName: \"kubernetes.io/projected/32fb0d5e-9b30-4796-927a-976f8c03f791-kube-api-access-mw84z\") pod \"32fb0d5e-9b30-4796-927a-976f8c03f791\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523657 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-scripts\") pod \"32fb0d5e-9b30-4796-927a-976f8c03f791\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523679 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-config-data\") pod \"32fb0d5e-9b30-4796-927a-976f8c03f791\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523707 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32fb0d5e-9b30-4796-927a-976f8c03f791-horizon-secret-key\") pod \"32fb0d5e-9b30-4796-927a-976f8c03f791\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523730 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/035a076d-3027-4867-b402-a051746ff481-horizon-secret-key\") pod \"035a076d-3027-4867-b402-a051746ff481\" (UID: \"035a076d-3027-4867-b402-a051746ff481\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.523768 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fb0d5e-9b30-4796-927a-976f8c03f791-logs\") pod \"32fb0d5e-9b30-4796-927a-976f8c03f791\" (UID: \"32fb0d5e-9b30-4796-927a-976f8c03f791\") " Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.524525 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32fb0d5e-9b30-4796-927a-976f8c03f791-logs" (OuterVolumeSpecName: "logs") pod "32fb0d5e-9b30-4796-927a-976f8c03f791" (UID: "32fb0d5e-9b30-4796-927a-976f8c03f791"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.524518 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035a076d-3027-4867-b402-a051746ff481-logs" (OuterVolumeSpecName: "logs") pod "035a076d-3027-4867-b402-a051746ff481" (UID: "035a076d-3027-4867-b402-a051746ff481"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.528601 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035a076d-3027-4867-b402-a051746ff481-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "035a076d-3027-4867-b402-a051746ff481" (UID: "035a076d-3027-4867-b402-a051746ff481"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.529505 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035a076d-3027-4867-b402-a051746ff481-kube-api-access-lw4ck" (OuterVolumeSpecName: "kube-api-access-lw4ck") pod "035a076d-3027-4867-b402-a051746ff481" (UID: "035a076d-3027-4867-b402-a051746ff481"). InnerVolumeSpecName "kube-api-access-lw4ck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.541366 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32fb0d5e-9b30-4796-927a-976f8c03f791-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "32fb0d5e-9b30-4796-927a-976f8c03f791" (UID: "32fb0d5e-9b30-4796-927a-976f8c03f791"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.549183 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32fb0d5e-9b30-4796-927a-976f8c03f791-kube-api-access-mw84z" (OuterVolumeSpecName: "kube-api-access-mw84z") pod "32fb0d5e-9b30-4796-927a-976f8c03f791" (UID: "32fb0d5e-9b30-4796-927a-976f8c03f791"). InnerVolumeSpecName "kube-api-access-mw84z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.550521 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-scripts" (OuterVolumeSpecName: "scripts") pod "035a076d-3027-4867-b402-a051746ff481" (UID: "035a076d-3027-4867-b402-a051746ff481"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.550907 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-config-data" (OuterVolumeSpecName: "config-data") pod "035a076d-3027-4867-b402-a051746ff481" (UID: "035a076d-3027-4867-b402-a051746ff481"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.559270 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-scripts" (OuterVolumeSpecName: "scripts") pod "32fb0d5e-9b30-4796-927a-976f8c03f791" (UID: "32fb0d5e-9b30-4796-927a-976f8c03f791"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.559666 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-config-data" (OuterVolumeSpecName: "config-data") pod "32fb0d5e-9b30-4796-927a-976f8c03f791" (UID: "32fb0d5e-9b30-4796-927a-976f8c03f791"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625709 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw84z\" (UniqueName: \"kubernetes.io/projected/32fb0d5e-9b30-4796-927a-976f8c03f791-kube-api-access-mw84z\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625745 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625755 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32fb0d5e-9b30-4796-927a-976f8c03f791-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625765 4917 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/32fb0d5e-9b30-4796-927a-976f8c03f791-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625773 4917 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/035a076d-3027-4867-b402-a051746ff481-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625781 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/32fb0d5e-9b30-4796-927a-976f8c03f791-logs\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625790 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625800 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/035a076d-3027-4867-b402-a051746ff481-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625808 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035a076d-3027-4867-b402-a051746ff481-logs\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.625820 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw4ck\" (UniqueName: \"kubernetes.io/projected/035a076d-3027-4867-b402-a051746ff481-kube-api-access-lw4ck\") on node \"crc\" DevicePath \"\""
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.787842 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d247d8-d97f-44ba-b769-5f13a5973355" path="/var/lib/kubelet/pods/41d247d8-d97f-44ba-b769-5f13a5973355/volumes"
Mar 18 08:23:59 crc kubenswrapper[4917]: I0318 08:23:59.788417 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550d2002-958b-45b5-8618-ab8065124e2f" path="/var/lib/kubelet/pods/550d2002-958b-45b5-8618-ab8065124e2f/volumes"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.143670 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563704-b6dn6"]
Mar 18 08:24:00 crc kubenswrapper[4917]: E0318 08:24:00.144364 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerName="horizon-log"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.144391 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerName="horizon-log"
Mar 18 08:24:00 crc kubenswrapper[4917]: E0318 08:24:00.144416 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035a076d-3027-4867-b402-a051746ff481" containerName="horizon-log"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.144427 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="035a076d-3027-4867-b402-a051746ff481" containerName="horizon-log"
Mar 18 08:24:00 crc kubenswrapper[4917]: E0318 08:24:00.144468 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerName="horizon"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.144481 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerName="horizon"
Mar 18 08:24:00 crc kubenswrapper[4917]: E0318 08:24:00.144508 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035a076d-3027-4867-b402-a051746ff481" containerName="horizon"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.144522 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="035a076d-3027-4867-b402-a051746ff481" containerName="horizon"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.144868 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="035a076d-3027-4867-b402-a051746ff481" containerName="horizon-log"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.144901 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="035a076d-3027-4867-b402-a051746ff481" containerName="horizon"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.144919 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerName="horizon"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.144934 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" containerName="horizon-log"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.145965 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563704-b6dn6"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.148864 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.149111 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.151636 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563704-b6dn6"]
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.151669 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.238486 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chnfb\" (UniqueName: \"kubernetes.io/projected/733cf5ce-2b5c-49fb-a482-1594f3c449d8-kube-api-access-chnfb\") pod \"auto-csr-approver-29563704-b6dn6\" (UID: \"733cf5ce-2b5c-49fb-a482-1594f3c449d8\") " pod="openshift-infra/auto-csr-approver-29563704-b6dn6"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.301681 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fbdf89979-bkddk"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.301700 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-566f99d8b7-s264b" event={"ID":"32fb0d5e-9b30-4796-927a-976f8c03f791","Type":"ContainerDied","Data":"addf054e973e98748a9f2b688085f777e736c91115a91d8c6e5d6befa9979550"}
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.301762 4917 scope.go:117] "RemoveContainer" containerID="4913d0c114f64b7a44b0bab4dcf26a3b661d01e163ea9500f1210c171f574fe8"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.301681 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-566f99d8b7-s264b"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.336779 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fbdf89979-bkddk"]
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.341624 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chnfb\" (UniqueName: \"kubernetes.io/projected/733cf5ce-2b5c-49fb-a482-1594f3c449d8-kube-api-access-chnfb\") pod \"auto-csr-approver-29563704-b6dn6\" (UID: \"733cf5ce-2b5c-49fb-a482-1594f3c449d8\") " pod="openshift-infra/auto-csr-approver-29563704-b6dn6"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.356177 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fbdf89979-bkddk"]
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.366861 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-566f99d8b7-s264b"]
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.371954 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chnfb\" (UniqueName: \"kubernetes.io/projected/733cf5ce-2b5c-49fb-a482-1594f3c449d8-kube-api-access-chnfb\") pod \"auto-csr-approver-29563704-b6dn6\" (UID: \"733cf5ce-2b5c-49fb-a482-1594f3c449d8\") " pod="openshift-infra/auto-csr-approver-29563704-b6dn6"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.378496 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-566f99d8b7-s264b"]
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.461507 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563704-b6dn6"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.502647 4917 scope.go:117] "RemoveContainer" containerID="1d7af1896796da8759893152f077b8ff2d83c6823919864ffcf4e2c53cde5e2d"
Mar 18 08:24:00 crc kubenswrapper[4917]: I0318 08:24:00.985345 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563704-b6dn6"]
Mar 18 08:24:01 crc kubenswrapper[4917]: I0318 08:24:01.051281 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d88459b6-s7mzt" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.136:8443: connect: connection refused"
Mar 18 08:24:01 crc kubenswrapper[4917]: I0318 08:24:01.313678 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563704-b6dn6" event={"ID":"733cf5ce-2b5c-49fb-a482-1594f3c449d8","Type":"ContainerStarted","Data":"5b48581960de19592a5af601247fbca1d02574999d77a0ded8db0976cbc1f2a8"}
Mar 18 08:24:01 crc kubenswrapper[4917]: I0318 08:24:01.789389 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035a076d-3027-4867-b402-a051746ff481" path="/var/lib/kubelet/pods/035a076d-3027-4867-b402-a051746ff481/volumes"
Mar 18 08:24:01 crc kubenswrapper[4917]: I0318 08:24:01.790698 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32fb0d5e-9b30-4796-927a-976f8c03f791" path="/var/lib/kubelet/pods/32fb0d5e-9b30-4796-927a-976f8c03f791/volumes"
Mar 18 08:24:02 crc kubenswrapper[4917]: I0318 08:24:02.325954 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563704-b6dn6" event={"ID":"733cf5ce-2b5c-49fb-a482-1594f3c449d8","Type":"ContainerStarted","Data":"d9e0d6c79443b937bb561dffbc9d2f8eae57ec7e81fc0055e4c49e59ba417e12"}
Mar 18 08:24:02 crc kubenswrapper[4917]: I0318 08:24:02.351650 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563704-b6dn6" podStartSLOduration=1.449864743 podStartE2EDuration="2.351633103s" podCreationTimestamp="2026-03-18 08:24:00 +0000 UTC" firstStartedPulling="2026-03-18 08:24:01.001754075 +0000 UTC m=+5825.942908789" lastFinishedPulling="2026-03-18 08:24:01.903522395 +0000 UTC m=+5826.844677149" observedRunningTime="2026-03-18 08:24:02.351307115 +0000 UTC m=+5827.292461839" watchObservedRunningTime="2026-03-18 08:24:02.351633103 +0000 UTC m=+5827.292787837"
Mar 18 08:24:02 crc kubenswrapper[4917]: I0318 08:24:02.955080 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 08:24:02 crc kubenswrapper[4917]: I0318 08:24:02.955154 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 08:24:03 crc kubenswrapper[4917]: I0318 08:24:03.340158 4917 generic.go:334] "Generic (PLEG): container finished" podID="733cf5ce-2b5c-49fb-a482-1594f3c449d8" containerID="d9e0d6c79443b937bb561dffbc9d2f8eae57ec7e81fc0055e4c49e59ba417e12" exitCode=0
Mar 18 08:24:03 crc kubenswrapper[4917]: I0318 08:24:03.340267 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563704-b6dn6" event={"ID":"733cf5ce-2b5c-49fb-a482-1594f3c449d8","Type":"ContainerDied","Data":"d9e0d6c79443b937bb561dffbc9d2f8eae57ec7e81fc0055e4c49e59ba417e12"}
Mar 18 08:24:04 crc kubenswrapper[4917]: I0318 08:24:04.786122 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563704-b6dn6"
Mar 18 08:24:04 crc kubenswrapper[4917]: I0318 08:24:04.797059 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chnfb\" (UniqueName: \"kubernetes.io/projected/733cf5ce-2b5c-49fb-a482-1594f3c449d8-kube-api-access-chnfb\") pod \"733cf5ce-2b5c-49fb-a482-1594f3c449d8\" (UID: \"733cf5ce-2b5c-49fb-a482-1594f3c449d8\") "
Mar 18 08:24:04 crc kubenswrapper[4917]: I0318 08:24:04.809857 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/733cf5ce-2b5c-49fb-a482-1594f3c449d8-kube-api-access-chnfb" (OuterVolumeSpecName: "kube-api-access-chnfb") pod "733cf5ce-2b5c-49fb-a482-1594f3c449d8" (UID: "733cf5ce-2b5c-49fb-a482-1594f3c449d8"). InnerVolumeSpecName "kube-api-access-chnfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:24:04 crc kubenswrapper[4917]: I0318 08:24:04.899676 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chnfb\" (UniqueName: \"kubernetes.io/projected/733cf5ce-2b5c-49fb-a482-1594f3c449d8-kube-api-access-chnfb\") on node \"crc\" DevicePath \"\""
Mar 18 08:24:05 crc kubenswrapper[4917]: I0318 08:24:05.367632 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563704-b6dn6" event={"ID":"733cf5ce-2b5c-49fb-a482-1594f3c449d8","Type":"ContainerDied","Data":"5b48581960de19592a5af601247fbca1d02574999d77a0ded8db0976cbc1f2a8"}
Mar 18 08:24:05 crc kubenswrapper[4917]: I0318 08:24:05.367680 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563704-b6dn6"
Mar 18 08:24:05 crc kubenswrapper[4917]: I0318 08:24:05.367695 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b48581960de19592a5af601247fbca1d02574999d77a0ded8db0976cbc1f2a8"
Mar 18 08:24:05 crc kubenswrapper[4917]: I0318 08:24:05.437447 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563698-xss2r"]
Mar 18 08:24:05 crc kubenswrapper[4917]: I0318 08:24:05.451413 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563698-xss2r"]
Mar 18 08:24:05 crc kubenswrapper[4917]: I0318 08:24:05.785907 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96877157-34b5-49d3-a3b4-e7db7a2952ff" path="/var/lib/kubelet/pods/96877157-34b5-49d3-a3b4-e7db7a2952ff/volumes"
Mar 18 08:24:08 crc kubenswrapper[4917]: I0318 08:24:08.987488 4917 scope.go:117] "RemoveContainer" containerID="622916c3b4ad66a4938df690f52d88a166aa619fa4c03fe7bef870621bbcb30d"
Mar 18 08:24:09 crc kubenswrapper[4917]: I0318 08:24:09.050188 4917 scope.go:117] "RemoveContainer" containerID="cbd24c68189d983b64dca22a063d908da5a1b7d1d707de951fd6a726ad633846"
Mar 18 08:24:09 crc kubenswrapper[4917]: I0318 08:24:09.130912 4917 scope.go:117] "RemoveContainer" containerID="e3fd0f4211d8d8b2af47925b0efb65b133f4de0ea28bc94e5240b1d179bfe9f4"
Mar 18 08:24:10 crc kubenswrapper[4917]: I0318 08:24:10.036307 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lxc4g"]
Mar 18 08:24:10 crc kubenswrapper[4917]: I0318 08:24:10.044910 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lxc4g"]
Mar 18 08:24:11 crc kubenswrapper[4917]: I0318 08:24:11.051300 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d88459b6-s7mzt" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.136:8443: connect: connection refused"
Mar 18 08:24:11 crc kubenswrapper[4917]: I0318 08:24:11.789918 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107f1f76-b32c-4371-aca8-e5253102d6cb" path="/var/lib/kubelet/pods/107f1f76-b32c-4371-aca8-e5253102d6cb/volumes"
Mar 18 08:24:21 crc kubenswrapper[4917]: I0318 08:24:21.051113 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7d88459b6-s7mzt" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.136:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.136:8443: connect: connection refused"
Mar 18 08:24:21 crc kubenswrapper[4917]: I0318 08:24:21.052024 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7d88459b6-s7mzt"
Mar 18 08:24:25 crc kubenswrapper[4917]: E0318 08:24:25.312579 4917 manager.go:1116] Failed to create existing container: /kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32fb0d5e_9b30_4796_927a_976f8c03f791.slice/crio-addf054e973e98748a9f2b688085f777e736c91115a91d8c6e5d6befa9979550: Error finding container addf054e973e98748a9f2b688085f777e736c91115a91d8c6e5d6befa9979550: Status 404 returned error can't find the container with id addf054e973e98748a9f2b688085f777e736c91115a91d8c6e5d6befa9979550
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.593323 4917 generic.go:334] "Generic (PLEG): container finished" podID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerID="9f28d231735215eac1634c4ad3ab7a5611eff9e512a158a8921a02a0dda0838a" exitCode=137
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.593646 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d88459b6-s7mzt" event={"ID":"1600979d-13bf-4de3-8a5a-a3211880d6e6","Type":"ContainerDied","Data":"9f28d231735215eac1634c4ad3ab7a5611eff9e512a158a8921a02a0dda0838a"}
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.675039 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d88459b6-s7mzt"
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.714132 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-config-data\") pod \"1600979d-13bf-4de3-8a5a-a3211880d6e6\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") "
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.714173 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-tls-certs\") pod \"1600979d-13bf-4de3-8a5a-a3211880d6e6\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") "
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.714218 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rn4s\" (UniqueName: \"kubernetes.io/projected/1600979d-13bf-4de3-8a5a-a3211880d6e6-kube-api-access-8rn4s\") pod \"1600979d-13bf-4de3-8a5a-a3211880d6e6\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") "
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.714284 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-secret-key\") pod \"1600979d-13bf-4de3-8a5a-a3211880d6e6\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") "
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.714338 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-combined-ca-bundle\") pod \"1600979d-13bf-4de3-8a5a-a3211880d6e6\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") "
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.714404 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-scripts\") pod \"1600979d-13bf-4de3-8a5a-a3211880d6e6\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") "
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.714438 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1600979d-13bf-4de3-8a5a-a3211880d6e6-logs\") pod \"1600979d-13bf-4de3-8a5a-a3211880d6e6\" (UID: \"1600979d-13bf-4de3-8a5a-a3211880d6e6\") "
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.715087 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1600979d-13bf-4de3-8a5a-a3211880d6e6-logs" (OuterVolumeSpecName: "logs") pod "1600979d-13bf-4de3-8a5a-a3211880d6e6" (UID: "1600979d-13bf-4de3-8a5a-a3211880d6e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.721888 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1600979d-13bf-4de3-8a5a-a3211880d6e6-kube-api-access-8rn4s" (OuterVolumeSpecName: "kube-api-access-8rn4s") pod "1600979d-13bf-4de3-8a5a-a3211880d6e6" (UID: "1600979d-13bf-4de3-8a5a-a3211880d6e6"). InnerVolumeSpecName "kube-api-access-8rn4s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.722779 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1600979d-13bf-4de3-8a5a-a3211880d6e6" (UID: "1600979d-13bf-4de3-8a5a-a3211880d6e6"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.741649 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-config-data" (OuterVolumeSpecName: "config-data") pod "1600979d-13bf-4de3-8a5a-a3211880d6e6" (UID: "1600979d-13bf-4de3-8a5a-a3211880d6e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.744447 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-scripts" (OuterVolumeSpecName: "scripts") pod "1600979d-13bf-4de3-8a5a-a3211880d6e6" (UID: "1600979d-13bf-4de3-8a5a-a3211880d6e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.747704 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1600979d-13bf-4de3-8a5a-a3211880d6e6" (UID: "1600979d-13bf-4de3-8a5a-a3211880d6e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.766123 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1600979d-13bf-4de3-8a5a-a3211880d6e6" (UID: "1600979d-13bf-4de3-8a5a-a3211880d6e6"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.816087 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.816275 4917 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.816288 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rn4s\" (UniqueName: \"kubernetes.io/projected/1600979d-13bf-4de3-8a5a-a3211880d6e6-kube-api-access-8rn4s\") on node \"crc\" DevicePath \"\""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.816299 4917 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.816309 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1600979d-13bf-4de3-8a5a-a3211880d6e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.816320 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1600979d-13bf-4de3-8a5a-a3211880d6e6-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 08:24:25 crc kubenswrapper[4917]: I0318 08:24:25.816329 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1600979d-13bf-4de3-8a5a-a3211880d6e6-logs\") on node \"crc\" DevicePath \"\""
Mar 18 08:24:26 crc kubenswrapper[4917]: I0318 08:24:26.608377 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7d88459b6-s7mzt" event={"ID":"1600979d-13bf-4de3-8a5a-a3211880d6e6","Type":"ContainerDied","Data":"3a047492f2fedc313a46220a3031d72b2824500759b1f11e402343f2b17f2c87"}
Mar 18 08:24:26 crc kubenswrapper[4917]: I0318 08:24:26.608474 4917 scope.go:117] "RemoveContainer" containerID="5276f59e075a312fd67bf5c8f268c16c3fcb0d8c69ce1a60dd467a6b724bcdca"
Mar 18 08:24:26 crc kubenswrapper[4917]: I0318 08:24:26.608473 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7d88459b6-s7mzt"
Mar 18 08:24:26 crc kubenswrapper[4917]: I0318 08:24:26.649999 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7d88459b6-s7mzt"]
Mar 18 08:24:26 crc kubenswrapper[4917]: I0318 08:24:26.665853 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7d88459b6-s7mzt"]
Mar 18 08:24:26 crc kubenswrapper[4917]: I0318 08:24:26.883694 4917 scope.go:117] "RemoveContainer" containerID="9f28d231735215eac1634c4ad3ab7a5611eff9e512a158a8921a02a0dda0838a"
Mar 18 08:24:27 crc kubenswrapper[4917]: I0318 08:24:27.793047 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" path="/var/lib/kubelet/pods/1600979d-13bf-4de3-8a5a-a3211880d6e6/volumes"
Mar 18 08:24:32 crc kubenswrapper[4917]: I0318 08:24:32.929270 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 08:24:32 crc kubenswrapper[4917]: I0318 08:24:32.930099 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.388624 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5995799978-72tkb"]
Mar 18 08:24:35 crc kubenswrapper[4917]: E0318 08:24:35.389321 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.389336 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon"
Mar 18 08:24:35 crc kubenswrapper[4917]: E0318 08:24:35.389363 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="733cf5ce-2b5c-49fb-a482-1594f3c449d8" containerName="oc"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.389368 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="733cf5ce-2b5c-49fb-a482-1594f3c449d8" containerName="oc"
Mar 18 08:24:35 crc kubenswrapper[4917]: E0318 08:24:35.389381 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon-log"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.389387 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon-log"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.389571 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon-log"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.389602 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1600979d-13bf-4de3-8a5a-a3211880d6e6" containerName="horizon"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.389617 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="733cf5ce-2b5c-49fb-a482-1594f3c449d8" containerName="oc"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.390496 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.408354 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5995799978-72tkb"]
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.553869 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-horizon-secret-key\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.553961 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd988f0-ae23-4f45-8247-29abf7b143c4-logs\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.554055 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-combined-ca-bundle\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.554098 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-horizon-tls-certs\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.554136 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cd988f0-ae23-4f45-8247-29abf7b143c4-config-data\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.554246 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2cd988f0-ae23-4f45-8247-29abf7b143c4-scripts\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.554305 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7lvw\" (UniqueName: \"kubernetes.io/projected/2cd988f0-ae23-4f45-8247-29abf7b143c4-kube-api-access-w7lvw\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.655733 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-horizon-secret-key\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.655839 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd988f0-ae23-4f45-8247-29abf7b143c4-logs\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.655866 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-combined-ca-bundle\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.655902 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-horizon-tls-certs\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.655935 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cd988f0-ae23-4f45-8247-29abf7b143c4-config-data\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.655965 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2cd988f0-ae23-4f45-8247-29abf7b143c4-scripts\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.655986 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7lvw\" (UniqueName: \"kubernetes.io/projected/2cd988f0-ae23-4f45-8247-29abf7b143c4-kube-api-access-w7lvw\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb"
Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.656483 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName:
\"kubernetes.io/empty-dir/2cd988f0-ae23-4f45-8247-29abf7b143c4-logs\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.657732 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2cd988f0-ae23-4f45-8247-29abf7b143c4-config-data\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.658209 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2cd988f0-ae23-4f45-8247-29abf7b143c4-scripts\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.662464 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-combined-ca-bundle\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.663292 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-horizon-secret-key\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.663519 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cd988f0-ae23-4f45-8247-29abf7b143c4-horizon-tls-certs\") pod \"horizon-5995799978-72tkb\" (UID: 
\"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.671784 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7lvw\" (UniqueName: \"kubernetes.io/projected/2cd988f0-ae23-4f45-8247-29abf7b143c4-kube-api-access-w7lvw\") pod \"horizon-5995799978-72tkb\" (UID: \"2cd988f0-ae23-4f45-8247-29abf7b143c4\") " pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:35 crc kubenswrapper[4917]: I0318 08:24:35.719313 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.228221 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5995799978-72tkb"] Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.675345 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-4jrlf"] Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.677063 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.693761 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-4jrlf"] Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.728953 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5995799978-72tkb" event={"ID":"2cd988f0-ae23-4f45-8247-29abf7b143c4","Type":"ContainerStarted","Data":"7239e8173e0b8e9c4d46e2af27cd7a0e68bf5b04637300d6b3b8382fe6a2f174"} Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.729046 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5995799978-72tkb" event={"ID":"2cd988f0-ae23-4f45-8247-29abf7b143c4","Type":"ContainerStarted","Data":"b60dfee51cbfaef66514d55fa8d2f8026fa62fc2a5e177dde95cf4c476157f83"} Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.778555 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-9c12-account-create-update-2dh7r"] Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.781369 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/305453bf-9adc-46d4-b810-fcd1e8f67a77-operator-scripts\") pod \"heat-db-create-4jrlf\" (UID: \"305453bf-9adc-46d4-b810-fcd1e8f67a77\") " pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.781411 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdtc\" (UniqueName: \"kubernetes.io/projected/305453bf-9adc-46d4-b810-fcd1e8f67a77-kube-api-access-trdtc\") pod \"heat-db-create-4jrlf\" (UID: \"305453bf-9adc-46d4-b810-fcd1e8f67a77\") " pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.782054 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.784367 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.794324 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9c12-account-create-update-2dh7r"] Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.884045 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-operator-scripts\") pod \"heat-9c12-account-create-update-2dh7r\" (UID: \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\") " pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.885285 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfksq\" (UniqueName: \"kubernetes.io/projected/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-kube-api-access-lfksq\") pod \"heat-9c12-account-create-update-2dh7r\" (UID: \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\") " pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.885435 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/305453bf-9adc-46d4-b810-fcd1e8f67a77-operator-scripts\") pod \"heat-db-create-4jrlf\" (UID: \"305453bf-9adc-46d4-b810-fcd1e8f67a77\") " pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.885495 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trdtc\" (UniqueName: \"kubernetes.io/projected/305453bf-9adc-46d4-b810-fcd1e8f67a77-kube-api-access-trdtc\") pod \"heat-db-create-4jrlf\" (UID: 
\"305453bf-9adc-46d4-b810-fcd1e8f67a77\") " pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.886557 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/305453bf-9adc-46d4-b810-fcd1e8f67a77-operator-scripts\") pod \"heat-db-create-4jrlf\" (UID: \"305453bf-9adc-46d4-b810-fcd1e8f67a77\") " pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.907898 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trdtc\" (UniqueName: \"kubernetes.io/projected/305453bf-9adc-46d4-b810-fcd1e8f67a77-kube-api-access-trdtc\") pod \"heat-db-create-4jrlf\" (UID: \"305453bf-9adc-46d4-b810-fcd1e8f67a77\") " pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.987362 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-operator-scripts\") pod \"heat-9c12-account-create-update-2dh7r\" (UID: \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\") " pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.987493 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfksq\" (UniqueName: \"kubernetes.io/projected/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-kube-api-access-lfksq\") pod \"heat-9c12-account-create-update-2dh7r\" (UID: \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\") " pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:36 crc kubenswrapper[4917]: I0318 08:24:36.988294 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-operator-scripts\") pod \"heat-9c12-account-create-update-2dh7r\" (UID: 
\"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\") " pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:37 crc kubenswrapper[4917]: I0318 08:24:37.007365 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfksq\" (UniqueName: \"kubernetes.io/projected/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-kube-api-access-lfksq\") pod \"heat-9c12-account-create-update-2dh7r\" (UID: \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\") " pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:37 crc kubenswrapper[4917]: I0318 08:24:37.057175 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:37 crc kubenswrapper[4917]: I0318 08:24:37.103435 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:37 crc kubenswrapper[4917]: I0318 08:24:37.584992 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-4jrlf"] Mar 18 08:24:37 crc kubenswrapper[4917]: I0318 08:24:37.759690 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5995799978-72tkb" event={"ID":"2cd988f0-ae23-4f45-8247-29abf7b143c4","Type":"ContainerStarted","Data":"86348f99cc6444f2ffefda0cea98c9e63a2b8f96e45c9667e740cf8c2256d00a"} Mar 18 08:24:37 crc kubenswrapper[4917]: I0318 08:24:37.765517 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4jrlf" event={"ID":"305453bf-9adc-46d4-b810-fcd1e8f67a77","Type":"ContainerStarted","Data":"30a97be4982553752773c801a3999a912bbd91f738fae09bd792894cb601e205"} Mar 18 08:24:37 crc kubenswrapper[4917]: I0318 08:24:37.795843 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-9c12-account-create-update-2dh7r"] Mar 18 08:24:37 crc kubenswrapper[4917]: I0318 08:24:37.808037 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/horizon-5995799978-72tkb" podStartSLOduration=2.808019114 podStartE2EDuration="2.808019114s" podCreationTimestamp="2026-03-18 08:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:24:37.796420533 +0000 UTC m=+5862.737575257" watchObservedRunningTime="2026-03-18 08:24:37.808019114 +0000 UTC m=+5862.749173828" Mar 18 08:24:38 crc kubenswrapper[4917]: I0318 08:24:38.784059 4917 generic.go:334] "Generic (PLEG): container finished" podID="f16e6bc3-1886-4de2-83ca-aae1c7f3d44f" containerID="d9e3c6db6cc71c1006b65a7bbb203387622f774bbce48f15cd4e49c3be902f96" exitCode=0 Mar 18 08:24:38 crc kubenswrapper[4917]: I0318 08:24:38.784438 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9c12-account-create-update-2dh7r" event={"ID":"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f","Type":"ContainerDied","Data":"d9e3c6db6cc71c1006b65a7bbb203387622f774bbce48f15cd4e49c3be902f96"} Mar 18 08:24:38 crc kubenswrapper[4917]: I0318 08:24:38.784466 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9c12-account-create-update-2dh7r" event={"ID":"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f","Type":"ContainerStarted","Data":"67d2707434c637894e9dcd2fa58bb515748728a02b4494edeb6b2cb7f49ac45a"} Mar 18 08:24:38 crc kubenswrapper[4917]: I0318 08:24:38.787365 4917 generic.go:334] "Generic (PLEG): container finished" podID="305453bf-9adc-46d4-b810-fcd1e8f67a77" containerID="c08f7ae69d431cb796ae6e5cf42f147051f26eb79004457c6e60321f1ea36c15" exitCode=0 Mar 18 08:24:38 crc kubenswrapper[4917]: I0318 08:24:38.787405 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4jrlf" event={"ID":"305453bf-9adc-46d4-b810-fcd1e8f67a77","Type":"ContainerDied","Data":"c08f7ae69d431cb796ae6e5cf42f147051f26eb79004457c6e60321f1ea36c15"} Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.266636 4917 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.272528 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.450220 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-operator-scripts\") pod \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\" (UID: \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\") " Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.450265 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trdtc\" (UniqueName: \"kubernetes.io/projected/305453bf-9adc-46d4-b810-fcd1e8f67a77-kube-api-access-trdtc\") pod \"305453bf-9adc-46d4-b810-fcd1e8f67a77\" (UID: \"305453bf-9adc-46d4-b810-fcd1e8f67a77\") " Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.450401 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfksq\" (UniqueName: \"kubernetes.io/projected/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-kube-api-access-lfksq\") pod \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\" (UID: \"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f\") " Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.450613 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/305453bf-9adc-46d4-b810-fcd1e8f67a77-operator-scripts\") pod \"305453bf-9adc-46d4-b810-fcd1e8f67a77\" (UID: \"305453bf-9adc-46d4-b810-fcd1e8f67a77\") " Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.451712 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f" (UID: "f16e6bc3-1886-4de2-83ca-aae1c7f3d44f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.452001 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/305453bf-9adc-46d4-b810-fcd1e8f67a77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "305453bf-9adc-46d4-b810-fcd1e8f67a77" (UID: "305453bf-9adc-46d4-b810-fcd1e8f67a77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.459750 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-kube-api-access-lfksq" (OuterVolumeSpecName: "kube-api-access-lfksq") pod "f16e6bc3-1886-4de2-83ca-aae1c7f3d44f" (UID: "f16e6bc3-1886-4de2-83ca-aae1c7f3d44f"). InnerVolumeSpecName "kube-api-access-lfksq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.469865 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305453bf-9adc-46d4-b810-fcd1e8f67a77-kube-api-access-trdtc" (OuterVolumeSpecName: "kube-api-access-trdtc") pod "305453bf-9adc-46d4-b810-fcd1e8f67a77" (UID: "305453bf-9adc-46d4-b810-fcd1e8f67a77"). InnerVolumeSpecName "kube-api-access-trdtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.552822 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/305453bf-9adc-46d4-b810-fcd1e8f67a77-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.552856 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.552869 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trdtc\" (UniqueName: \"kubernetes.io/projected/305453bf-9adc-46d4-b810-fcd1e8f67a77-kube-api-access-trdtc\") on node \"crc\" DevicePath \"\"" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.552883 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfksq\" (UniqueName: \"kubernetes.io/projected/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f-kube-api-access-lfksq\") on node \"crc\" DevicePath \"\"" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.833517 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-4jrlf" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.834006 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-4jrlf" event={"ID":"305453bf-9adc-46d4-b810-fcd1e8f67a77","Type":"ContainerDied","Data":"30a97be4982553752773c801a3999a912bbd91f738fae09bd792894cb601e205"} Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.834035 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30a97be4982553752773c801a3999a912bbd91f738fae09bd792894cb601e205" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.842706 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-9c12-account-create-update-2dh7r" event={"ID":"f16e6bc3-1886-4de2-83ca-aae1c7f3d44f","Type":"ContainerDied","Data":"67d2707434c637894e9dcd2fa58bb515748728a02b4494edeb6b2cb7f49ac45a"} Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.842761 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d2707434c637894e9dcd2fa58bb515748728a02b4494edeb6b2cb7f49ac45a" Mar 18 08:24:40 crc kubenswrapper[4917]: I0318 08:24:40.842821 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-9c12-account-create-update-2dh7r" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.857500 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-n46nm"] Mar 18 08:24:41 crc kubenswrapper[4917]: E0318 08:24:41.857996 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305453bf-9adc-46d4-b810-fcd1e8f67a77" containerName="mariadb-database-create" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.858012 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="305453bf-9adc-46d4-b810-fcd1e8f67a77" containerName="mariadb-database-create" Mar 18 08:24:41 crc kubenswrapper[4917]: E0318 08:24:41.858029 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f16e6bc3-1886-4de2-83ca-aae1c7f3d44f" containerName="mariadb-account-create-update" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.858037 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f16e6bc3-1886-4de2-83ca-aae1c7f3d44f" containerName="mariadb-account-create-update" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.858262 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f16e6bc3-1886-4de2-83ca-aae1c7f3d44f" containerName="mariadb-account-create-update" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.858287 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="305453bf-9adc-46d4-b810-fcd1e8f67a77" containerName="mariadb-database-create" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.859350 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.868138 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.868344 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-pt8qq" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.902131 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-n46nm"] Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.991519 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-config-data\") pod \"heat-db-sync-n46nm\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.991693 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-combined-ca-bundle\") pod \"heat-db-sync-n46nm\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:41 crc kubenswrapper[4917]: I0318 08:24:41.991910 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45784\" (UniqueName: \"kubernetes.io/projected/cb781c6a-6adb-484c-aba7-f4f894e8f812-kube-api-access-45784\") pod \"heat-db-sync-n46nm\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.094007 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-config-data\") pod \"heat-db-sync-n46nm\" (UID: 
\"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.094058 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-combined-ca-bundle\") pod \"heat-db-sync-n46nm\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.094147 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45784\" (UniqueName: \"kubernetes.io/projected/cb781c6a-6adb-484c-aba7-f4f894e8f812-kube-api-access-45784\") pod \"heat-db-sync-n46nm\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.102379 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-config-data\") pod \"heat-db-sync-n46nm\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.103313 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-combined-ca-bundle\") pod \"heat-db-sync-n46nm\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.113008 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45784\" (UniqueName: \"kubernetes.io/projected/cb781c6a-6adb-484c-aba7-f4f894e8f812-kube-api-access-45784\") pod \"heat-db-sync-n46nm\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.185861 4917 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.720352 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-n46nm"] Mar 18 08:24:42 crc kubenswrapper[4917]: W0318 08:24:42.738262 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb781c6a_6adb_484c_aba7_f4f894e8f812.slice/crio-d883f4182199d70be880253c31bad5a2e91f0e00ec9ab2004a6821c60dd18438 WatchSource:0}: Error finding container d883f4182199d70be880253c31bad5a2e91f0e00ec9ab2004a6821c60dd18438: Status 404 returned error can't find the container with id d883f4182199d70be880253c31bad5a2e91f0e00ec9ab2004a6821c60dd18438 Mar 18 08:24:42 crc kubenswrapper[4917]: I0318 08:24:42.875830 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-n46nm" event={"ID":"cb781c6a-6adb-484c-aba7-f4f894e8f812","Type":"ContainerStarted","Data":"d883f4182199d70be880253c31bad5a2e91f0e00ec9ab2004a6821c60dd18438"} Mar 18 08:24:45 crc kubenswrapper[4917]: I0318 08:24:45.720226 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:45 crc kubenswrapper[4917]: I0318 08:24:45.720649 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5995799978-72tkb" Mar 18 08:24:50 crc kubenswrapper[4917]: I0318 08:24:50.985393 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-n46nm" event={"ID":"cb781c6a-6adb-484c-aba7-f4f894e8f812","Type":"ContainerStarted","Data":"f135ab1cfb701faad5d5609ab0e2fe07366d31cc88055ffbfbbfefcb09efa9d2"} Mar 18 08:24:53 crc kubenswrapper[4917]: I0318 08:24:53.008769 4917 generic.go:334] "Generic (PLEG): container finished" podID="cb781c6a-6adb-484c-aba7-f4f894e8f812" 
containerID="f135ab1cfb701faad5d5609ab0e2fe07366d31cc88055ffbfbbfefcb09efa9d2" exitCode=0 Mar 18 08:24:53 crc kubenswrapper[4917]: I0318 08:24:53.008833 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-n46nm" event={"ID":"cb781c6a-6adb-484c-aba7-f4f894e8f812","Type":"ContainerDied","Data":"f135ab1cfb701faad5d5609ab0e2fe07366d31cc88055ffbfbbfefcb09efa9d2"} Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.441464 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.483049 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-combined-ca-bundle\") pod \"cb781c6a-6adb-484c-aba7-f4f894e8f812\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.483105 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-config-data\") pod \"cb781c6a-6adb-484c-aba7-f4f894e8f812\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.483311 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45784\" (UniqueName: \"kubernetes.io/projected/cb781c6a-6adb-484c-aba7-f4f894e8f812-kube-api-access-45784\") pod \"cb781c6a-6adb-484c-aba7-f4f894e8f812\" (UID: \"cb781c6a-6adb-484c-aba7-f4f894e8f812\") " Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.495239 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb781c6a-6adb-484c-aba7-f4f894e8f812-kube-api-access-45784" (OuterVolumeSpecName: "kube-api-access-45784") pod "cb781c6a-6adb-484c-aba7-f4f894e8f812" (UID: 
"cb781c6a-6adb-484c-aba7-f4f894e8f812"). InnerVolumeSpecName "kube-api-access-45784". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.512774 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb781c6a-6adb-484c-aba7-f4f894e8f812" (UID: "cb781c6a-6adb-484c-aba7-f4f894e8f812"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.579357 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-config-data" (OuterVolumeSpecName: "config-data") pod "cb781c6a-6adb-484c-aba7-f4f894e8f812" (UID: "cb781c6a-6adb-484c-aba7-f4f894e8f812"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.585620 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.585647 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb781c6a-6adb-484c-aba7-f4f894e8f812-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:24:54 crc kubenswrapper[4917]: I0318 08:24:54.585660 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45784\" (UniqueName: \"kubernetes.io/projected/cb781c6a-6adb-484c-aba7-f4f894e8f812-kube-api-access-45784\") on node \"crc\" DevicePath \"\"" Mar 18 08:24:55 crc kubenswrapper[4917]: I0318 08:24:55.028091 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-n46nm" 
event={"ID":"cb781c6a-6adb-484c-aba7-f4f894e8f812","Type":"ContainerDied","Data":"d883f4182199d70be880253c31bad5a2e91f0e00ec9ab2004a6821c60dd18438"} Mar 18 08:24:55 crc kubenswrapper[4917]: I0318 08:24:55.028126 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d883f4182199d70be880253c31bad5a2e91f0e00ec9ab2004a6821c60dd18438" Mar 18 08:24:55 crc kubenswrapper[4917]: I0318 08:24:55.028418 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-n46nm" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.095640 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hzxqs"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.112166 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hzxqs"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.136407 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-a039-account-create-update-9hhlb"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.162401 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-a039-account-create-update-9hhlb"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.389849 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7fc579c746-72ssk"] Mar 18 08:24:56 crc kubenswrapper[4917]: E0318 08:24:56.390236 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb781c6a-6adb-484c-aba7-f4f894e8f812" containerName="heat-db-sync" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.390253 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb781c6a-6adb-484c-aba7-f4f894e8f812" containerName="heat-db-sync" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.390405 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb781c6a-6adb-484c-aba7-f4f894e8f812" containerName="heat-db-sync" Mar 18 08:24:56 crc 
kubenswrapper[4917]: I0318 08:24:56.390988 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.396137 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-pt8qq" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.396313 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.396429 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.418096 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7fc579c746-72ssk"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.421753 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4zj8\" (UniqueName: \"kubernetes.io/projected/82277826-07e2-4eda-8759-7817347cd653-kube-api-access-x4zj8\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.421792 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-combined-ca-bundle\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.421858 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data-custom\") pod \"heat-engine-7fc579c746-72ssk\" (UID: 
\"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.421882 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.523732 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.523910 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-combined-ca-bundle\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.523937 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4zj8\" (UniqueName: \"kubernetes.io/projected/82277826-07e2-4eda-8759-7817347cd653-kube-api-access-x4zj8\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.524010 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data-custom\") pod \"heat-engine-7fc579c746-72ssk\" (UID: 
\"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.530385 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.531377 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-combined-ca-bundle\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.551653 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4zj8\" (UniqueName: \"kubernetes.io/projected/82277826-07e2-4eda-8759-7817347cd653-kube-api-access-x4zj8\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.552208 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data-custom\") pod \"heat-engine-7fc579c746-72ssk\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.601873 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5fc6597ddc-c2hq4"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.603196 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.671251 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.682201 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5fc6597ddc-c2hq4"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.721205 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-75948f849f-v4g2p"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.724404 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.725926 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.726267 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.731154 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-75948f849f-v4g2p"] Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.774733 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data-custom\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.774950 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-combined-ca-bundle\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: 
\"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.775099 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf2ds\" (UniqueName: \"kubernetes.io/projected/3169ca11-feb9-415a-a681-3f68c11b45f8-kube-api-access-sf2ds\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.775185 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.877315 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.877383 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnxtk\" (UniqueName: \"kubernetes.io/projected/6e6d22de-4dec-4ff9-9ebe-47436310eea8-kube-api-access-mnxtk\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.877454 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-combined-ca-bundle\") pod 
\"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.877492 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data-custom\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.877525 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.877550 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data-custom\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.877598 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-combined-ca-bundle\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.877792 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf2ds\" (UniqueName: \"kubernetes.io/projected/3169ca11-feb9-415a-a681-3f68c11b45f8-kube-api-access-sf2ds\") pod 
\"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.907891 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data-custom\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.908903 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-combined-ca-bundle\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.913083 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf2ds\" (UniqueName: \"kubernetes.io/projected/3169ca11-feb9-415a-a681-3f68c11b45f8-kube-api-access-sf2ds\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.928902 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data\") pod \"heat-cfnapi-5fc6597ddc-c2hq4\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.980018 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnxtk\" (UniqueName: \"kubernetes.io/projected/6e6d22de-4dec-4ff9-9ebe-47436310eea8-kube-api-access-mnxtk\") pod \"heat-api-75948f849f-v4g2p\" (UID: 
\"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.980087 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-combined-ca-bundle\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.980137 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data-custom\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.980161 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.984184 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-combined-ca-bundle\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.984375 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 
crc kubenswrapper[4917]: I0318 08:24:56.985361 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data-custom\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:56 crc kubenswrapper[4917]: I0318 08:24:56.996515 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:24:57 crc kubenswrapper[4917]: I0318 08:24:57.011372 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnxtk\" (UniqueName: \"kubernetes.io/projected/6e6d22de-4dec-4ff9-9ebe-47436310eea8-kube-api-access-mnxtk\") pod \"heat-api-75948f849f-v4g2p\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:57 crc kubenswrapper[4917]: I0318 08:24:57.048207 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:24:57 crc kubenswrapper[4917]: I0318 08:24:57.289175 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7fc579c746-72ssk"] Mar 18 08:24:57 crc kubenswrapper[4917]: I0318 08:24:57.320692 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5fc6597ddc-c2hq4"] Mar 18 08:24:57 crc kubenswrapper[4917]: I0318 08:24:57.626709 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-75948f849f-v4g2p"] Mar 18 08:24:57 crc kubenswrapper[4917]: I0318 08:24:57.783136 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d77b9f-d04c-485f-ab9d-465b393ba56f" path="/var/lib/kubelet/pods/53d77b9f-d04c-485f-ab9d-465b393ba56f/volumes" Mar 18 08:24:57 crc kubenswrapper[4917]: I0318 08:24:57.783872 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8602779-52ca-408c-974d-bd5723b5cb8f" path="/var/lib/kubelet/pods/c8602779-52ca-408c-974d-bd5723b5cb8f/volumes" Mar 18 08:24:58 crc kubenswrapper[4917]: I0318 08:24:58.062721 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" event={"ID":"3169ca11-feb9-415a-a681-3f68c11b45f8","Type":"ContainerStarted","Data":"fbc0ddfd7def800d131593fdfc73b5c38e539131d2bf34ec858abcce5dc3df13"} Mar 18 08:24:58 crc kubenswrapper[4917]: I0318 08:24:58.064190 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-75948f849f-v4g2p" event={"ID":"6e6d22de-4dec-4ff9-9ebe-47436310eea8","Type":"ContainerStarted","Data":"5530e2d908a0d3bd8eb74cf031b394dcaf43b84e8d59d5bad6e4f69c10dffcda"} Mar 18 08:24:58 crc kubenswrapper[4917]: I0318 08:24:58.066045 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7fc579c746-72ssk" event={"ID":"82277826-07e2-4eda-8759-7817347cd653","Type":"ContainerStarted","Data":"3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041"} 
Mar 18 08:24:58 crc kubenswrapper[4917]: I0318 08:24:58.066065 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7fc579c746-72ssk" event={"ID":"82277826-07e2-4eda-8759-7817347cd653","Type":"ContainerStarted","Data":"77c106ffec425d676e747c691a450646876a72bb0eafdd8f4ddedd790e4306c3"} Mar 18 08:24:58 crc kubenswrapper[4917]: I0318 08:24:58.066263 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:24:58 crc kubenswrapper[4917]: I0318 08:24:58.102999 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7fc579c746-72ssk" podStartSLOduration=2.102973115 podStartE2EDuration="2.102973115s" podCreationTimestamp="2026-03-18 08:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:24:58.091690131 +0000 UTC m=+5883.032844855" watchObservedRunningTime="2026-03-18 08:24:58.102973115 +0000 UTC m=+5883.044127829" Mar 18 08:24:58 crc kubenswrapper[4917]: I0318 08:24:58.258768 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5995799978-72tkb" Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.091743 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" event={"ID":"3169ca11-feb9-415a-a681-3f68c11b45f8","Type":"ContainerStarted","Data":"551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047"} Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.092279 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.094388 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-75948f849f-v4g2p" 
event={"ID":"6e6d22de-4dec-4ff9-9ebe-47436310eea8","Type":"ContainerStarted","Data":"86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276"} Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.094528 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.116279 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" podStartSLOduration=2.56868873 podStartE2EDuration="4.116264862s" podCreationTimestamp="2026-03-18 08:24:56 +0000 UTC" firstStartedPulling="2026-03-18 08:24:57.354654357 +0000 UTC m=+5882.295809071" lastFinishedPulling="2026-03-18 08:24:58.902230489 +0000 UTC m=+5883.843385203" observedRunningTime="2026-03-18 08:25:00.108718549 +0000 UTC m=+5885.049873263" watchObservedRunningTime="2026-03-18 08:25:00.116264862 +0000 UTC m=+5885.057419576" Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.130754 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-75948f849f-v4g2p" podStartSLOduration=2.864041793 podStartE2EDuration="4.130734684s" podCreationTimestamp="2026-03-18 08:24:56 +0000 UTC" firstStartedPulling="2026-03-18 08:24:57.640684373 +0000 UTC m=+5882.581839087" lastFinishedPulling="2026-03-18 08:24:58.907377264 +0000 UTC m=+5883.848531978" observedRunningTime="2026-03-18 08:25:00.12643669 +0000 UTC m=+5885.067591404" watchObservedRunningTime="2026-03-18 08:25:00.130734684 +0000 UTC m=+5885.071889398" Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.170037 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5995799978-72tkb" Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.237798 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58cff7766d-4ggqr"] Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.238051 4917 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58cff7766d-4ggqr" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon-log" containerID="cri-o://cd0cf440100899f05139ecacf79add51e9cea5ceed002787e7ade1b6767f5dc5" gracePeriod=30 Mar 18 08:25:00 crc kubenswrapper[4917]: I0318 08:25:00.238132 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58cff7766d-4ggqr" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon" containerID="cri-o://84fa1262b86e5239f7efe9e5ab2a4f8dbc3014619d0e46f0c362e5529aa4647b" gracePeriod=30 Mar 18 08:25:02 crc kubenswrapper[4917]: I0318 08:25:02.928807 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:25:02 crc kubenswrapper[4917]: I0318 08:25:02.929303 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:25:02 crc kubenswrapper[4917]: I0318 08:25:02.929347 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:25:02 crc kubenswrapper[4917]: I0318 08:25:02.930130 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d945d8f3fb288f4abf13f7d56da1feb3070f1b5588474b04c510533002bdbef"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Mar 18 08:25:02 crc kubenswrapper[4917]: I0318 08:25:02.930184 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://2d945d8f3fb288f4abf13f7d56da1feb3070f1b5588474b04c510533002bdbef" gracePeriod=600 Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.137634 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="2d945d8f3fb288f4abf13f7d56da1feb3070f1b5588474b04c510533002bdbef" exitCode=0 Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.137724 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"2d945d8f3fb288f4abf13f7d56da1feb3070f1b5588474b04c510533002bdbef"} Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.137969 4917 scope.go:117] "RemoveContainer" containerID="4daa2208c56db2f150694cc1f839c6ee3d138697e3133d22ebe3ceb19064fea1" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.379335 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-58cff7766d-4ggqr" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.137:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:50118->10.217.1.137:8443: read: connection reset by peer" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.655284 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-9f4b4874b-nfbm6"] Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.657441 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.676326 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-548bbcdfbc-7b95h"] Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.677622 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.686604 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-9f4b4874b-nfbm6"] Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.712485 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-548bbcdfbc-7b95h"] Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.726948 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-config-data\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.726999 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-config-data-custom\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.727040 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r96q\" (UniqueName: \"kubernetes.io/projected/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-kube-api-access-7r96q\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc 
kubenswrapper[4917]: I0318 08:25:03.727084 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-combined-ca-bundle\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.727546 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-8b965695d-855gp"] Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.729053 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.745872 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-8b965695d-855gp"] Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.828570 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data-custom\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.828716 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-config-data\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.828751 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-config-data-custom\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: 
\"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.828790 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r96q\" (UniqueName: \"kubernetes.io/projected/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-kube-api-access-7r96q\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.828829 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-combined-ca-bundle\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.828857 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-combined-ca-bundle\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.828895 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.829011 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data-custom\") pod 
\"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.829061 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv2x5\" (UniqueName: \"kubernetes.io/projected/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-kube-api-access-zv2x5\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.829095 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-combined-ca-bundle\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.829120 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxnhh\" (UniqueName: \"kubernetes.io/projected/7f49709d-261d-43ee-9679-db08b0fe33dc-kube-api-access-gxnhh\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.829156 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.834875 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-combined-ca-bundle\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.835804 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-config-data-custom\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.845201 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-config-data\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.849250 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r96q\" (UniqueName: \"kubernetes.io/projected/7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c-kube-api-access-7r96q\") pod \"heat-engine-9f4b4874b-nfbm6\" (UID: \"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c\") " pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.930516 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data-custom\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.930580 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv2x5\" (UniqueName: 
\"kubernetes.io/projected/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-kube-api-access-zv2x5\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.930623 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-combined-ca-bundle\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.930643 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxnhh\" (UniqueName: \"kubernetes.io/projected/7f49709d-261d-43ee-9679-db08b0fe33dc-kube-api-access-gxnhh\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.930675 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.930784 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data-custom\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.930912 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-combined-ca-bundle\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.930939 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.937276 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data-custom\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.938401 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-combined-ca-bundle\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.938876 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.941435 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data-custom\") pod 
\"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.955183 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-combined-ca-bundle\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.956112 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.957399 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv2x5\" (UniqueName: \"kubernetes.io/projected/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-kube-api-access-zv2x5\") pod \"heat-cfnapi-8b965695d-855gp\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.960358 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxnhh\" (UniqueName: \"kubernetes.io/projected/7f49709d-261d-43ee-9679-db08b0fe33dc-kube-api-access-gxnhh\") pod \"heat-api-548bbcdfbc-7b95h\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.976607 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:03 crc kubenswrapper[4917]: I0318 08:25:03.992335 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.051517 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.188523 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b"} Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.193332 4917 generic.go:334] "Generic (PLEG): container finished" podID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerID="84fa1262b86e5239f7efe9e5ab2a4f8dbc3014619d0e46f0c362e5529aa4647b" exitCode=0 Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.193359 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cff7766d-4ggqr" event={"ID":"6d857256-d8fa-4b44-b66a-5238f6e7ec1d","Type":"ContainerDied","Data":"84fa1262b86e5239f7efe9e5ab2a4f8dbc3014619d0e46f0c362e5529aa4647b"} Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.467277 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-8b965695d-855gp"] Mar 18 08:25:04 crc kubenswrapper[4917]: W0318 08:25:04.471574 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb6fd2f_bd2c_4ca4_b189_1678c6a4e97b.slice/crio-5461ae54e3f25f7b24c36d010eb495939a15a20b97d99ab0891297bd0f7550b1 WatchSource:0}: Error finding container 5461ae54e3f25f7b24c36d010eb495939a15a20b97d99ab0891297bd0f7550b1: Status 404 returned error can't find the container with id 5461ae54e3f25f7b24c36d010eb495939a15a20b97d99ab0891297bd0f7550b1 Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.505705 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-engine-9f4b4874b-nfbm6"] Mar 18 08:25:04 crc kubenswrapper[4917]: W0318 08:25:04.517991 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7efdb9e3_6dd2_4f00_a37b_c85fa4b2de8c.slice/crio-d9e06610a7a968497a448e7b45d5fb2d5127ab335d3587af3a63713fef0a5747 WatchSource:0}: Error finding container d9e06610a7a968497a448e7b45d5fb2d5127ab335d3587af3a63713fef0a5747: Status 404 returned error can't find the container with id d9e06610a7a968497a448e7b45d5fb2d5127ab335d3587af3a63713fef0a5747 Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.593755 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-548bbcdfbc-7b95h"] Mar 18 08:25:04 crc kubenswrapper[4917]: W0318 08:25:04.595999 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f49709d_261d_43ee_9679_db08b0fe33dc.slice/crio-13ecbc426fbb8da6ef6fec77c00b557d312b0e8c289a011586acc7ba80ece60b WatchSource:0}: Error finding container 13ecbc426fbb8da6ef6fec77c00b557d312b0e8c289a011586acc7ba80ece60b: Status 404 returned error can't find the container with id 13ecbc426fbb8da6ef6fec77c00b557d312b0e8c289a011586acc7ba80ece60b Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.862787 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-75948f849f-v4g2p"] Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.863417 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-75948f849f-v4g2p" podUID="6e6d22de-4dec-4ff9-9ebe-47436310eea8" containerName="heat-api" containerID="cri-o://86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276" gracePeriod=60 Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.873577 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5fc6597ddc-c2hq4"] Mar 18 08:25:04 crc 
kubenswrapper[4917]: I0318 08:25:04.874333 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" podUID="3169ca11-feb9-415a-a681-3f68c11b45f8" containerName="heat-cfnapi" containerID="cri-o://551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047" gracePeriod=60 Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.880986 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-75948f849f-v4g2p" podUID="6e6d22de-4dec-4ff9-9ebe-47436310eea8" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.1.147:8004/healthcheck\": EOF" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.899484 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-9f889575f-j9mf4"] Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.901891 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.908325 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.908567 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.950718 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" podUID="3169ca11-feb9-415a-a681-3f68c11b45f8" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.146:8000/healthcheck\": EOF" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.954664 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9f889575f-j9mf4"] Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.991195 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-7cdb7dbfb5-59str"] Mar 18 08:25:04 crc 
kubenswrapper[4917]: I0318 08:25:04.992748 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.998937 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 18 08:25:04 crc kubenswrapper[4917]: I0318 08:25:04.999132 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.033116 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7cdb7dbfb5-59str"] Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059109 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-internal-tls-certs\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059165 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg8fg\" (UniqueName: \"kubernetes.io/projected/c6bad2ad-7a90-4798-863d-3b81e21103bc-kube-api-access-pg8fg\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059229 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-config-data-custom\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059255 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-config-data\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059290 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-public-tls-certs\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059329 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2q82\" (UniqueName: \"kubernetes.io/projected/4b41925b-da5b-4ad8-b314-56b2fb282f95-kube-api-access-n2q82\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059376 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-public-tls-certs\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059425 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-internal-tls-certs\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059485 
4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-config-data-custom\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059506 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-config-data\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059543 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-combined-ca-bundle\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.059605 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-combined-ca-bundle\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.098857 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dxm9h"] Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.108926 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dxm9h"] Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.161189 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-public-tls-certs\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.161580 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-internal-tls-certs\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.161683 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-config-data-custom\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.161712 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-config-data\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.161890 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-combined-ca-bundle\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.162397 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-combined-ca-bundle\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.162464 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-internal-tls-certs\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.162843 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg8fg\" (UniqueName: \"kubernetes.io/projected/c6bad2ad-7a90-4798-863d-3b81e21103bc-kube-api-access-pg8fg\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.162924 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-config-data-custom\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.162953 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-config-data\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.162993 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-public-tls-certs\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.163037 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2q82\" (UniqueName: \"kubernetes.io/projected/4b41925b-da5b-4ad8-b314-56b2fb282f95-kube-api-access-n2q82\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.171989 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-combined-ca-bundle\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.172255 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-internal-tls-certs\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.172809 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-config-data\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.173563 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-config-data-custom\") pod 
\"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.174294 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-config-data\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.174426 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-public-tls-certs\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.175387 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-combined-ca-bundle\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.180224 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c6bad2ad-7a90-4798-863d-3b81e21103bc-config-data-custom\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.180317 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-internal-tls-certs\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " 
pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.180729 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b41925b-da5b-4ad8-b314-56b2fb282f95-public-tls-certs\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.186114 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2q82\" (UniqueName: \"kubernetes.io/projected/4b41925b-da5b-4ad8-b314-56b2fb282f95-kube-api-access-n2q82\") pod \"heat-api-9f889575f-j9mf4\" (UID: \"4b41925b-da5b-4ad8-b314-56b2fb282f95\") " pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.190126 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg8fg\" (UniqueName: \"kubernetes.io/projected/c6bad2ad-7a90-4798-863d-3b81e21103bc-kube-api-access-pg8fg\") pod \"heat-cfnapi-7cdb7dbfb5-59str\" (UID: \"c6bad2ad-7a90-4798-863d-3b81e21103bc\") " pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.218497 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-9f4b4874b-nfbm6" event={"ID":"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c","Type":"ContainerStarted","Data":"46d85d66d13df3ea92ce17dd0178ab659697511c35cdee07614884a4becdc3de"} Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.218543 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-9f4b4874b-nfbm6" event={"ID":"7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c","Type":"ContainerStarted","Data":"d9e06610a7a968497a448e7b45d5fb2d5127ab335d3587af3a63713fef0a5747"} Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.219707 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.221873 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-548bbcdfbc-7b95h" event={"ID":"7f49709d-261d-43ee-9679-db08b0fe33dc","Type":"ContainerStarted","Data":"bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f"} Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.221911 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-548bbcdfbc-7b95h" event={"ID":"7f49709d-261d-43ee-9679-db08b0fe33dc","Type":"ContainerStarted","Data":"13ecbc426fbb8da6ef6fec77c00b557d312b0e8c289a011586acc7ba80ece60b"} Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.222502 4917 scope.go:117] "RemoveContainer" containerID="bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.225751 4917 generic.go:334] "Generic (PLEG): container finished" podID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" containerID="4b53c6600d4bf484606f7fe715156dda112fb2c92bd6a92f87c670eac1ae7ca1" exitCode=1 Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.225844 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8b965695d-855gp" event={"ID":"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b","Type":"ContainerDied","Data":"4b53c6600d4bf484606f7fe715156dda112fb2c92bd6a92f87c670eac1ae7ca1"} Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.225876 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8b965695d-855gp" event={"ID":"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b","Type":"ContainerStarted","Data":"5461ae54e3f25f7b24c36d010eb495939a15a20b97d99ab0891297bd0f7550b1"} Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.226551 4917 scope.go:117] "RemoveContainer" containerID="4b53c6600d4bf484606f7fe715156dda112fb2c92bd6a92f87c670eac1ae7ca1" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.240934 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-9f4b4874b-nfbm6" podStartSLOduration=2.2409185369999998 podStartE2EDuration="2.240918537s" podCreationTimestamp="2026-03-18 08:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:25:05.235179708 +0000 UTC m=+5890.176334442" watchObservedRunningTime="2026-03-18 08:25:05.240918537 +0000 UTC m=+5890.182073251" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.253561 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.314668 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.800799 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34878b32-0d4d-4d01-8898-90e7393b3b49" path="/var/lib/kubelet/pods/34878b32-0d4d-4d01-8898-90e7393b3b49/volumes" Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.868293 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7cdb7dbfb5-59str"] Mar 18 08:25:05 crc kubenswrapper[4917]: I0318 08:25:05.934465 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-9f889575f-j9mf4"] Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.236500 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9f889575f-j9mf4" event={"ID":"4b41925b-da5b-4ad8-b314-56b2fb282f95","Type":"ContainerStarted","Data":"5a12455c24150c15e7eace6c829b2abb8f95e71d19a35d3d8567cce2949c859e"} Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.242362 4917 generic.go:334] "Generic (PLEG): container finished" podID="7f49709d-261d-43ee-9679-db08b0fe33dc" 
containerID="bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f" exitCode=1 Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.242390 4917 generic.go:334] "Generic (PLEG): container finished" podID="7f49709d-261d-43ee-9679-db08b0fe33dc" containerID="b0b23136cc7bcb0b87e1d79469aec0b99b42966c9d1728c295077353e2f0aa9b" exitCode=1 Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.242431 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-548bbcdfbc-7b95h" event={"ID":"7f49709d-261d-43ee-9679-db08b0fe33dc","Type":"ContainerDied","Data":"bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f"} Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.242453 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-548bbcdfbc-7b95h" event={"ID":"7f49709d-261d-43ee-9679-db08b0fe33dc","Type":"ContainerDied","Data":"b0b23136cc7bcb0b87e1d79469aec0b99b42966c9d1728c295077353e2f0aa9b"} Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.242472 4917 scope.go:117] "RemoveContainer" containerID="bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f" Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.242968 4917 scope.go:117] "RemoveContainer" containerID="b0b23136cc7bcb0b87e1d79469aec0b99b42966c9d1728c295077353e2f0aa9b" Mar 18 08:25:06 crc kubenswrapper[4917]: E0318 08:25:06.243360 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-548bbcdfbc-7b95h_openstack(7f49709d-261d-43ee-9679-db08b0fe33dc)\"" pod="openstack/heat-api-548bbcdfbc-7b95h" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.244998 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" 
event={"ID":"c6bad2ad-7a90-4798-863d-3b81e21103bc","Type":"ContainerStarted","Data":"a66d48f601ca6d222cf39902f0d0c163a074674bd5b91b7e47cb68092d6d700e"} Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.250614 4917 generic.go:334] "Generic (PLEG): container finished" podID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" containerID="46dfbd5f7b8c1aab294e95b97c4bac7e00a66dd5831e23fe4bd7e3bf612464e4" exitCode=1 Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.250785 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8b965695d-855gp" event={"ID":"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b","Type":"ContainerDied","Data":"46dfbd5f7b8c1aab294e95b97c4bac7e00a66dd5831e23fe4bd7e3bf612464e4"} Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.251562 4917 scope.go:117] "RemoveContainer" containerID="46dfbd5f7b8c1aab294e95b97c4bac7e00a66dd5831e23fe4bd7e3bf612464e4" Mar 18 08:25:06 crc kubenswrapper[4917]: E0318 08:25:06.251886 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-8b965695d-855gp_openstack(efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b)\"" pod="openstack/heat-cfnapi-8b965695d-855gp" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.332111 4917 scope.go:117] "RemoveContainer" containerID="bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f" Mar 18 08:25:06 crc kubenswrapper[4917]: E0318 08:25:06.333158 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f\": container with ID starting with bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f not found: ID does not exist" containerID="bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f" Mar 18 08:25:06 crc 
kubenswrapper[4917]: I0318 08:25:06.333192 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f"} err="failed to get container status \"bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f\": rpc error: code = NotFound desc = could not find container \"bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f\": container with ID starting with bbb095bfd3fbc45c32ba988484ae7b9c0b3269fcfd763f43e23fc28d26e2a56f not found: ID does not exist" Mar 18 08:25:06 crc kubenswrapper[4917]: I0318 08:25:06.333214 4917 scope.go:117] "RemoveContainer" containerID="4b53c6600d4bf484606f7fe715156dda112fb2c92bd6a92f87c670eac1ae7ca1" Mar 18 08:25:07 crc kubenswrapper[4917]: I0318 08:25:07.264822 4917 scope.go:117] "RemoveContainer" containerID="46dfbd5f7b8c1aab294e95b97c4bac7e00a66dd5831e23fe4bd7e3bf612464e4" Mar 18 08:25:07 crc kubenswrapper[4917]: E0318 08:25:07.265406 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-8b965695d-855gp_openstack(efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b)\"" pod="openstack/heat-cfnapi-8b965695d-855gp" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" Mar 18 08:25:07 crc kubenswrapper[4917]: I0318 08:25:07.272180 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-9f889575f-j9mf4" event={"ID":"4b41925b-da5b-4ad8-b314-56b2fb282f95","Type":"ContainerStarted","Data":"5da105886c8e88c32ada256b035ad89d2e4c096a10899ade4a4737fd352934b0"} Mar 18 08:25:07 crc kubenswrapper[4917]: I0318 08:25:07.272396 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:07 crc kubenswrapper[4917]: I0318 08:25:07.275407 4917 scope.go:117] "RemoveContainer" 
containerID="b0b23136cc7bcb0b87e1d79469aec0b99b42966c9d1728c295077353e2f0aa9b" Mar 18 08:25:07 crc kubenswrapper[4917]: E0318 08:25:07.275872 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-548bbcdfbc-7b95h_openstack(7f49709d-261d-43ee-9679-db08b0fe33dc)\"" pod="openstack/heat-api-548bbcdfbc-7b95h" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" Mar 18 08:25:07 crc kubenswrapper[4917]: I0318 08:25:07.276577 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" event={"ID":"c6bad2ad-7a90-4798-863d-3b81e21103bc","Type":"ContainerStarted","Data":"8b887bfe8be68ecb86377cdacf687e14ce576ea2de88d552d293c58c014dcd54"} Mar 18 08:25:07 crc kubenswrapper[4917]: I0318 08:25:07.276848 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:07 crc kubenswrapper[4917]: I0318 08:25:07.323265 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" podStartSLOduration=3.323234159 podStartE2EDuration="3.323234159s" podCreationTimestamp="2026-03-18 08:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:25:07.313684347 +0000 UTC m=+5892.254839061" watchObservedRunningTime="2026-03-18 08:25:07.323234159 +0000 UTC m=+5892.264388883" Mar 18 08:25:07 crc kubenswrapper[4917]: I0318 08:25:07.372734 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-9f889575f-j9mf4" podStartSLOduration=3.372690098 podStartE2EDuration="3.372690098s" podCreationTimestamp="2026-03-18 08:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 
08:25:07.357700474 +0000 UTC m=+5892.298855198" watchObservedRunningTime="2026-03-18 08:25:07.372690098 +0000 UTC m=+5892.313844822" Mar 18 08:25:08 crc kubenswrapper[4917]: I0318 08:25:08.993689 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:08 crc kubenswrapper[4917]: I0318 08:25:08.994957 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:08 crc kubenswrapper[4917]: I0318 08:25:08.995722 4917 scope.go:117] "RemoveContainer" containerID="b0b23136cc7bcb0b87e1d79469aec0b99b42966c9d1728c295077353e2f0aa9b" Mar 18 08:25:08 crc kubenswrapper[4917]: E0318 08:25:08.996231 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-548bbcdfbc-7b95h_openstack(7f49709d-261d-43ee-9679-db08b0fe33dc)\"" pod="openstack/heat-api-548bbcdfbc-7b95h" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.052570 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.052625 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.053262 4917 scope.go:117] "RemoveContainer" containerID="46dfbd5f7b8c1aab294e95b97c4bac7e00a66dd5831e23fe4bd7e3bf612464e4" Mar 18 08:25:09 crc kubenswrapper[4917]: E0318 08:25:09.053486 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-8b965695d-855gp_openstack(efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b)\"" pod="openstack/heat-cfnapi-8b965695d-855gp" 
podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.313489 4917 scope.go:117] "RemoveContainer" containerID="b0b23136cc7bcb0b87e1d79469aec0b99b42966c9d1728c295077353e2f0aa9b" Mar 18 08:25:09 crc kubenswrapper[4917]: E0318 08:25:09.313776 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-548bbcdfbc-7b95h_openstack(7f49709d-261d-43ee-9679-db08b0fe33dc)\"" pod="openstack/heat-api-548bbcdfbc-7b95h" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.365699 4917 scope.go:117] "RemoveContainer" containerID="fbf4a49cf2ccf29f65fc784a461789ed30313cbb669fc1751c34d5b7c02a9989" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.395654 4917 scope.go:117] "RemoveContainer" containerID="f9bec35a8a95a2f42835913615c45960b3d71fda506b83f97f9f143b30919f3e" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.439926 4917 scope.go:117] "RemoveContainer" containerID="e217587353e71ace2a478044a1fb68bc906b885892207047a0d099d4896d2582" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.476891 4917 scope.go:117] "RemoveContainer" containerID="9de26a0491cdd67b4195032535fa3b2c14410e53fc91c6fd765377abab3bc834" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.533903 4917 scope.go:117] "RemoveContainer" containerID="f085efb893f4bbac0631daa8589a6a7312a0c398f52b528f7af6b93c30d0f7c3" Mar 18 08:25:09 crc kubenswrapper[4917]: I0318 08:25:09.574628 4917 scope.go:117] "RemoveContainer" containerID="32e1c3096ccac1a4fe42d7ddce9ac61a328a0bfd323c79202551863fe40b82bd" Mar 18 08:25:10 crc kubenswrapper[4917]: I0318 08:25:10.265163 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-75948f849f-v4g2p" podUID="6e6d22de-4dec-4ff9-9ebe-47436310eea8" containerName="heat-api" probeResult="failure" output="Get 
\"http://10.217.1.147:8004/healthcheck\": read tcp 10.217.0.2:43708->10.217.1.147:8004: read: connection reset by peer" Mar 18 08:25:10 crc kubenswrapper[4917]: I0318 08:25:10.339755 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" podUID="3169ca11-feb9-415a-a681-3f68c11b45f8" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.1.146:8000/healthcheck\": read tcp 10.217.0.2:42958->10.217.1.146:8000: read: connection reset by peer" Mar 18 08:25:10 crc kubenswrapper[4917]: I0318 08:25:10.915272 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:25:10 crc kubenswrapper[4917]: I0318 08:25:10.923180 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.008368 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data\") pod \"3169ca11-feb9-415a-a681-3f68c11b45f8\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.008450 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf2ds\" (UniqueName: \"kubernetes.io/projected/3169ca11-feb9-415a-a681-3f68c11b45f8-kube-api-access-sf2ds\") pod \"3169ca11-feb9-415a-a681-3f68c11b45f8\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.008538 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data\") pod \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 
08:25:11.008554 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnxtk\" (UniqueName: \"kubernetes.io/projected/6e6d22de-4dec-4ff9-9ebe-47436310eea8-kube-api-access-mnxtk\") pod \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.008602 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-combined-ca-bundle\") pod \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.008711 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-combined-ca-bundle\") pod \"3169ca11-feb9-415a-a681-3f68c11b45f8\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.008784 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data-custom\") pod \"3169ca11-feb9-415a-a681-3f68c11b45f8\" (UID: \"3169ca11-feb9-415a-a681-3f68c11b45f8\") " Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.008814 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data-custom\") pod \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\" (UID: \"6e6d22de-4dec-4ff9-9ebe-47436310eea8\") " Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.015352 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3169ca11-feb9-415a-a681-3f68c11b45f8-kube-api-access-sf2ds" 
(OuterVolumeSpecName: "kube-api-access-sf2ds") pod "3169ca11-feb9-415a-a681-3f68c11b45f8" (UID: "3169ca11-feb9-415a-a681-3f68c11b45f8"). InnerVolumeSpecName "kube-api-access-sf2ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.015518 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3169ca11-feb9-415a-a681-3f68c11b45f8" (UID: "3169ca11-feb9-415a-a681-3f68c11b45f8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.015976 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6e6d22de-4dec-4ff9-9ebe-47436310eea8" (UID: "6e6d22de-4dec-4ff9-9ebe-47436310eea8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.036822 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e6d22de-4dec-4ff9-9ebe-47436310eea8-kube-api-access-mnxtk" (OuterVolumeSpecName: "kube-api-access-mnxtk") pod "6e6d22de-4dec-4ff9-9ebe-47436310eea8" (UID: "6e6d22de-4dec-4ff9-9ebe-47436310eea8"). InnerVolumeSpecName "kube-api-access-mnxtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.058716 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3169ca11-feb9-415a-a681-3f68c11b45f8" (UID: "3169ca11-feb9-415a-a681-3f68c11b45f8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.064426 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e6d22de-4dec-4ff9-9ebe-47436310eea8" (UID: "6e6d22de-4dec-4ff9-9ebe-47436310eea8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.080196 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data" (OuterVolumeSpecName: "config-data") pod "3169ca11-feb9-415a-a681-3f68c11b45f8" (UID: "3169ca11-feb9-415a-a681-3f68c11b45f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.086876 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data" (OuterVolumeSpecName: "config-data") pod "6e6d22de-4dec-4ff9-9ebe-47436310eea8" (UID: "6e6d22de-4dec-4ff9-9ebe-47436310eea8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.111147 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.111178 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnxtk\" (UniqueName: \"kubernetes.io/projected/6e6d22de-4dec-4ff9-9ebe-47436310eea8-kube-api-access-mnxtk\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.111190 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.111200 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.111208 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.111217 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e6d22de-4dec-4ff9-9ebe-47436310eea8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.111226 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3169ca11-feb9-415a-a681-3f68c11b45f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 
08:25:11.111234 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf2ds\" (UniqueName: \"kubernetes.io/projected/3169ca11-feb9-415a-a681-3f68c11b45f8-kube-api-access-sf2ds\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.165297 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-58cff7766d-4ggqr" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.137:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.137:8443: connect: connection refused" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.346188 4917 generic.go:334] "Generic (PLEG): container finished" podID="3169ca11-feb9-415a-a681-3f68c11b45f8" containerID="551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047" exitCode=0 Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.346305 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.346341 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" event={"ID":"3169ca11-feb9-415a-a681-3f68c11b45f8","Type":"ContainerDied","Data":"551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047"} Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.347659 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5fc6597ddc-c2hq4" event={"ID":"3169ca11-feb9-415a-a681-3f68c11b45f8","Type":"ContainerDied","Data":"fbc0ddfd7def800d131593fdfc73b5c38e539131d2bf34ec858abcce5dc3df13"} Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.347708 4917 scope.go:117] "RemoveContainer" containerID="551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.350975 4917 generic.go:334] "Generic (PLEG): container finished" podID="6e6d22de-4dec-4ff9-9ebe-47436310eea8" containerID="86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276" exitCode=0 Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.351033 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-75948f849f-v4g2p" event={"ID":"6e6d22de-4dec-4ff9-9ebe-47436310eea8","Type":"ContainerDied","Data":"86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276"} Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.351062 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-75948f849f-v4g2p" event={"ID":"6e6d22de-4dec-4ff9-9ebe-47436310eea8","Type":"ContainerDied","Data":"5530e2d908a0d3bd8eb74cf031b394dcaf43b84e8d59d5bad6e4f69c10dffcda"} Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.351107 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-75948f849f-v4g2p" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.375777 4917 scope.go:117] "RemoveContainer" containerID="551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047" Mar 18 08:25:11 crc kubenswrapper[4917]: E0318 08:25:11.376176 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047\": container with ID starting with 551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047 not found: ID does not exist" containerID="551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.376230 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047"} err="failed to get container status \"551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047\": rpc error: code = NotFound desc = could not find container \"551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047\": container with ID starting with 551a80bc08fd529e7e5ca7c86f703d99b37820cef8b19f627e8832f797dec047 not found: ID does not exist" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.376261 4917 scope.go:117] "RemoveContainer" containerID="86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.408172 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5fc6597ddc-c2hq4"] Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.430373 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5fc6597ddc-c2hq4"] Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.430485 4917 scope.go:117] "RemoveContainer" containerID="86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276" Mar 18 
08:25:11 crc kubenswrapper[4917]: E0318 08:25:11.431071 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276\": container with ID starting with 86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276 not found: ID does not exist" containerID="86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.431115 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276"} err="failed to get container status \"86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276\": rpc error: code = NotFound desc = could not find container \"86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276\": container with ID starting with 86b51be78dddbf1c0d42656acfbce681af583b3663283afa3a612b000d344276 not found: ID does not exist" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.445868 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-75948f849f-v4g2p"] Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.456240 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-75948f849f-v4g2p"] Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.788716 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3169ca11-feb9-415a-a681-3f68c11b45f8" path="/var/lib/kubelet/pods/3169ca11-feb9-415a-a681-3f68c11b45f8/volumes" Mar 18 08:25:11 crc kubenswrapper[4917]: I0318 08:25:11.789421 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e6d22de-4dec-4ff9-9ebe-47436310eea8" path="/var/lib/kubelet/pods/6e6d22de-4dec-4ff9-9ebe-47436310eea8/volumes" Mar 18 08:25:16 crc kubenswrapper[4917]: I0318 08:25:16.555922 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/heat-api-9f889575f-j9mf4" Mar 18 08:25:16 crc kubenswrapper[4917]: I0318 08:25:16.618509 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-548bbcdfbc-7b95h"] Mar 18 08:25:16 crc kubenswrapper[4917]: I0318 08:25:16.756683 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7cdb7dbfb5-59str" Mar 18 08:25:16 crc kubenswrapper[4917]: I0318 08:25:16.768177 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:25:16 crc kubenswrapper[4917]: I0318 08:25:16.844048 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-8b965695d-855gp"] Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.089963 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.252515 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.274166 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data-custom\") pod \"7f49709d-261d-43ee-9679-db08b0fe33dc\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.274476 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-combined-ca-bundle\") pod \"7f49709d-261d-43ee-9679-db08b0fe33dc\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.274598 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxnhh\" (UniqueName: \"kubernetes.io/projected/7f49709d-261d-43ee-9679-db08b0fe33dc-kube-api-access-gxnhh\") pod \"7f49709d-261d-43ee-9679-db08b0fe33dc\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.274681 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data\") pod \"7f49709d-261d-43ee-9679-db08b0fe33dc\" (UID: \"7f49709d-261d-43ee-9679-db08b0fe33dc\") " Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.285175 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7f49709d-261d-43ee-9679-db08b0fe33dc" (UID: "7f49709d-261d-43ee-9679-db08b0fe33dc"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.288962 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f49709d-261d-43ee-9679-db08b0fe33dc-kube-api-access-gxnhh" (OuterVolumeSpecName: "kube-api-access-gxnhh") pod "7f49709d-261d-43ee-9679-db08b0fe33dc" (UID: "7f49709d-261d-43ee-9679-db08b0fe33dc"). InnerVolumeSpecName "kube-api-access-gxnhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.311604 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f49709d-261d-43ee-9679-db08b0fe33dc" (UID: "7f49709d-261d-43ee-9679-db08b0fe33dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.341982 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data" (OuterVolumeSpecName: "config-data") pod "7f49709d-261d-43ee-9679-db08b0fe33dc" (UID: "7f49709d-261d-43ee-9679-db08b0fe33dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.376071 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-combined-ca-bundle\") pod \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.376221 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data\") pod \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.376253 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data-custom\") pod \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.376315 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv2x5\" (UniqueName: \"kubernetes.io/projected/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-kube-api-access-zv2x5\") pod \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\" (UID: \"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b\") " Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.376761 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.376777 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxnhh\" (UniqueName: 
\"kubernetes.io/projected/7f49709d-261d-43ee-9679-db08b0fe33dc-kube-api-access-gxnhh\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.376789 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.376798 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7f49709d-261d-43ee-9679-db08b0fe33dc-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.380600 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" (UID: "efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.380632 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-kube-api-access-zv2x5" (OuterVolumeSpecName: "kube-api-access-zv2x5") pod "efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" (UID: "efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b"). InnerVolumeSpecName "kube-api-access-zv2x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.410673 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" (UID: "efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.432328 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-8b965695d-855gp" event={"ID":"efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b","Type":"ContainerDied","Data":"5461ae54e3f25f7b24c36d010eb495939a15a20b97d99ab0891297bd0f7550b1"} Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.432397 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-8b965695d-855gp" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.432413 4917 scope.go:117] "RemoveContainer" containerID="46dfbd5f7b8c1aab294e95b97c4bac7e00a66dd5831e23fe4bd7e3bf612464e4" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.433844 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-548bbcdfbc-7b95h" event={"ID":"7f49709d-261d-43ee-9679-db08b0fe33dc","Type":"ContainerDied","Data":"13ecbc426fbb8da6ef6fec77c00b557d312b0e8c289a011586acc7ba80ece60b"} Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.433886 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-548bbcdfbc-7b95h" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.465952 4917 scope.go:117] "RemoveContainer" containerID="b0b23136cc7bcb0b87e1d79469aec0b99b42966c9d1728c295077353e2f0aa9b" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.467397 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-548bbcdfbc-7b95h"] Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.475400 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-548bbcdfbc-7b95h"] Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.477446 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data" (OuterVolumeSpecName: "config-data") pod "efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" (UID: "efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.478528 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.478562 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv2x5\" (UniqueName: \"kubernetes.io/projected/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-kube-api-access-zv2x5\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.478573 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.478592 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.803113 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" path="/var/lib/kubelet/pods/7f49709d-261d-43ee-9679-db08b0fe33dc/volumes" Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.805877 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-8b965695d-855gp"] Mar 18 08:25:17 crc kubenswrapper[4917]: I0318 08:25:17.806129 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-8b965695d-855gp"] Mar 18 08:25:19 crc kubenswrapper[4917]: I0318 08:25:19.797025 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" path="/var/lib/kubelet/pods/efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b/volumes" Mar 18 08:25:21 crc kubenswrapper[4917]: I0318 08:25:21.166318 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-58cff7766d-4ggqr" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.1.137:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.137:8443: connect: connection refused" Mar 18 08:25:21 crc kubenswrapper[4917]: I0318 08:25:21.167964 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:25:24 crc kubenswrapper[4917]: I0318 08:25:24.007047 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-9f4b4874b-nfbm6" Mar 18 08:25:24 crc kubenswrapper[4917]: I0318 08:25:24.056379 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7fc579c746-72ssk"] Mar 18 08:25:24 crc kubenswrapper[4917]: I0318 08:25:24.056738 4917 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/heat-engine-7fc579c746-72ssk" podUID="82277826-07e2-4eda-8759-7817347cd653" containerName="heat-engine" containerID="cri-o://3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" gracePeriod=60 Mar 18 08:25:26 crc kubenswrapper[4917]: E0318 08:25:26.728837 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 08:25:26 crc kubenswrapper[4917]: E0318 08:25:26.731897 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 08:25:26 crc kubenswrapper[4917]: E0318 08:25:26.733557 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 08:25:26 crc kubenswrapper[4917]: E0318 08:25:26.733661 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7fc579c746-72ssk" podUID="82277826-07e2-4eda-8759-7817347cd653" containerName="heat-engine" Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.611400 4917 generic.go:334] "Generic (PLEG): container finished" podID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" 
containerID="cd0cf440100899f05139ecacf79add51e9cea5ceed002787e7ade1b6767f5dc5" exitCode=137 Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.611489 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cff7766d-4ggqr" event={"ID":"6d857256-d8fa-4b44-b66a-5238f6e7ec1d","Type":"ContainerDied","Data":"cd0cf440100899f05139ecacf79add51e9cea5ceed002787e7ade1b6767f5dc5"} Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.728491 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.912065 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mg6s\" (UniqueName: \"kubernetes.io/projected/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-kube-api-access-7mg6s\") pod \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.912146 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-secret-key\") pod \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.912233 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-logs\") pod \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.912276 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-tls-certs\") pod \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\" (UID: 
\"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.912313 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-scripts\") pod \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.912367 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-config-data\") pod \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.912446 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-combined-ca-bundle\") pod \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\" (UID: \"6d857256-d8fa-4b44-b66a-5238f6e7ec1d\") " Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.917066 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-logs" (OuterVolumeSpecName: "logs") pod "6d857256-d8fa-4b44-b66a-5238f6e7ec1d" (UID: "6d857256-d8fa-4b44-b66a-5238f6e7ec1d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.920556 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-kube-api-access-7mg6s" (OuterVolumeSpecName: "kube-api-access-7mg6s") pod "6d857256-d8fa-4b44-b66a-5238f6e7ec1d" (UID: "6d857256-d8fa-4b44-b66a-5238f6e7ec1d"). InnerVolumeSpecName "kube-api-access-7mg6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.926873 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6d857256-d8fa-4b44-b66a-5238f6e7ec1d" (UID: "6d857256-d8fa-4b44-b66a-5238f6e7ec1d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.987234 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-config-data" (OuterVolumeSpecName: "config-data") pod "6d857256-d8fa-4b44-b66a-5238f6e7ec1d" (UID: "6d857256-d8fa-4b44-b66a-5238f6e7ec1d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:25:30 crc kubenswrapper[4917]: I0318 08:25:30.998523 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-scripts" (OuterVolumeSpecName: "scripts") pod "6d857256-d8fa-4b44-b66a-5238f6e7ec1d" (UID: "6d857256-d8fa-4b44-b66a-5238f6e7ec1d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.017972 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mg6s\" (UniqueName: \"kubernetes.io/projected/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-kube-api-access-7mg6s\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.018020 4917 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.018031 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-logs\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.018041 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.018051 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.022692 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d857256-d8fa-4b44-b66a-5238f6e7ec1d" (UID: "6d857256-d8fa-4b44-b66a-5238f6e7ec1d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.030705 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "6d857256-d8fa-4b44-b66a-5238f6e7ec1d" (UID: "6d857256-d8fa-4b44-b66a-5238f6e7ec1d"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.119342 4917 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.119608 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d857256-d8fa-4b44-b66a-5238f6e7ec1d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178116 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wssk2"] Mar 18 08:25:31 crc kubenswrapper[4917]: E0318 08:25:31.178477 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3169ca11-feb9-415a-a681-3f68c11b45f8" containerName="heat-cfnapi" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178493 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3169ca11-feb9-415a-a681-3f68c11b45f8" containerName="heat-cfnapi" Mar 18 08:25:31 crc kubenswrapper[4917]: E0318 08:25:31.178503 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" containerName="heat-cfnapi" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178509 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" containerName="heat-cfnapi" Mar 18 
08:25:31 crc kubenswrapper[4917]: E0318 08:25:31.178520 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e6d22de-4dec-4ff9-9ebe-47436310eea8" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178526 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e6d22de-4dec-4ff9-9ebe-47436310eea8" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: E0318 08:25:31.178538 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" containerName="heat-cfnapi" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178543 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" containerName="heat-cfnapi" Mar 18 08:25:31 crc kubenswrapper[4917]: E0318 08:25:31.178552 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon-log" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178558 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon-log" Mar 18 08:25:31 crc kubenswrapper[4917]: E0318 08:25:31.178567 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178572 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: E0318 08:25:31.178645 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178652 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon" Mar 18 08:25:31 crc kubenswrapper[4917]: E0318 08:25:31.178663 4917 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178669 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178825 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178837 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" containerName="heat-cfnapi" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178847 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178857 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3169ca11-feb9-415a-a681-3f68c11b45f8" containerName="heat-cfnapi" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178868 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" containerName="horizon-log" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178880 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb6fd2f-bd2c-4ca4-b189-1678c6a4e97b" containerName="heat-cfnapi" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.178886 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e6d22de-4dec-4ff9-9ebe-47436310eea8" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.179221 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f49709d-261d-43ee-9679-db08b0fe33dc" containerName="heat-api" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.180139 4917 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.191773 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wssk2"] Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.322867 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvct\" (UniqueName: \"kubernetes.io/projected/64966d5f-e8c0-4887-99e1-976100437824-kube-api-access-tnvct\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.323312 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-catalog-content\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.323432 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-utilities\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.425063 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnvct\" (UniqueName: \"kubernetes.io/projected/64966d5f-e8c0-4887-99e1-976100437824-kube-api-access-tnvct\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.425221 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-catalog-content\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.425246 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-utilities\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.425901 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-utilities\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.425910 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-catalog-content\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.451712 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvct\" (UniqueName: \"kubernetes.io/projected/64966d5f-e8c0-4887-99e1-976100437824-kube-api-access-tnvct\") pod \"community-operators-wssk2\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.496090 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.629989 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cff7766d-4ggqr" event={"ID":"6d857256-d8fa-4b44-b66a-5238f6e7ec1d","Type":"ContainerDied","Data":"6f5650f9c1670bd8c745625301e16768aa93c4cbfc5ba301d6052b8019164aeb"} Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.630058 4917 scope.go:117] "RemoveContainer" containerID="84fa1262b86e5239f7efe9e5ab2a4f8dbc3014619d0e46f0c362e5529aa4647b" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.630125 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58cff7766d-4ggqr" Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.719691 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58cff7766d-4ggqr"] Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.740789 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58cff7766d-4ggqr"] Mar 18 08:25:31 crc kubenswrapper[4917]: I0318 08:25:31.807780 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d857256-d8fa-4b44-b66a-5238f6e7ec1d" path="/var/lib/kubelet/pods/6d857256-d8fa-4b44-b66a-5238f6e7ec1d/volumes" Mar 18 08:25:32 crc kubenswrapper[4917]: I0318 08:25:32.054909 4917 scope.go:117] "RemoveContainer" containerID="cd0cf440100899f05139ecacf79add51e9cea5ceed002787e7ade1b6767f5dc5" Mar 18 08:25:32 crc kubenswrapper[4917]: I0318 08:25:32.143523 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wssk2"] Mar 18 08:25:32 crc kubenswrapper[4917]: I0318 08:25:32.641155 4917 generic.go:334] "Generic (PLEG): container finished" podID="64966d5f-e8c0-4887-99e1-976100437824" containerID="f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a" exitCode=0 Mar 18 08:25:32 crc kubenswrapper[4917]: I0318 08:25:32.641492 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wssk2" event={"ID":"64966d5f-e8c0-4887-99e1-976100437824","Type":"ContainerDied","Data":"f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a"} Mar 18 08:25:32 crc kubenswrapper[4917]: I0318 08:25:32.641519 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wssk2" event={"ID":"64966d5f-e8c0-4887-99e1-976100437824","Type":"ContainerStarted","Data":"d1e04e8aeeef73e29e9339a360554ea44a5aafaa9e57d05820f57a0f18891b44"} Mar 18 08:25:35 crc kubenswrapper[4917]: I0318 08:25:35.850692 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wssk2" event={"ID":"64966d5f-e8c0-4887-99e1-976100437824","Type":"ContainerStarted","Data":"3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d"} Mar 18 08:25:36 crc kubenswrapper[4917]: E0318 08:25:36.728863 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 08:25:36 crc kubenswrapper[4917]: E0318 08:25:36.736228 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 08:25:36 crc kubenswrapper[4917]: E0318 08:25:36.737571 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 18 08:25:36 crc kubenswrapper[4917]: E0318 08:25:36.737616 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7fc579c746-72ssk" podUID="82277826-07e2-4eda-8759-7817347cd653" containerName="heat-engine" Mar 18 08:25:36 crc kubenswrapper[4917]: I0318 08:25:36.864010 4917 generic.go:334] "Generic (PLEG): container finished" podID="64966d5f-e8c0-4887-99e1-976100437824" containerID="3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d" exitCode=0 Mar 18 08:25:36 crc kubenswrapper[4917]: I0318 08:25:36.864109 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wssk2" event={"ID":"64966d5f-e8c0-4887-99e1-976100437824","Type":"ContainerDied","Data":"3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d"} Mar 18 08:25:37 crc kubenswrapper[4917]: I0318 08:25:37.878576 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wssk2" event={"ID":"64966d5f-e8c0-4887-99e1-976100437824","Type":"ContainerStarted","Data":"9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807"} Mar 18 08:25:37 crc kubenswrapper[4917]: I0318 08:25:37.901559 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wssk2" podStartSLOduration=2.285852956 podStartE2EDuration="6.901540618s" podCreationTimestamp="2026-03-18 08:25:31 +0000 UTC" firstStartedPulling="2026-03-18 08:25:32.644305637 +0000 UTC m=+5917.585460351" lastFinishedPulling="2026-03-18 08:25:37.259993299 +0000 UTC m=+5922.201148013" observedRunningTime="2026-03-18 08:25:37.895691986 +0000 UTC m=+5922.836846710" watchObservedRunningTime="2026-03-18 
08:25:37.901540618 +0000 UTC m=+5922.842695332" Mar 18 08:25:40 crc kubenswrapper[4917]: I0318 08:25:40.905409 4917 generic.go:334] "Generic (PLEG): container finished" podID="82277826-07e2-4eda-8759-7817347cd653" containerID="3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" exitCode=0 Mar 18 08:25:40 crc kubenswrapper[4917]: I0318 08:25:40.905494 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7fc579c746-72ssk" event={"ID":"82277826-07e2-4eda-8759-7817347cd653","Type":"ContainerDied","Data":"3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041"} Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.064435 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.164635 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data-custom\") pod \"82277826-07e2-4eda-8759-7817347cd653\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.164754 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data\") pod \"82277826-07e2-4eda-8759-7817347cd653\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.164774 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zj8\" (UniqueName: \"kubernetes.io/projected/82277826-07e2-4eda-8759-7817347cd653-kube-api-access-x4zj8\") pod \"82277826-07e2-4eda-8759-7817347cd653\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.164914 4917 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-combined-ca-bundle\") pod \"82277826-07e2-4eda-8759-7817347cd653\" (UID: \"82277826-07e2-4eda-8759-7817347cd653\") " Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.173929 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "82277826-07e2-4eda-8759-7817347cd653" (UID: "82277826-07e2-4eda-8759-7817347cd653"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.174117 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82277826-07e2-4eda-8759-7817347cd653-kube-api-access-x4zj8" (OuterVolumeSpecName: "kube-api-access-x4zj8") pod "82277826-07e2-4eda-8759-7817347cd653" (UID: "82277826-07e2-4eda-8759-7817347cd653"). InnerVolumeSpecName "kube-api-access-x4zj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.193852 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82277826-07e2-4eda-8759-7817347cd653" (UID: "82277826-07e2-4eda-8759-7817347cd653"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.248724 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data" (OuterVolumeSpecName: "config-data") pod "82277826-07e2-4eda-8759-7817347cd653" (UID: "82277826-07e2-4eda-8759-7817347cd653"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.267452 4917 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.267484 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.267494 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zj8\" (UniqueName: \"kubernetes.io/projected/82277826-07e2-4eda-8759-7817347cd653-kube-api-access-x4zj8\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.267503 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82277826-07e2-4eda-8759-7817347cd653-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.496725 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.498158 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.540269 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.914656 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7fc579c746-72ssk" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.914652 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7fc579c746-72ssk" event={"ID":"82277826-07e2-4eda-8759-7817347cd653","Type":"ContainerDied","Data":"77c106ffec425d676e747c691a450646876a72bb0eafdd8f4ddedd790e4306c3"} Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.914735 4917 scope.go:117] "RemoveContainer" containerID="3aedeb97a392a13c85ef7100bf10134d62bdeab6a067f976733217561c01a041" Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.937748 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7fc579c746-72ssk"] Mar 18 08:25:41 crc kubenswrapper[4917]: I0318 08:25:41.946227 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7fc579c746-72ssk"] Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.255571 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79"] Mar 18 08:25:42 crc kubenswrapper[4917]: E0318 08:25:42.256053 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82277826-07e2-4eda-8759-7817347cd653" containerName="heat-engine" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.256069 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="82277826-07e2-4eda-8759-7817347cd653" containerName="heat-engine" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.256253 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="82277826-07e2-4eda-8759-7817347cd653" containerName="heat-engine" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.257572 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.260152 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.266185 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79"] Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.387601 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.387764 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.387927 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8ffw\" (UniqueName: \"kubernetes.io/projected/01b048b4-f839-47f4-b90c-ed166746415a-kube-api-access-w8ffw\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: 
I0318 08:25:42.489511 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.489657 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8ffw\" (UniqueName: \"kubernetes.io/projected/01b048b4-f839-47f4-b90c-ed166746415a-kube-api-access-w8ffw\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.489787 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.490321 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.490370 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.516121 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8ffw\" (UniqueName: \"kubernetes.io/projected/01b048b4-f839-47f4-b90c-ed166746415a-kube-api-access-w8ffw\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.578903 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:42 crc kubenswrapper[4917]: I0318 08:25:42.997091 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:43 crc kubenswrapper[4917]: I0318 08:25:43.071789 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79"] Mar 18 08:25:43 crc kubenswrapper[4917]: I0318 08:25:43.792188 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82277826-07e2-4eda-8759-7817347cd653" path="/var/lib/kubelet/pods/82277826-07e2-4eda-8759-7817347cd653/volumes" Mar 18 08:25:43 crc kubenswrapper[4917]: I0318 08:25:43.938329 4917 generic.go:334] "Generic (PLEG): container finished" podID="01b048b4-f839-47f4-b90c-ed166746415a" containerID="e57048051b2a6426c365e5c58d11a3f8cf303003ca9232cd226137b17032068d" exitCode=0 Mar 18 08:25:43 crc kubenswrapper[4917]: I0318 08:25:43.940697 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" event={"ID":"01b048b4-f839-47f4-b90c-ed166746415a","Type":"ContainerDied","Data":"e57048051b2a6426c365e5c58d11a3f8cf303003ca9232cd226137b17032068d"} Mar 18 08:25:43 crc kubenswrapper[4917]: I0318 08:25:43.940960 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" event={"ID":"01b048b4-f839-47f4-b90c-ed166746415a","Type":"ContainerStarted","Data":"7915cc63b7f65a4da3c32a690b4d2dbb9561cb25205050c32822b2e8b4cb25e6"} Mar 18 08:25:44 crc kubenswrapper[4917]: I0318 08:25:44.784063 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wssk2"] Mar 18 08:25:44 crc kubenswrapper[4917]: I0318 08:25:44.951377 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wssk2" podUID="64966d5f-e8c0-4887-99e1-976100437824" containerName="registry-server" containerID="cri-o://9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807" gracePeriod=2 Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.459703 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.485724 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-catalog-content\") pod \"64966d5f-e8c0-4887-99e1-976100437824\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.485940 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-utilities\") pod \"64966d5f-e8c0-4887-99e1-976100437824\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.485984 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnvct\" (UniqueName: \"kubernetes.io/projected/64966d5f-e8c0-4887-99e1-976100437824-kube-api-access-tnvct\") pod \"64966d5f-e8c0-4887-99e1-976100437824\" (UID: \"64966d5f-e8c0-4887-99e1-976100437824\") " Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.487190 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-utilities" (OuterVolumeSpecName: "utilities") pod "64966d5f-e8c0-4887-99e1-976100437824" (UID: "64966d5f-e8c0-4887-99e1-976100437824"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.490978 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.517814 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64966d5f-e8c0-4887-99e1-976100437824-kube-api-access-tnvct" (OuterVolumeSpecName: "kube-api-access-tnvct") pod "64966d5f-e8c0-4887-99e1-976100437824" (UID: "64966d5f-e8c0-4887-99e1-976100437824"). InnerVolumeSpecName "kube-api-access-tnvct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.583606 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64966d5f-e8c0-4887-99e1-976100437824" (UID: "64966d5f-e8c0-4887-99e1-976100437824"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.592617 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64966d5f-e8c0-4887-99e1-976100437824-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.592643 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnvct\" (UniqueName: \"kubernetes.io/projected/64966d5f-e8c0-4887-99e1-976100437824-kube-api-access-tnvct\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.963495 4917 generic.go:334] "Generic (PLEG): container finished" podID="01b048b4-f839-47f4-b90c-ed166746415a" containerID="c0b4c5d8600ccfa8a60a4f26b72dc0d41cc7e5b8cf54d2969ce19e03a0d05447" exitCode=0 Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.963606 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" event={"ID":"01b048b4-f839-47f4-b90c-ed166746415a","Type":"ContainerDied","Data":"c0b4c5d8600ccfa8a60a4f26b72dc0d41cc7e5b8cf54d2969ce19e03a0d05447"} Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.978465 4917 generic.go:334] "Generic (PLEG): container finished" podID="64966d5f-e8c0-4887-99e1-976100437824" containerID="9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807" exitCode=0 Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.978516 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wssk2" event={"ID":"64966d5f-e8c0-4887-99e1-976100437824","Type":"ContainerDied","Data":"9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807"} Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.978559 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wssk2" 
event={"ID":"64966d5f-e8c0-4887-99e1-976100437824","Type":"ContainerDied","Data":"d1e04e8aeeef73e29e9339a360554ea44a5aafaa9e57d05820f57a0f18891b44"} Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.978596 4917 scope.go:117] "RemoveContainer" containerID="9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807" Mar 18 08:25:45 crc kubenswrapper[4917]: I0318 08:25:45.978691 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wssk2" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.015355 4917 scope.go:117] "RemoveContainer" containerID="3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.022964 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wssk2"] Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.033232 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wssk2"] Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.043787 4917 scope.go:117] "RemoveContainer" containerID="f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.066678 4917 scope.go:117] "RemoveContainer" containerID="9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807" Mar 18 08:25:46 crc kubenswrapper[4917]: E0318 08:25:46.067169 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807\": container with ID starting with 9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807 not found: ID does not exist" containerID="9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.067208 4917 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807"} err="failed to get container status \"9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807\": rpc error: code = NotFound desc = could not find container \"9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807\": container with ID starting with 9ceee087420d966b44e3e099ea2ae8e2c3e6e1035482a314bf896f658b921807 not found: ID does not exist" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.067234 4917 scope.go:117] "RemoveContainer" containerID="3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d" Mar 18 08:25:46 crc kubenswrapper[4917]: E0318 08:25:46.067484 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d\": container with ID starting with 3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d not found: ID does not exist" containerID="3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.067543 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d"} err="failed to get container status \"3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d\": rpc error: code = NotFound desc = could not find container \"3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d\": container with ID starting with 3c68660785a248b8431eefa52db3aa507f76fe9de3c809bb5304ee2d7c8cc11d not found: ID does not exist" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.067562 4917 scope.go:117] "RemoveContainer" containerID="f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a" Mar 18 08:25:46 crc kubenswrapper[4917]: E0318 08:25:46.067822 4917 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a\": container with ID starting with f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a not found: ID does not exist" containerID="f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.067857 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a"} err="failed to get container status \"f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a\": rpc error: code = NotFound desc = could not find container \"f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a\": container with ID starting with f7e4328fc58c74436c9fbadfff406e4062758f182a7ce109d628f6b2d6f01c2a not found: ID does not exist" Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.996265 4917 generic.go:334] "Generic (PLEG): container finished" podID="01b048b4-f839-47f4-b90c-ed166746415a" containerID="c9c73b600fa057ca63ad4254d3d833ecf75a6d9a8e9a3ea95c088acb76cf998b" exitCode=0 Mar 18 08:25:46 crc kubenswrapper[4917]: I0318 08:25:46.996337 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" event={"ID":"01b048b4-f839-47f4-b90c-ed166746415a","Type":"ContainerDied","Data":"c9c73b600fa057ca63ad4254d3d833ecf75a6d9a8e9a3ea95c088acb76cf998b"} Mar 18 08:25:47 crc kubenswrapper[4917]: I0318 08:25:47.788013 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64966d5f-e8c0-4887-99e1-976100437824" path="/var/lib/kubelet/pods/64966d5f-e8c0-4887-99e1-976100437824/volumes" Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.379324 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.556118 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-util\") pod \"01b048b4-f839-47f4-b90c-ed166746415a\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.556282 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8ffw\" (UniqueName: \"kubernetes.io/projected/01b048b4-f839-47f4-b90c-ed166746415a-kube-api-access-w8ffw\") pod \"01b048b4-f839-47f4-b90c-ed166746415a\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.556413 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-bundle\") pod \"01b048b4-f839-47f4-b90c-ed166746415a\" (UID: \"01b048b4-f839-47f4-b90c-ed166746415a\") " Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.558731 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-bundle" (OuterVolumeSpecName: "bundle") pod "01b048b4-f839-47f4-b90c-ed166746415a" (UID: "01b048b4-f839-47f4-b90c-ed166746415a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.561992 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b048b4-f839-47f4-b90c-ed166746415a-kube-api-access-w8ffw" (OuterVolumeSpecName: "kube-api-access-w8ffw") pod "01b048b4-f839-47f4-b90c-ed166746415a" (UID: "01b048b4-f839-47f4-b90c-ed166746415a"). InnerVolumeSpecName "kube-api-access-w8ffw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.659016 4917 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.659053 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8ffw\" (UniqueName: \"kubernetes.io/projected/01b048b4-f839-47f4-b90c-ed166746415a-kube-api-access-w8ffw\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.791534 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-util" (OuterVolumeSpecName: "util") pod "01b048b4-f839-47f4-b90c-ed166746415a" (UID: "01b048b4-f839-47f4-b90c-ed166746415a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:25:48 crc kubenswrapper[4917]: I0318 08:25:48.863690 4917 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/01b048b4-f839-47f4-b90c-ed166746415a-util\") on node \"crc\" DevicePath \"\"" Mar 18 08:25:49 crc kubenswrapper[4917]: I0318 08:25:49.022870 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" event={"ID":"01b048b4-f839-47f4-b90c-ed166746415a","Type":"ContainerDied","Data":"7915cc63b7f65a4da3c32a690b4d2dbb9561cb25205050c32822b2e8b4cb25e6"} Mar 18 08:25:49 crc kubenswrapper[4917]: I0318 08:25:49.022944 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7915cc63b7f65a4da3c32a690b4d2dbb9561cb25205050c32822b2e8b4cb25e6" Mar 18 08:25:49 crc kubenswrapper[4917]: I0318 08:25:49.022954 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.497180 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc"] Mar 18 08:25:59 crc kubenswrapper[4917]: E0318 08:25:59.498175 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b048b4-f839-47f4-b90c-ed166746415a" containerName="pull" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.498192 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b048b4-f839-47f4-b90c-ed166746415a" containerName="pull" Mar 18 08:25:59 crc kubenswrapper[4917]: E0318 08:25:59.498263 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64966d5f-e8c0-4887-99e1-976100437824" containerName="registry-server" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.498274 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="64966d5f-e8c0-4887-99e1-976100437824" containerName="registry-server" Mar 18 08:25:59 crc kubenswrapper[4917]: E0318 08:25:59.498293 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b048b4-f839-47f4-b90c-ed166746415a" containerName="util" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.498302 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b048b4-f839-47f4-b90c-ed166746415a" containerName="util" Mar 18 08:25:59 crc kubenswrapper[4917]: E0318 08:25:59.498320 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64966d5f-e8c0-4887-99e1-976100437824" containerName="extract-utilities" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.498329 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="64966d5f-e8c0-4887-99e1-976100437824" containerName="extract-utilities" Mar 18 08:25:59 crc kubenswrapper[4917]: E0318 08:25:59.498352 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="64966d5f-e8c0-4887-99e1-976100437824" containerName="extract-content" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.498359 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="64966d5f-e8c0-4887-99e1-976100437824" containerName="extract-content" Mar 18 08:25:59 crc kubenswrapper[4917]: E0318 08:25:59.498369 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b048b4-f839-47f4-b90c-ed166746415a" containerName="extract" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.498376 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b048b4-f839-47f4-b90c-ed166746415a" containerName="extract" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.498610 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="64966d5f-e8c0-4887-99e1-976100437824" containerName="registry-server" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.498632 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b048b4-f839-47f4-b90c-ed166746415a" containerName="extract" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.499310 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.504139 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-bvc2r" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.504753 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.504792 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.514757 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc"] Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.591168 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbh6m\" (UniqueName: \"kubernetes.io/projected/1ff89bec-5b99-41f8-8859-90b9fac30a81-kube-api-access-qbh6m\") pod \"obo-prometheus-operator-8ff7d675-7rjvc\" (UID: \"1ff89bec-5b99-41f8-8859-90b9fac30a81\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.693653 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbh6m\" (UniqueName: \"kubernetes.io/projected/1ff89bec-5b99-41f8-8859-90b9fac30a81-kube-api-access-qbh6m\") pod \"obo-prometheus-operator-8ff7d675-7rjvc\" (UID: \"1ff89bec-5b99-41f8-8859-90b9fac30a81\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.758512 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbh6m\" (UniqueName: \"kubernetes.io/projected/1ff89bec-5b99-41f8-8859-90b9fac30a81-kube-api-access-qbh6m\") pod 
\"obo-prometheus-operator-8ff7d675-7rjvc\" (UID: \"1ff89bec-5b99-41f8-8859-90b9fac30a81\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.817292 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk"] Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.818997 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.823906 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.824368 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-wk8wf" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.826928 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.842427 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb"] Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.851737 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.866782 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk"] Mar 18 08:25:59 crc kubenswrapper[4917]: I0318 08:25:59.883130 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:25:59.999636 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0041ff4d-5053-427a-aaac-dadb33cd80c8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk\" (UID: \"0041ff4d-5053-427a-aaac-dadb33cd80c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:25:59.999736 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0041ff4d-5053-427a-aaac-dadb33cd80c8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk\" (UID: \"0041ff4d-5053-427a-aaac-dadb33cd80c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.000370 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb\" (UID: \"a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.000405 
4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb\" (UID: \"a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.102552 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb\" (UID: \"a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.102982 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb\" (UID: \"a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.103045 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0041ff4d-5053-427a-aaac-dadb33cd80c8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk\" (UID: \"0041ff4d-5053-427a-aaac-dadb33cd80c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.103170 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/0041ff4d-5053-427a-aaac-dadb33cd80c8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk\" (UID: \"0041ff4d-5053-427a-aaac-dadb33cd80c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.108704 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb\" (UID: \"a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.110036 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb\" (UID: \"a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.110387 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0041ff4d-5053-427a-aaac-dadb33cd80c8-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk\" (UID: \"0041ff4d-5053-427a-aaac-dadb33cd80c8\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.126472 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0041ff4d-5053-427a-aaac-dadb33cd80c8-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk\" (UID: \"0041ff4d-5053-427a-aaac-dadb33cd80c8\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.147569 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.150271 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563706-pg6sc"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.173518 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563706-pg6sc" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.182990 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.183215 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.183311 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563706-pg6sc"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.183352 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.256694 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-xhwk7"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.257949 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.258290 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.261716 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-bvk22" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.267396 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.273836 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-xhwk7"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.306606 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwskn\" (UniqueName: \"kubernetes.io/projected/6c0b0cea-1278-459b-917b-63e0cbd85682-kube-api-access-qwskn\") pod \"auto-csr-approver-29563706-pg6sc\" (UID: \"6c0b0cea-1278-459b-917b-63e0cbd85682\") " pod="openshift-infra/auto-csr-approver-29563706-pg6sc" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.369977 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.409019 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2c168dd-a6d0-40e6-832b-e5658df4e05d-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-xhwk7\" (UID: \"e2c168dd-a6d0-40e6-832b-e5658df4e05d\") " pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.409079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxz2s\" (UniqueName: 
\"kubernetes.io/projected/e2c168dd-a6d0-40e6-832b-e5658df4e05d-kube-api-access-mxz2s\") pod \"observability-operator-6dd7dd855f-xhwk7\" (UID: \"e2c168dd-a6d0-40e6-832b-e5658df4e05d\") " pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.409864 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwskn\" (UniqueName: \"kubernetes.io/projected/6c0b0cea-1278-459b-917b-63e0cbd85682-kube-api-access-qwskn\") pod \"auto-csr-approver-29563706-pg6sc\" (UID: \"6c0b0cea-1278-459b-917b-63e0cbd85682\") " pod="openshift-infra/auto-csr-approver-29563706-pg6sc" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.431183 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwskn\" (UniqueName: \"kubernetes.io/projected/6c0b0cea-1278-459b-917b-63e0cbd85682-kube-api-access-qwskn\") pod \"auto-csr-approver-29563706-pg6sc\" (UID: \"6c0b0cea-1278-459b-917b-63e0cbd85682\") " pod="openshift-infra/auto-csr-approver-29563706-pg6sc" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.511734 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxz2s\" (UniqueName: \"kubernetes.io/projected/e2c168dd-a6d0-40e6-832b-e5658df4e05d-kube-api-access-mxz2s\") pod \"observability-operator-6dd7dd855f-xhwk7\" (UID: \"e2c168dd-a6d0-40e6-832b-e5658df4e05d\") " pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.511910 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2c168dd-a6d0-40e6-832b-e5658df4e05d-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-xhwk7\" (UID: \"e2c168dd-a6d0-40e6-832b-e5658df4e05d\") " pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:00 crc kubenswrapper[4917]: 
I0318 08:26:00.517810 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/e2c168dd-a6d0-40e6-832b-e5658df4e05d-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-xhwk7\" (UID: \"e2c168dd-a6d0-40e6-832b-e5658df4e05d\") " pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.539321 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxz2s\" (UniqueName: \"kubernetes.io/projected/e2c168dd-a6d0-40e6-832b-e5658df4e05d-kube-api-access-mxz2s\") pod \"observability-operator-6dd7dd855f-xhwk7\" (UID: \"e2c168dd-a6d0-40e6-832b-e5658df4e05d\") " pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.560996 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563706-pg6sc" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.613234 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.644655 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.724560 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-7479c95c5-vfhcm"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.725929 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.735332 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.735526 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-kdlhd" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.738415 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-webhook-cert\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.738448 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq9lm\" (UniqueName: \"kubernetes.io/projected/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-kube-api-access-fq9lm\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.738548 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-apiservice-cert\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.738567 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-openshift-service-ca\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.767644 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-7479c95c5-vfhcm"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.811546 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk"] Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.853001 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-apiservice-cert\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.853080 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-openshift-service-ca\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.853153 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-webhook-cert\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.853175 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fq9lm\" (UniqueName: \"kubernetes.io/projected/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-kube-api-access-fq9lm\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.854866 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-openshift-service-ca\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.867088 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-apiservice-cert\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.875290 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-webhook-cert\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:00 crc kubenswrapper[4917]: I0318 08:26:00.890259 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq9lm\" (UniqueName: \"kubernetes.io/projected/dd85f6bd-3b28-4ae4-bc48-7733ccd8546e-kube-api-access-fq9lm\") pod \"perses-operator-7479c95c5-vfhcm\" (UID: \"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e\") " pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:01 crc kubenswrapper[4917]: I0318 08:26:01.156184 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:01 crc kubenswrapper[4917]: I0318 08:26:01.240968 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" event={"ID":"0041ff4d-5053-427a-aaac-dadb33cd80c8","Type":"ContainerStarted","Data":"e6423d2eaaac860da0f62e4ca9502dc439642707a4a3a2a7f6e590465492422d"} Mar 18 08:26:01 crc kubenswrapper[4917]: I0318 08:26:01.242753 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" event={"ID":"a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6","Type":"ContainerStarted","Data":"6c2343fa266f9ccb80faed479d22402a69270a1ce6b148bc57e377abc1d1bfb4"} Mar 18 08:26:01 crc kubenswrapper[4917]: I0318 08:26:01.247618 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc" event={"ID":"1ff89bec-5b99-41f8-8859-90b9fac30a81","Type":"ContainerStarted","Data":"ae0f2630c52ad41d3b4a0ae0de703f9f2bff0b40149b0e049e473f82d0598e52"} Mar 18 08:26:01 crc kubenswrapper[4917]: I0318 08:26:01.345444 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563706-pg6sc"] Mar 18 08:26:01 crc kubenswrapper[4917]: I0318 08:26:01.433071 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-xhwk7"] Mar 18 08:26:01 crc kubenswrapper[4917]: I0318 08:26:01.748496 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-7479c95c5-vfhcm"] Mar 18 08:26:02 crc kubenswrapper[4917]: I0318 08:26:02.280819 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" event={"ID":"e2c168dd-a6d0-40e6-832b-e5658df4e05d","Type":"ContainerStarted","Data":"33d94cc414abec5d5588c933a578a81c0c22017d2fb237f49a7d5254a36ca7eb"} 
Mar 18 08:26:02 crc kubenswrapper[4917]: I0318 08:26:02.302952 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-7479c95c5-vfhcm" event={"ID":"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e","Type":"ContainerStarted","Data":"036374985df95675c2b112de556f199f3d42daf7d09b29b7857e06f37a35d0e8"} Mar 18 08:26:02 crc kubenswrapper[4917]: I0318 08:26:02.320257 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563706-pg6sc" event={"ID":"6c0b0cea-1278-459b-917b-63e0cbd85682","Type":"ContainerStarted","Data":"6ba59eda3071e2a409fd05421399247c4919a9a741086b3b7667220e51f7be0c"} Mar 18 08:26:03 crc kubenswrapper[4917]: I0318 08:26:03.332305 4917 generic.go:334] "Generic (PLEG): container finished" podID="6c0b0cea-1278-459b-917b-63e0cbd85682" containerID="d297982df4c019212acf7285b5d98c16566978de2a7f101dc7f43fcbd6e698f9" exitCode=0 Mar 18 08:26:03 crc kubenswrapper[4917]: I0318 08:26:03.332400 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563706-pg6sc" event={"ID":"6c0b0cea-1278-459b-917b-63e0cbd85682","Type":"ContainerDied","Data":"d297982df4c019212acf7285b5d98c16566978de2a7f101dc7f43fcbd6e698f9"} Mar 18 08:26:05 crc kubenswrapper[4917]: I0318 08:26:05.067516 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563706-pg6sc" Mar 18 08:26:05 crc kubenswrapper[4917]: I0318 08:26:05.185898 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwskn\" (UniqueName: \"kubernetes.io/projected/6c0b0cea-1278-459b-917b-63e0cbd85682-kube-api-access-qwskn\") pod \"6c0b0cea-1278-459b-917b-63e0cbd85682\" (UID: \"6c0b0cea-1278-459b-917b-63e0cbd85682\") " Mar 18 08:26:05 crc kubenswrapper[4917]: I0318 08:26:05.198720 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0b0cea-1278-459b-917b-63e0cbd85682-kube-api-access-qwskn" (OuterVolumeSpecName: "kube-api-access-qwskn") pod "6c0b0cea-1278-459b-917b-63e0cbd85682" (UID: "6c0b0cea-1278-459b-917b-63e0cbd85682"). InnerVolumeSpecName "kube-api-access-qwskn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:26:05 crc kubenswrapper[4917]: I0318 08:26:05.290368 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwskn\" (UniqueName: \"kubernetes.io/projected/6c0b0cea-1278-459b-917b-63e0cbd85682-kube-api-access-qwskn\") on node \"crc\" DevicePath \"\"" Mar 18 08:26:05 crc kubenswrapper[4917]: I0318 08:26:05.374336 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563706-pg6sc" event={"ID":"6c0b0cea-1278-459b-917b-63e0cbd85682","Type":"ContainerDied","Data":"6ba59eda3071e2a409fd05421399247c4919a9a741086b3b7667220e51f7be0c"} Mar 18 08:26:05 crc kubenswrapper[4917]: I0318 08:26:05.374378 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ba59eda3071e2a409fd05421399247c4919a9a741086b3b7667220e51f7be0c" Mar 18 08:26:05 crc kubenswrapper[4917]: I0318 08:26:05.374443 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563706-pg6sc" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.150801 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563700-skttq"] Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.157490 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563700-skttq"] Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.204437 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m9hj2"] Mar 18 08:26:06 crc kubenswrapper[4917]: E0318 08:26:06.204982 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0b0cea-1278-459b-917b-63e0cbd85682" containerName="oc" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.205007 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0b0cea-1278-459b-917b-63e0cbd85682" containerName="oc" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.205241 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0b0cea-1278-459b-917b-63e0cbd85682" containerName="oc" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.207280 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.234284 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9hj2"] Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.322507 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-catalog-content\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.322590 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-utilities\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.322740 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljj9n\" (UniqueName: \"kubernetes.io/projected/d354faa0-f322-412b-a04e-967c1c3bbc87-kube-api-access-ljj9n\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.424768 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljj9n\" (UniqueName: \"kubernetes.io/projected/d354faa0-f322-412b-a04e-967c1c3bbc87-kube-api-access-ljj9n\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.424909 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-catalog-content\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.424946 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-utilities\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.425431 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-catalog-content\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.425486 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-utilities\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.442325 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljj9n\" (UniqueName: \"kubernetes.io/projected/d354faa0-f322-412b-a04e-967c1c3bbc87-kube-api-access-ljj9n\") pod \"redhat-marketplace-m9hj2\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") " pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:06 crc kubenswrapper[4917]: I0318 08:26:06.534710 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:07 crc kubenswrapper[4917]: I0318 08:26:07.786719 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c" path="/var/lib/kubelet/pods/2cdfaeb4-4f8e-4e63-b4e9-c6d3226a517c/volumes" Mar 18 08:26:09 crc kubenswrapper[4917]: I0318 08:26:09.804041 4917 scope.go:117] "RemoveContainer" containerID="060da4ba1c5057d1e59507b771cb775090141d7ea0ac5f28357ed96dbf517d9b" Mar 18 08:26:10 crc kubenswrapper[4917]: W0318 08:26:10.811623 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd354faa0_f322_412b_a04e_967c1c3bbc87.slice/crio-8c4aa2fd3a959459cffb23a793ce4cb5a70eca9243ba62dcec517a275f798bb0 WatchSource:0}: Error finding container 8c4aa2fd3a959459cffb23a793ce4cb5a70eca9243ba62dcec517a275f798bb0: Status 404 returned error can't find the container with id 8c4aa2fd3a959459cffb23a793ce4cb5a70eca9243ba62dcec517a275f798bb0 Mar 18 08:26:10 crc kubenswrapper[4917]: I0318 08:26:10.814747 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9hj2"] Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.461618 4917 generic.go:334] "Generic (PLEG): container finished" podID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerID="6da409cc667e3c48efbb04d261203e583b0f668b061788d274907f34c18a5378" exitCode=0 Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.461709 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9hj2" event={"ID":"d354faa0-f322-412b-a04e-967c1c3bbc87","Type":"ContainerDied","Data":"6da409cc667e3c48efbb04d261203e583b0f668b061788d274907f34c18a5378"} Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.462021 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9hj2" 
event={"ID":"d354faa0-f322-412b-a04e-967c1c3bbc87","Type":"ContainerStarted","Data":"8c4aa2fd3a959459cffb23a793ce4cb5a70eca9243ba62dcec517a275f798bb0"} Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.463875 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.465798 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc" event={"ID":"1ff89bec-5b99-41f8-8859-90b9fac30a81","Type":"ContainerStarted","Data":"280acba3368ed3a68db8c26f63e8cb198df22fec10f2b170c1ed0f71c1f54610"} Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.468277 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" event={"ID":"0041ff4d-5053-427a-aaac-dadb33cd80c8","Type":"ContainerStarted","Data":"40b8c07736c51a16922d1aa2cb3b2584818204adf9a6a08c284e4e11de3f744c"} Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.470838 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" event={"ID":"e2c168dd-a6d0-40e6-832b-e5658df4e05d","Type":"ContainerStarted","Data":"64c68b9aa449cd88fe1c7baca52e45f9419df1ca6c522075b0f46fca565cc28e"} Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.472112 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.474413 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" event={"ID":"a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6","Type":"ContainerStarted","Data":"bd243378dbf67d8f9689714245c781d3c9ad6e56b41435a3603129b8f070e947"} Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.477198 4917 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operators/perses-operator-7479c95c5-vfhcm" event={"ID":"dd85f6bd-3b28-4ae4-bc48-7733ccd8546e","Type":"ContainerStarted","Data":"aa066dfb9d8961821aa8bbad99e89941ff9514c3c32fc1b373fb8b1d15988ae3"} Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.478179 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.503236 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-7rjvc" podStartSLOduration=2.564774347 podStartE2EDuration="12.503210598s" podCreationTimestamp="2026-03-18 08:25:59 +0000 UTC" firstStartedPulling="2026-03-18 08:26:00.381414658 +0000 UTC m=+5945.322569372" lastFinishedPulling="2026-03-18 08:26:10.319850909 +0000 UTC m=+5955.261005623" observedRunningTime="2026-03-18 08:26:11.49420902 +0000 UTC m=+5956.435363734" watchObservedRunningTime="2026-03-18 08:26:11.503210598 +0000 UTC m=+5956.444365312" Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.526347 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk" podStartSLOduration=8.408401549 podStartE2EDuration="12.526322758s" podCreationTimestamp="2026-03-18 08:25:59 +0000 UTC" firstStartedPulling="2026-03-18 08:26:00.857108525 +0000 UTC m=+5945.798263229" lastFinishedPulling="2026-03-18 08:26:04.975029724 +0000 UTC m=+5949.916184438" observedRunningTime="2026-03-18 08:26:11.519311478 +0000 UTC m=+5956.460466232" watchObservedRunningTime="2026-03-18 08:26:11.526322758 +0000 UTC m=+5956.467477462" Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.534785 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.549536 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb" podStartSLOduration=8.271802857 podStartE2EDuration="12.549512561s" podCreationTimestamp="2026-03-18 08:25:59 +0000 UTC" firstStartedPulling="2026-03-18 08:26:00.704994457 +0000 UTC m=+5945.646149161" lastFinishedPulling="2026-03-18 08:26:04.982704151 +0000 UTC m=+5949.923858865" observedRunningTime="2026-03-18 08:26:11.540354609 +0000 UTC m=+5956.481509343" watchObservedRunningTime="2026-03-18 08:26:11.549512561 +0000 UTC m=+5956.490667295" Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.586436 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-xhwk7" podStartSLOduration=2.7129922840000003 podStartE2EDuration="11.586413496s" podCreationTimestamp="2026-03-18 08:26:00 +0000 UTC" firstStartedPulling="2026-03-18 08:26:01.446445358 +0000 UTC m=+5946.387600062" lastFinishedPulling="2026-03-18 08:26:10.31986656 +0000 UTC m=+5955.261021274" observedRunningTime="2026-03-18 08:26:11.573617636 +0000 UTC m=+5956.514772370" watchObservedRunningTime="2026-03-18 08:26:11.586413496 +0000 UTC m=+5956.527568230" Mar 18 08:26:11 crc kubenswrapper[4917]: I0318 08:26:11.614516 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-7479c95c5-vfhcm" podStartSLOduration=3.036980802 podStartE2EDuration="11.614497757s" podCreationTimestamp="2026-03-18 08:26:00 +0000 UTC" firstStartedPulling="2026-03-18 08:26:01.749435657 +0000 UTC m=+5946.690590391" lastFinishedPulling="2026-03-18 08:26:10.326952632 +0000 UTC m=+5955.268107346" observedRunningTime="2026-03-18 08:26:11.600500617 +0000 UTC m=+5956.541655331" watchObservedRunningTime="2026-03-18 08:26:11.614497757 +0000 UTC m=+5956.555652471" Mar 18 08:26:12 crc kubenswrapper[4917]: I0318 08:26:12.490339 4917 generic.go:334] "Generic (PLEG): 
container finished" podID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerID="72629b7bb3a51cb32244d4b46e9ec1972ce3791b22da6d4c41075e725ffd9347" exitCode=0 Mar 18 08:26:12 crc kubenswrapper[4917]: I0318 08:26:12.490406 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9hj2" event={"ID":"d354faa0-f322-412b-a04e-967c1c3bbc87","Type":"ContainerDied","Data":"72629b7bb3a51cb32244d4b46e9ec1972ce3791b22da6d4c41075e725ffd9347"} Mar 18 08:26:13 crc kubenswrapper[4917]: I0318 08:26:13.502834 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9hj2" event={"ID":"d354faa0-f322-412b-a04e-967c1c3bbc87","Type":"ContainerStarted","Data":"1e3a9dd8839b5074d3b3ecc188c79c18a7562a163704e742c11e08f1e3891867"} Mar 18 08:26:13 crc kubenswrapper[4917]: I0318 08:26:13.540594 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m9hj2" podStartSLOduration=5.902717009 podStartE2EDuration="7.540561339s" podCreationTimestamp="2026-03-18 08:26:06 +0000 UTC" firstStartedPulling="2026-03-18 08:26:11.46366687 +0000 UTC m=+5956.404821574" lastFinishedPulling="2026-03-18 08:26:13.10151119 +0000 UTC m=+5958.042665904" observedRunningTime="2026-03-18 08:26:13.536225834 +0000 UTC m=+5958.477380548" watchObservedRunningTime="2026-03-18 08:26:13.540561339 +0000 UTC m=+5958.481716053" Mar 18 08:26:16 crc kubenswrapper[4917]: I0318 08:26:16.535511 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:16 crc kubenswrapper[4917]: I0318 08:26:16.535840 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:16 crc kubenswrapper[4917]: I0318 08:26:16.609330 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m9hj2" 
Mar 18 08:26:21 crc kubenswrapper[4917]: I0318 08:26:21.159418 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-7479c95c5-vfhcm" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.024637 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.025406 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" containerName="openstackclient" containerID="cri-o://8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96" gracePeriod=2 Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.042835 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.082526 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 18 08:26:25 crc kubenswrapper[4917]: E0318 08:26:25.091144 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" containerName="openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.091181 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" containerName="openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.091379 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" containerName="openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.092063 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.131345 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" podUID="da782152-b2cc-493f-9cdb-084684cd752f" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.143485 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.185727 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.185780 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config-secret\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.185806 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjd7\" (UniqueName: \"kubernetes.io/projected/da782152-b2cc-493f-9cdb-084684cd752f-kube-api-access-kpjd7\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.185852 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-combined-ca-bundle\") pod \"openstackclient\" (UID: 
\"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.286990 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config-secret\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.287033 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjd7\" (UniqueName: \"kubernetes.io/projected/da782152-b2cc-493f-9cdb-084684cd752f-kube-api-access-kpjd7\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.287095 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.287205 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.288150 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.296182 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.306459 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config-secret\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.313388 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjd7\" (UniqueName: \"kubernetes.io/projected/da782152-b2cc-493f-9cdb-084684cd752f-kube-api-access-kpjd7\") pod \"openstackclient\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") " pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.321293 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.322711 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.324705 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-nqfxr" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.339416 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.430188 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.490740 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2hsb\" (UniqueName: \"kubernetes.io/projected/d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c-kube-api-access-c2hsb\") pod \"kube-state-metrics-0\" (UID: \"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c\") " pod="openstack/kube-state-metrics-0" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.592323 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2hsb\" (UniqueName: \"kubernetes.io/projected/d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c-kube-api-access-c2hsb\") pod \"kube-state-metrics-0\" (UID: \"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c\") " pod="openstack/kube-state-metrics-0" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.638212 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2hsb\" (UniqueName: \"kubernetes.io/projected/d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c-kube-api-access-c2hsb\") pod \"kube-state-metrics-0\" (UID: \"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c\") " pod="openstack/kube-state-metrics-0" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.700763 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.950244 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.952969 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.955971 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.956249 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.956403 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.957046 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-86w6d" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.966227 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 18 08:26:25 crc kubenswrapper[4917]: I0318 08:26:25.969009 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.106683 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.106770 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e280f1c8-4761-422b-a9c2-5c429de52eef-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc 
kubenswrapper[4917]: I0318 08:26:26.106807 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e280f1c8-4761-422b-a9c2-5c429de52eef-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.106843 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.106871 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/e280f1c8-4761-422b-a9c2-5c429de52eef-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.106924 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ndw\" (UniqueName: \"kubernetes.io/projected/e280f1c8-4761-422b-a9c2-5c429de52eef-kube-api-access-c6ndw\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.106948 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.184505 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 18 08:26:26 crc kubenswrapper[4917]: W0318 08:26:26.186011 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda782152_b2cc_493f_9cdb_084684cd752f.slice/crio-6d893681765472d6501fb7b678bc432e30452871e3e13020a824e9302d46941c WatchSource:0}: Error finding container 6d893681765472d6501fb7b678bc432e30452871e3e13020a824e9302d46941c: Status 404 returned error can't find the container with id 6d893681765472d6501fb7b678bc432e30452871e3e13020a824e9302d46941c Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.210335 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ndw\" (UniqueName: \"kubernetes.io/projected/e280f1c8-4761-422b-a9c2-5c429de52eef-kube-api-access-c6ndw\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.210390 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.210437 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc 
kubenswrapper[4917]: I0318 08:26:26.210492 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e280f1c8-4761-422b-a9c2-5c429de52eef-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.210527 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e280f1c8-4761-422b-a9c2-5c429de52eef-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.210558 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.210604 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/e280f1c8-4761-422b-a9c2-5c429de52eef-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.211120 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/e280f1c8-4761-422b-a9c2-5c429de52eef-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc 
kubenswrapper[4917]: I0318 08:26:26.225118 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e280f1c8-4761-422b-a9c2-5c429de52eef-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.231843 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e280f1c8-4761-422b-a9c2-5c429de52eef-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.240103 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.246095 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.254258 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ndw\" (UniqueName: \"kubernetes.io/projected/e280f1c8-4761-422b-a9c2-5c429de52eef-kube-api-access-c6ndw\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.294154 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/e280f1c8-4761-422b-a9c2-5c429de52eef-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"e280f1c8-4761-422b-a9c2-5c429de52eef\") " pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.392796 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.579818 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.639936 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m9hj2" Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.666843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c","Type":"ContainerStarted","Data":"6f1162f90257725167a737fbfa67bc17f17cafe9198f4f7ce624b9c3bb86cdf9"} Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.687713 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da782152-b2cc-493f-9cdb-084684cd752f","Type":"ContainerStarted","Data":"898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b"} Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.687756 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"da782152-b2cc-493f-9cdb-084684cd752f","Type":"ContainerStarted","Data":"6d893681765472d6501fb7b678bc432e30452871e3e13020a824e9302d46941c"} Mar 18 08:26:26 crc kubenswrapper[4917]: I0318 08:26:26.738690 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.738674144 podStartE2EDuration="1.738674144s" podCreationTimestamp="2026-03-18 
08:26:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:26:26.727906764 +0000 UTC m=+5971.669061488" watchObservedRunningTime="2026-03-18 08:26:26.738674144 +0000 UTC m=+5971.679828858" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.009849 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.025507 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.040328 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.040519 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-57vl2" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.040655 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.040766 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.040870 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.048037 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.048372 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.063322 4917 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.094937 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130379 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130433 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czb8h\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-kube-api-access-czb8h\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130483 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-config\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130510 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130536 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130605 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f569-ea89-4735-b69d-d642fcee0295-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130627 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130661 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130689 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.130721 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235684 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czb8h\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-kube-api-access-czb8h\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235760 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-config\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235788 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235820 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235872 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f569-ea89-4735-b69d-d642fcee0295-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235896 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235918 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235945 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.235980 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.236016 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.240284 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.240694 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.244138 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.250788 
4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.251754 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f569-ea89-4735-b69d-d642fcee0295-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.253363 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.253789 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-config\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.257294 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.258043 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.258071 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0bb2d1890a357d19ffdcc97238b432893576203b5079f7b9f5a7bdd3cad7ae3a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.280857 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czb8h\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-kube-api-access-czb8h\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.327073 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"prometheus-metric-storage-0\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.392181 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.467905 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.635907 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.643048 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" podUID="da782152-b2cc-493f-9cdb-084684cd752f" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.744592 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c","Type":"ContainerStarted","Data":"498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4"} Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.744998 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.748731 4917 generic.go:334] "Generic (PLEG): container finished" podID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" containerID="8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96" exitCode=137 Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.748792 4917 scope.go:117] "RemoveContainer" containerID="8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.748894 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.749094 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-combined-ca-bundle\") pod \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.749181 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44tl8\" (UniqueName: \"kubernetes.io/projected/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-kube-api-access-44tl8\") pod \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.749300 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config\") pod \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.749339 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config-secret\") pod \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\" (UID: \"a64a3f74-fdb0-45c0-970b-0fc6220fb8aa\") " Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.778746 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-kube-api-access-44tl8" (OuterVolumeSpecName: "kube-api-access-44tl8") pod "a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" (UID: "a64a3f74-fdb0-45c0-970b-0fc6220fb8aa"). InnerVolumeSpecName "kube-api-access-44tl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.794781 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.287049283 podStartE2EDuration="2.794751946s" podCreationTimestamp="2026-03-18 08:26:25 +0000 UTC" firstStartedPulling="2026-03-18 08:26:26.405467613 +0000 UTC m=+5971.346622327" lastFinishedPulling="2026-03-18 08:26:26.913170286 +0000 UTC m=+5971.854324990" observedRunningTime="2026-03-18 08:26:27.774746111 +0000 UTC m=+5972.715900855" watchObservedRunningTime="2026-03-18 08:26:27.794751946 +0000 UTC m=+5972.735906660" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.797913 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" podUID="da782152-b2cc-493f-9cdb-084684cd752f" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.829825 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" (UID: "a64a3f74-fdb0-45c0-970b-0fc6220fb8aa"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.830273 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"e280f1c8-4761-422b-a9c2-5c429de52eef","Type":"ContainerStarted","Data":"c7133fd59e8489ee4a74695d69889f8a83d1c2f7aa4a3831827501a9e0c370c7"} Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.843917 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" (UID: "a64a3f74-fdb0-45c0-970b-0fc6220fb8aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.847494 4917 scope.go:117] "RemoveContainer" containerID="8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96" Mar 18 08:26:27 crc kubenswrapper[4917]: E0318 08:26:27.848243 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96\": container with ID starting with 8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96 not found: ID does not exist" containerID="8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.848322 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96"} err="failed to get container status \"8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96\": rpc error: code = NotFound desc = could not find container \"8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96\": container with ID starting with 
8f9d7dc6347b7fa56f51f56ac8c5f8f16fe30e29316387054448f48f2e550e96 not found: ID does not exist" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.850727 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" (UID: "a64a3f74-fdb0-45c0-970b-0fc6220fb8aa"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.852383 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.852512 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44tl8\" (UniqueName: \"kubernetes.io/projected/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-kube-api-access-44tl8\") on node \"crc\" DevicePath \"\"" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.852532 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:26:27 crc kubenswrapper[4917]: I0318 08:26:27.852544 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 08:26:28 crc kubenswrapper[4917]: I0318 08:26:28.070041 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" podUID="da782152-b2cc-493f-9cdb-084684cd752f" Mar 18 08:26:28 crc kubenswrapper[4917]: W0318 
08:26:28.144131 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb7f569_ea89_4735_b69d_d642fcee0295.slice/crio-4f635c0a804a1e7638ca6da60ef6497ea229b8dc8eedadac1eda21e02beb53ed WatchSource:0}: Error finding container 4f635c0a804a1e7638ca6da60ef6497ea229b8dc8eedadac1eda21e02beb53ed: Status 404 returned error can't find the container with id 4f635c0a804a1e7638ca6da60ef6497ea229b8dc8eedadac1eda21e02beb53ed
Mar 18 08:26:28 crc kubenswrapper[4917]: I0318 08:26:28.153910 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 18 08:26:28 crc kubenswrapper[4917]: I0318 08:26:28.426111 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9hj2"]
Mar 18 08:26:28 crc kubenswrapper[4917]: I0318 08:26:28.426897 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m9hj2" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerName="registry-server" containerID="cri-o://1e3a9dd8839b5074d3b3ecc188c79c18a7562a163704e742c11e08f1e3891867" gracePeriod=2
Mar 18 08:26:28 crc kubenswrapper[4917]: I0318 08:26:28.840482 4917 generic.go:334] "Generic (PLEG): container finished" podID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerID="1e3a9dd8839b5074d3b3ecc188c79c18a7562a163704e742c11e08f1e3891867" exitCode=0
Mar 18 08:26:28 crc kubenswrapper[4917]: I0318 08:26:28.840560 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9hj2" event={"ID":"d354faa0-f322-412b-a04e-967c1c3bbc87","Type":"ContainerDied","Data":"1e3a9dd8839b5074d3b3ecc188c79c18a7562a163704e742c11e08f1e3891867"}
Mar 18 08:26:28 crc kubenswrapper[4917]: I0318 08:26:28.841857 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerStarted","Data":"4f635c0a804a1e7638ca6da60ef6497ea229b8dc8eedadac1eda21e02beb53ed"}
Mar 18 08:26:28 crc kubenswrapper[4917]: I0318 08:26:28.992191 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9hj2"
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.078481 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-catalog-content\") pod \"d354faa0-f322-412b-a04e-967c1c3bbc87\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") "
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.078655 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljj9n\" (UniqueName: \"kubernetes.io/projected/d354faa0-f322-412b-a04e-967c1c3bbc87-kube-api-access-ljj9n\") pod \"d354faa0-f322-412b-a04e-967c1c3bbc87\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") "
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.078683 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-utilities\") pod \"d354faa0-f322-412b-a04e-967c1c3bbc87\" (UID: \"d354faa0-f322-412b-a04e-967c1c3bbc87\") "
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.079461 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-utilities" (OuterVolumeSpecName: "utilities") pod "d354faa0-f322-412b-a04e-967c1c3bbc87" (UID: "d354faa0-f322-412b-a04e-967c1c3bbc87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.089851 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d354faa0-f322-412b-a04e-967c1c3bbc87-kube-api-access-ljj9n" (OuterVolumeSpecName: "kube-api-access-ljj9n") pod "d354faa0-f322-412b-a04e-967c1c3bbc87" (UID: "d354faa0-f322-412b-a04e-967c1c3bbc87"). InnerVolumeSpecName "kube-api-access-ljj9n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.107755 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d354faa0-f322-412b-a04e-967c1c3bbc87" (UID: "d354faa0-f322-412b-a04e-967c1c3bbc87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.181226 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljj9n\" (UniqueName: \"kubernetes.io/projected/d354faa0-f322-412b-a04e-967c1c3bbc87-kube-api-access-ljj9n\") on node \"crc\" DevicePath \"\""
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.182571 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-utilities\") on node \"crc\" DevicePath \"\""
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.182632 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d354faa0-f322-412b-a04e-967c1c3bbc87-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.783563 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a64a3f74-fdb0-45c0-970b-0fc6220fb8aa" path="/var/lib/kubelet/pods/a64a3f74-fdb0-45c0-970b-0fc6220fb8aa/volumes"
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.852856 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m9hj2" event={"ID":"d354faa0-f322-412b-a04e-967c1c3bbc87","Type":"ContainerDied","Data":"8c4aa2fd3a959459cffb23a793ce4cb5a70eca9243ba62dcec517a275f798bb0"}
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.852898 4917 scope.go:117] "RemoveContainer" containerID="1e3a9dd8839b5074d3b3ecc188c79c18a7562a163704e742c11e08f1e3891867"
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.853050 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m9hj2"
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.881985 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9hj2"]
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.893551 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m9hj2"]
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.935179 4917 scope.go:117] "RemoveContainer" containerID="72629b7bb3a51cb32244d4b46e9ec1972ce3791b22da6d4c41075e725ffd9347"
Mar 18 08:26:29 crc kubenswrapper[4917]: I0318 08:26:29.989211 4917 scope.go:117] "RemoveContainer" containerID="6da409cc667e3c48efbb04d261203e583b0f668b061788d274907f34c18a5378"
Mar 18 08:26:31 crc kubenswrapper[4917]: I0318 08:26:31.783032 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" path="/var/lib/kubelet/pods/d354faa0-f322-412b-a04e-967c1c3bbc87/volumes"
Mar 18 08:26:33 crc kubenswrapper[4917]: I0318 08:26:33.900258 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerStarted","Data":"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6"}
Mar 18 08:26:33 crc kubenswrapper[4917]: I0318 08:26:33.904875 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"e280f1c8-4761-422b-a9c2-5c429de52eef","Type":"ContainerStarted","Data":"629da6586e70d8b3f0100a430c905af358b17ee8e673d67b3a9d30198da5d3ca"}
Mar 18 08:26:35 crc kubenswrapper[4917]: I0318 08:26:35.714034 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 18 08:26:40 crc kubenswrapper[4917]: I0318 08:26:40.057175 4917 generic.go:334] "Generic (PLEG): container finished" podID="e280f1c8-4761-422b-a9c2-5c429de52eef" containerID="629da6586e70d8b3f0100a430c905af358b17ee8e673d67b3a9d30198da5d3ca" exitCode=0
Mar 18 08:26:40 crc kubenswrapper[4917]: I0318 08:26:40.057618 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"e280f1c8-4761-422b-a9c2-5c429de52eef","Type":"ContainerDied","Data":"629da6586e70d8b3f0100a430c905af358b17ee8e673d67b3a9d30198da5d3ca"}
Mar 18 08:26:41 crc kubenswrapper[4917]: I0318 08:26:41.070500 4917 generic.go:334] "Generic (PLEG): container finished" podID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerID="23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6" exitCode=0
Mar 18 08:26:41 crc kubenswrapper[4917]: I0318 08:26:41.070568 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerDied","Data":"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6"}
Mar 18 08:26:48 crc kubenswrapper[4917]: I0318 08:26:48.159830 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerStarted","Data":"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584"}
Mar 18 08:26:48 crc kubenswrapper[4917]: I0318 08:26:48.161743 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"e280f1c8-4761-422b-a9c2-5c429de52eef","Type":"ContainerStarted","Data":"e6f19710216ac1307f40116de67e47c8fe6fc8b89fd926d488249e24d88c0fbd"}
Mar 18 08:26:53 crc kubenswrapper[4917]: I0318 08:26:53.217742 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerStarted","Data":"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed"}
Mar 18 08:26:53 crc kubenswrapper[4917]: I0318 08:26:53.223844 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"e280f1c8-4761-422b-a9c2-5c429de52eef","Type":"ContainerStarted","Data":"84d1da44283ed411b4a37d1b220523895ca841cd9865c7abed76d1d9e4ddf4d0"}
Mar 18 08:26:53 crc kubenswrapper[4917]: I0318 08:26:53.224842 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0"
Mar 18 08:26:53 crc kubenswrapper[4917]: I0318 08:26:53.228673 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0"
Mar 18 08:26:53 crc kubenswrapper[4917]: I0318 08:26:53.261022 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.89912902 podStartE2EDuration="28.260994833s" podCreationTimestamp="2026-03-18 08:26:25 +0000 UTC" firstStartedPulling="2026-03-18 08:26:27.411714767 +0000 UTC m=+5972.352869481" lastFinishedPulling="2026-03-18 08:26:47.77358057 +0000 UTC m=+5992.714735294" observedRunningTime="2026-03-18 08:26:53.251624126 +0000 UTC m=+5998.192778900" watchObservedRunningTime="2026-03-18 08:26:53.260994833 +0000 UTC m=+5998.202149587"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.124260 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-w7m74"]
Mar 18 08:26:56 crc kubenswrapper[4917]: E0318 08:26:56.125683 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerName="extract-utilities"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.125714 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerName="extract-utilities"
Mar 18 08:26:56 crc kubenswrapper[4917]: E0318 08:26:56.125756 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerName="registry-server"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.125776 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerName="registry-server"
Mar 18 08:26:56 crc kubenswrapper[4917]: E0318 08:26:56.125856 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerName="extract-content"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.125874 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerName="extract-content"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.126384 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d354faa0-f322-412b-a04e-967c1c3bbc87" containerName="registry-server"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.143049 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w7m74"]
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.143192 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.296753 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-utilities\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.298480 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-catalog-content\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.300007 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lns7p\" (UniqueName: \"kubernetes.io/projected/21f0511a-4483-45ae-820c-36161e39a281-kube-api-access-lns7p\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.404765 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lns7p\" (UniqueName: \"kubernetes.io/projected/21f0511a-4483-45ae-820c-36161e39a281-kube-api-access-lns7p\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.405086 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-utilities\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.405179 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-catalog-content\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.405824 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-catalog-content\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.406928 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-utilities\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.429168 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lns7p\" (UniqueName: \"kubernetes.io/projected/21f0511a-4483-45ae-820c-36161e39a281-kube-api-access-lns7p\") pod \"redhat-operators-w7m74\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.480688 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-w7m74"
Mar 18 08:26:56 crc kubenswrapper[4917]: I0318 08:26:56.986980 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-w7m74"]
Mar 18 08:26:57 crc kubenswrapper[4917]: I0318 08:26:57.290189 4917 generic.go:334] "Generic (PLEG): container finished" podID="21f0511a-4483-45ae-820c-36161e39a281" containerID="95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae" exitCode=0
Mar 18 08:26:57 crc kubenswrapper[4917]: I0318 08:26:57.290253 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7m74" event={"ID":"21f0511a-4483-45ae-820c-36161e39a281","Type":"ContainerDied","Data":"95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae"}
Mar 18 08:26:57 crc kubenswrapper[4917]: I0318 08:26:57.290657 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7m74" event={"ID":"21f0511a-4483-45ae-820c-36161e39a281","Type":"ContainerStarted","Data":"67fc15bc5cf203f876568205ff3796ab9c18e2ce00f4cb607e4ea8c5a2ab0244"}
Mar 18 08:26:57 crc kubenswrapper[4917]: I0318 08:26:57.294546 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerStarted","Data":"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c"}
Mar 18 08:26:57 crc kubenswrapper[4917]: I0318 08:26:57.351050 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.269400741 podStartE2EDuration="32.351025926s" podCreationTimestamp="2026-03-18 08:26:25 +0000 UTC" firstStartedPulling="2026-03-18 08:26:28.147653536 +0000 UTC m=+5973.088808250" lastFinishedPulling="2026-03-18 08:26:56.229278711 +0000 UTC m=+6001.170433435" observedRunningTime="2026-03-18 08:26:57.346739722 +0000 UTC m=+6002.287894456" watchObservedRunningTime="2026-03-18 08:26:57.351025926 +0000 UTC m=+6002.292180660"
Mar 18 08:26:57 crc kubenswrapper[4917]: I0318 08:26:57.468922 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0"
Mar 18 08:26:57 crc kubenswrapper[4917]: I0318 08:26:57.468973 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0"
Mar 18 08:26:57 crc kubenswrapper[4917]: I0318 08:26:57.472254 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0"
Mar 18 08:26:58 crc kubenswrapper[4917]: I0318 08:26:58.305376 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7m74" event={"ID":"21f0511a-4483-45ae-820c-36161e39a281","Type":"ContainerStarted","Data":"281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b"}
Mar 18 08:26:58 crc kubenswrapper[4917]: I0318 08:26:58.308419 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Mar 18 08:26:59 crc kubenswrapper[4917]: I0318 08:26:59.044993 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ea18-account-create-update-n8jpt"]
Mar 18 08:26:59 crc kubenswrapper[4917]: I0318 08:26:59.057288 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-786w2"]
Mar 18 08:26:59 crc kubenswrapper[4917]: I0318 08:26:59.065137 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ea18-account-create-update-n8jpt"]
Mar 18 08:26:59 crc kubenswrapper[4917]: I0318 08:26:59.075127 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-786w2"]
Mar 18 08:26:59 crc kubenswrapper[4917]: I0318 08:26:59.787740 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dc6794f-aaa6-4c41-b7c1-668c0c840241" path="/var/lib/kubelet/pods/0dc6794f-aaa6-4c41-b7c1-668c0c840241/volumes"
Mar 18 08:26:59 crc kubenswrapper[4917]: I0318 08:26:59.788747 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad1a26d5-418d-4dc5-b71c-7db92630eb74" path="/var/lib/kubelet/pods/ad1a26d5-418d-4dc5-b71c-7db92630eb74/volumes"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.373118 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"]
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.373367 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="da782152-b2cc-493f-9cdb-084684cd752f" containerName="openstackclient" containerID="cri-o://898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b" gracePeriod=2
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.381426 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"]
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.398110 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 18 08:27:00 crc kubenswrapper[4917]: E0318 08:27:00.398691 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da782152-b2cc-493f-9cdb-084684cd752f" containerName="openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.398775 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="da782152-b2cc-493f-9cdb-084684cd752f" containerName="openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.399069 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="da782152-b2cc-493f-9cdb-084684cd752f" containerName="openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.399883 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.430142 4917 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="da782152-b2cc-493f-9cdb-084684cd752f" podUID="f4465c4d-dda1-44ec-84d0-5a060ec3453f"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.458145 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.495880 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4465c4d-dda1-44ec-84d0-5a060ec3453f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.495990 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzj7x\" (UniqueName: \"kubernetes.io/projected/f4465c4d-dda1-44ec-84d0-5a060ec3453f-kube-api-access-jzj7x\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.496028 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f4465c4d-dda1-44ec-84d0-5a060ec3453f-openstack-config-secret\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.496047 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f4465c4d-dda1-44ec-84d0-5a060ec3453f-openstack-config\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.598199 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4465c4d-dda1-44ec-84d0-5a060ec3453f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.598397 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzj7x\" (UniqueName: \"kubernetes.io/projected/f4465c4d-dda1-44ec-84d0-5a060ec3453f-kube-api-access-jzj7x\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.598460 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f4465c4d-dda1-44ec-84d0-5a060ec3453f-openstack-config\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.598496 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f4465c4d-dda1-44ec-84d0-5a060ec3453f-openstack-config-secret\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.599686 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f4465c4d-dda1-44ec-84d0-5a060ec3453f-openstack-config\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.607226 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f4465c4d-dda1-44ec-84d0-5a060ec3453f-openstack-config-secret\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.621190 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4465c4d-dda1-44ec-84d0-5a060ec3453f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.625107 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzj7x\" (UniqueName: \"kubernetes.io/projected/f4465c4d-dda1-44ec-84d0-5a060ec3453f-kube-api-access-jzj7x\") pod \"openstackclient\" (UID: \"f4465c4d-dda1-44ec-84d0-5a060ec3453f\") " pod="openstack/openstackclient"
Mar 18 08:27:00 crc kubenswrapper[4917]: I0318 08:27:00.739814 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 08:27:01 crc kubenswrapper[4917]: I0318 08:27:01.307827 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 18 08:27:01 crc kubenswrapper[4917]: I0318 08:27:01.332128 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f4465c4d-dda1-44ec-84d0-5a060ec3453f","Type":"ContainerStarted","Data":"bcb2329c226c1fe173252ed83168fa212ca2f54ddf1c2c80a1a2b002ffa0151e"}
Mar 18 08:27:01 crc kubenswrapper[4917]: I0318 08:27:01.681744 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 18 08:27:01 crc kubenswrapper[4917]: I0318 08:27:01.682233 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="prometheus" containerID="cri-o://9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584" gracePeriod=600
Mar 18 08:27:01 crc kubenswrapper[4917]: I0318 08:27:01.682362 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="thanos-sidecar" containerID="cri-o://a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c" gracePeriod=600
Mar 18 08:27:01 crc kubenswrapper[4917]: I0318 08:27:01.682406 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="config-reloader" containerID="cri-o://f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed" gracePeriod=600
Mar 18 08:27:02 crc kubenswrapper[4917]: I0318 08:27:02.342795 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f4465c4d-dda1-44ec-84d0-5a060ec3453f","Type":"ContainerStarted","Data":"6030d6f74b196d84da990b759897781b52694ae8fc4e5c4d4528c35a6a655c61"}
Mar 18 08:27:02 crc kubenswrapper[4917]: I0318 08:27:02.469662 4917 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.1.165:9090/-/ready\": dial tcp 10.217.1.165:9090: connect: connection refused"
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.222235 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.230371 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.251657 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-combined-ca-bundle\") pod \"da782152-b2cc-493f-9cdb-084684cd752f\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.251711 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config-secret\") pod \"da782152-b2cc-493f-9cdb-084684cd752f\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.251829 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpjd7\" (UniqueName: \"kubernetes.io/projected/da782152-b2cc-493f-9cdb-084684cd752f-kube-api-access-kpjd7\") pod \"da782152-b2cc-493f-9cdb-084684cd752f\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.251921 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config\") pod \"da782152-b2cc-493f-9cdb-084684cd752f\" (UID: \"da782152-b2cc-493f-9cdb-084684cd752f\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.266654 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da782152-b2cc-493f-9cdb-084684cd752f-kube-api-access-kpjd7" (OuterVolumeSpecName: "kube-api-access-kpjd7") pod "da782152-b2cc-493f-9cdb-084684cd752f" (UID: "da782152-b2cc-493f-9cdb-084684cd752f"). InnerVolumeSpecName "kube-api-access-kpjd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.274727 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.2747097 podStartE2EDuration="3.2747097s" podCreationTimestamp="2026-03-18 08:27:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:27:02.368048271 +0000 UTC m=+6007.309202995" watchObservedRunningTime="2026-03-18 08:27:03.2747097 +0000 UTC m=+6008.215864414"
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.287208 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "da782152-b2cc-493f-9cdb-084684cd752f" (UID: "da782152-b2cc-493f-9cdb-084684cd752f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.325278 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "da782152-b2cc-493f-9cdb-084684cd752f" (UID: "da782152-b2cc-493f-9cdb-084684cd752f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.330742 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da782152-b2cc-493f-9cdb-084684cd752f" (UID: "da782152-b2cc-493f-9cdb-084684cd752f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355087 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f569-ea89-4735-b69d-d642fcee0295-config-out\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355179 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-thanos-prometheus-http-client-file\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355312 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-1\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355371 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-config\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355406 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-0\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355447 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-web-config\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355503 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-2\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355573 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-tls-assets\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355768 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.355833 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czb8h\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-kube-api-access-czb8h\") pod \"9fb7f569-ea89-4735-b69d-d642fcee0295\" (UID: \"9fb7f569-ea89-4735-b69d-d642fcee0295\") "
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.356382 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.356401 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.356418 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpjd7\" (UniqueName: \"kubernetes.io/projected/da782152-b2cc-493f-9cdb-084684cd752f-kube-api-access-kpjd7\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.356431 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/da782152-b2cc-493f-9cdb-084684cd752f-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.357271 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.357726 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.359457 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-config" (OuterVolumeSpecName: "config") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.359698 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.361201 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.363421 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb7f569-ea89-4735-b69d-d642fcee0295-config-out" (OuterVolumeSpecName: "config-out") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.364496 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-kube-api-access-czb8h" (OuterVolumeSpecName: "kube-api-access-czb8h") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "kube-api-access-czb8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.365485 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.382761 4917 generic.go:334] "Generic (PLEG): container finished" podID="da782152-b2cc-493f-9cdb-084684cd752f" containerID="898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b" exitCode=137 Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.382855 4917 scope.go:117] "RemoveContainer" containerID="898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.382889 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.386881 4917 generic.go:334] "Generic (PLEG): container finished" podID="21f0511a-4483-45ae-820c-36161e39a281" containerID="281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b" exitCode=0 Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.386970 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7m74" event={"ID":"21f0511a-4483-45ae-820c-36161e39a281","Type":"ContainerDied","Data":"281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b"} Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.392091 4917 generic.go:334] "Generic (PLEG): container finished" podID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerID="a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c" exitCode=0 Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.392116 4917 generic.go:334] "Generic (PLEG): container finished" podID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerID="f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed" exitCode=0 Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.392126 4917 generic.go:334] "Generic (PLEG): container finished" podID="9fb7f569-ea89-4735-b69d-d642fcee0295" 
containerID="9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584" exitCode=0 Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.392114 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.392266 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerDied","Data":"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c"} Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.393950 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerDied","Data":"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed"} Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.393968 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerDied","Data":"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584"} Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.393978 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9fb7f569-ea89-4735-b69d-d642fcee0295","Type":"ContainerDied","Data":"4f635c0a804a1e7638ca6da60ef6497ea229b8dc8eedadac1eda21e02beb53ed"} Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.395075 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "pvc-640cb196-aa1d-4376-b24a-a01d064c661f". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.408734 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-web-config" (OuterVolumeSpecName: "web-config") pod "9fb7f569-ea89-4735-b69d-d642fcee0295" (UID: "9fb7f569-ea89-4735-b69d-d642fcee0295"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.413228 4917 scope.go:117] "RemoveContainer" containerID="898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b" Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.417133 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b\": container with ID starting with 898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b not found: ID does not exist" containerID="898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.417181 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b"} err="failed to get container status \"898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b\": rpc error: code = NotFound desc = could not find container \"898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b\": container with ID starting with 898ac96c29a9e7000905e47db96fd44a039e129615402e8978c4570e0bd5c41b not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.417208 4917 scope.go:117] "RemoveContainer" containerID="a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.420115 4917 status_manager.go:861] "Pod was deleted 
and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="da782152-b2cc-493f-9cdb-084684cd752f" podUID="f4465c4d-dda1-44ec-84d0-5a060ec3453f" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.447898 4917 scope.go:117] "RemoveContainer" containerID="f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459197 4917 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459228 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459237 4917 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459248 4917 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-web-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459258 4917 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/9fb7f569-ea89-4735-b69d-d642fcee0295-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459267 4917 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459289 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") on node \"crc\" " Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459302 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czb8h\" (UniqueName: \"kubernetes.io/projected/9fb7f569-ea89-4735-b69d-d642fcee0295-kube-api-access-czb8h\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459310 4917 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f569-ea89-4735-b69d-d642fcee0295-config-out\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.459320 4917 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9fb7f569-ea89-4735-b69d-d642fcee0295-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.470509 4917 scope.go:117] "RemoveContainer" containerID="9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.493159 4917 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.493345 4917 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-640cb196-aa1d-4376-b24a-a01d064c661f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f") on node "crc" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.505261 4917 scope.go:117] "RemoveContainer" containerID="23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.524843 4917 scope.go:117] "RemoveContainer" containerID="a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c" Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.525259 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c\": container with ID starting with a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c not found: ID does not exist" containerID="a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.525296 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c"} err="failed to get container status \"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c\": rpc error: code = NotFound desc = could not find container \"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c\": container with ID starting with a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.525322 4917 scope.go:117] "RemoveContainer" containerID="f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed" Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.525679 4917 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed\": container with ID starting with f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed not found: ID does not exist" containerID="f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.525703 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed"} err="failed to get container status \"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed\": rpc error: code = NotFound desc = could not find container \"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed\": container with ID starting with f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.525715 4917 scope.go:117] "RemoveContainer" containerID="9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584" Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.526384 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584\": container with ID starting with 9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584 not found: ID does not exist" containerID="9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.526407 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584"} err="failed to get container status \"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584\": rpc error: code = NotFound desc = could not find container 
\"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584\": container with ID starting with 9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584 not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.526426 4917 scope.go:117] "RemoveContainer" containerID="23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6" Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.526773 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6\": container with ID starting with 23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6 not found: ID does not exist" containerID="23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.526796 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6"} err="failed to get container status \"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6\": rpc error: code = NotFound desc = could not find container \"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6\": container with ID starting with 23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6 not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.526811 4917 scope.go:117] "RemoveContainer" containerID="a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.527027 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c"} err="failed to get container status \"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c\": rpc error: code = NotFound desc = could not find 
container \"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c\": container with ID starting with a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.527048 4917 scope.go:117] "RemoveContainer" containerID="f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.527339 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed"} err="failed to get container status \"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed\": rpc error: code = NotFound desc = could not find container \"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed\": container with ID starting with f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.527361 4917 scope.go:117] "RemoveContainer" containerID="9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.528104 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584"} err="failed to get container status \"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584\": rpc error: code = NotFound desc = could not find container \"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584\": container with ID starting with 9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584 not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.528132 4917 scope.go:117] "RemoveContainer" containerID="23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.529326 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6"} err="failed to get container status \"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6\": rpc error: code = NotFound desc = could not find container \"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6\": container with ID starting with 23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6 not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.529346 4917 scope.go:117] "RemoveContainer" containerID="a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.529694 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c"} err="failed to get container status \"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c\": rpc error: code = NotFound desc = could not find container \"a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c\": container with ID starting with a9a42fb6c0a5e79bbad291877f06edf49f55544f3a80512202251ad6e7ef936c not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.529754 4917 scope.go:117] "RemoveContainer" containerID="f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.530034 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed"} err="failed to get container status \"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed\": rpc error: code = NotFound desc = could not find container \"f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed\": container with ID starting with 
f369da383fefa9d95dfc94c2846eaac18a784836dc9fc2db8f7f213f61acd1ed not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.530056 4917 scope.go:117] "RemoveContainer" containerID="9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.530299 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584"} err="failed to get container status \"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584\": rpc error: code = NotFound desc = could not find container \"9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584\": container with ID starting with 9d6b97c25866569ee427d6218f70f0dab9023a5daeb3e9fb390c560e72550584 not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.530319 4917 scope.go:117] "RemoveContainer" containerID="23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.530568 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6"} err="failed to get container status \"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6\": rpc error: code = NotFound desc = could not find container \"23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6\": container with ID starting with 23c01e2625d57045dc215266a6d00c1548ee174da4da4369a50abff121b292b6 not found: ID does not exist" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.560717 4917 reconciler_common.go:293] "Volume detached for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 
08:27:03.728942 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.740052 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.762536 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.763070 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="config-reloader" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.763091 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="config-reloader" Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.763131 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="prometheus" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.763141 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="prometheus" Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.763154 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="init-config-reloader" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.763163 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="init-config-reloader" Mar 18 08:27:03 crc kubenswrapper[4917]: E0318 08:27:03.763178 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="thanos-sidecar" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.763185 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" 
containerName="thanos-sidecar" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.763409 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="prometheus" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.763433 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="config-reloader" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.763454 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" containerName="thanos-sidecar" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.765786 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.771798 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.772017 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.772179 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.772411 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.772664 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.777222 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-57vl2" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 
08:27:03.777381 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.777492 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.785213 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.795769 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb7f569-ea89-4735-b69d-d642fcee0295" path="/var/lib/kubelet/pods/9fb7f569-ea89-4735-b69d-d642fcee0295/volumes" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.807465 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da782152-b2cc-493f-9cdb-084684cd752f" path="/var/lib/kubelet/pods/da782152-b2cc-493f-9cdb-084684cd752f/volumes" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.808441 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.865426 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.865491 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-1\") pod 
\"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.865538 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-config\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.865725 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd1a219f-72b5-4574-8f02-9a956ee8bb56-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.865766 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd1a219f-72b5-4574-8f02-9a956ee8bb56-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.865807 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.865852 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.865916 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.866033 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.866072 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt8w8\" (UniqueName: \"kubernetes.io/projected/fd1a219f-72b5-4574-8f02-9a956ee8bb56-kube-api-access-mt8w8\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.866137 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc 
kubenswrapper[4917]: I0318 08:27:03.866386 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.866476 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.968731 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.968838 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.969557 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.969815 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.969914 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.969984 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-config\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.970104 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd1a219f-72b5-4574-8f02-9a956ee8bb56-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.970191 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd1a219f-72b5-4574-8f02-9a956ee8bb56-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.970292 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.970382 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.970750 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.971702 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " 
pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.971297 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.971995 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.972172 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fd1a219f-72b5-4574-8f02-9a956ee8bb56-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.973152 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.973293 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt8w8\" (UniqueName: \"kubernetes.io/projected/fd1a219f-72b5-4574-8f02-9a956ee8bb56-kube-api-access-mt8w8\") pod \"prometheus-metric-storage-0\" (UID: 
\"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.974620 4917 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.974660 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0bb2d1890a357d19ffdcc97238b432893576203b5079f7b9f5a7bdd3cad7ae3a/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.974886 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.975966 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fd1a219f-72b5-4574-8f02-9a956ee8bb56-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.977261 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.978449 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.981449 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-config\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.984117 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/fd1a219f-72b5-4574-8f02-9a956ee8bb56-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.984329 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fd1a219f-72b5-4574-8f02-9a956ee8bb56-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:03 crc kubenswrapper[4917]: I0318 08:27:03.992060 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt8w8\" 
(UniqueName: \"kubernetes.io/projected/fd1a219f-72b5-4574-8f02-9a956ee8bb56-kube-api-access-mt8w8\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:04 crc kubenswrapper[4917]: I0318 08:27:04.043445 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-640cb196-aa1d-4376-b24a-a01d064c661f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-640cb196-aa1d-4376-b24a-a01d064c661f\") pod \"prometheus-metric-storage-0\" (UID: \"fd1a219f-72b5-4574-8f02-9a956ee8bb56\") " pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:04 crc kubenswrapper[4917]: I0318 08:27:04.085764 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:04 crc kubenswrapper[4917]: I0318 08:27:04.403144 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7m74" event={"ID":"21f0511a-4483-45ae-820c-36161e39a281","Type":"ContainerStarted","Data":"ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e"} Mar 18 08:27:04 crc kubenswrapper[4917]: I0318 08:27:04.429654 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-w7m74" podStartSLOduration=1.780120073 podStartE2EDuration="8.429629439s" podCreationTimestamp="2026-03-18 08:26:56 +0000 UTC" firstStartedPulling="2026-03-18 08:26:57.294935156 +0000 UTC m=+6002.236089880" lastFinishedPulling="2026-03-18 08:27:03.944444492 +0000 UTC m=+6008.885599246" observedRunningTime="2026-03-18 08:27:04.420914607 +0000 UTC m=+6009.362069331" watchObservedRunningTime="2026-03-18 08:27:04.429629439 +0000 UTC m=+6009.370784173" Mar 18 08:27:04 crc kubenswrapper[4917]: I0318 08:27:04.586118 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 
08:27:05.054311 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.057624 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.060494 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.060964 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.081572 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.095328 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-scripts\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.095372 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-run-httpd\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.095421 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.095449 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-rxw4d\" (UniqueName: \"kubernetes.io/projected/74564dae-2d5c-4b3e-9132-eb991eedb26c-kube-api-access-rxw4d\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.095567 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-config-data\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.095934 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-log-httpd\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.096007 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.197679 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-config-data\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.197789 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-log-httpd\") pod \"ceilometer-0\" (UID: 
\"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.197832 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.197930 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-scripts\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.197950 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-run-httpd\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.198001 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.198035 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxw4d\" (UniqueName: \"kubernetes.io/projected/74564dae-2d5c-4b3e-9132-eb991eedb26c-kube-api-access-rxw4d\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.198524 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-log-httpd\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.198528 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-run-httpd\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.202196 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-scripts\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.204117 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-config-data\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.205197 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.215170 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.217539 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxw4d\" (UniqueName: \"kubernetes.io/projected/74564dae-2d5c-4b3e-9132-eb991eedb26c-kube-api-access-rxw4d\") pod \"ceilometer-0\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") " pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.383476 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.415920 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fd1a219f-72b5-4574-8f02-9a956ee8bb56","Type":"ContainerStarted","Data":"601ebad9a4ce3bc62ecbf5582c388e2a51aa337164aba79af1df2c8f01da83c5"} Mar 18 08:27:05 crc kubenswrapper[4917]: I0318 08:27:05.705061 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:05 crc kubenswrapper[4917]: W0318 08:27:05.717054 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74564dae_2d5c_4b3e_9132_eb991eedb26c.slice/crio-d34a30d52ddc34ec42759db254e32b916c0284ab143e339c49ae1f2be22c3976 WatchSource:0}: Error finding container d34a30d52ddc34ec42759db254e32b916c0284ab143e339c49ae1f2be22c3976: Status 404 returned error can't find the container with id d34a30d52ddc34ec42759db254e32b916c0284ab143e339c49ae1f2be22c3976 Mar 18 08:27:06 crc kubenswrapper[4917]: I0318 08:27:06.424927 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerStarted","Data":"d34a30d52ddc34ec42759db254e32b916c0284ab143e339c49ae1f2be22c3976"} Mar 18 08:27:06 crc kubenswrapper[4917]: I0318 08:27:06.482206 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-w7m74" Mar 18 08:27:06 crc kubenswrapper[4917]: 
I0318 08:27:06.482627 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-w7m74" Mar 18 08:27:07 crc kubenswrapper[4917]: I0318 08:27:07.543329 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w7m74" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="registry-server" probeResult="failure" output=< Mar 18 08:27:07 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:27:07 crc kubenswrapper[4917]: > Mar 18 08:27:08 crc kubenswrapper[4917]: I0318 08:27:08.453832 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fd1a219f-72b5-4574-8f02-9a956ee8bb56","Type":"ContainerStarted","Data":"2959e5bcb6a471e3b098a0ef316268eb89e742192ca91632a3d39bf7a9991860"} Mar 18 08:27:10 crc kubenswrapper[4917]: I0318 08:27:10.676664 4917 scope.go:117] "RemoveContainer" containerID="ed0dcf577a24c5b067e176deab67828293fb9859ba45d8f2690026c9ef0bcd59" Mar 18 08:27:10 crc kubenswrapper[4917]: I0318 08:27:10.709275 4917 scope.go:117] "RemoveContainer" containerID="8f4b0c6cbcbe8cb448c0e814455a023e956292e0a6331cf93393314cdaf04871" Mar 18 08:27:10 crc kubenswrapper[4917]: I0318 08:27:10.752145 4917 scope.go:117] "RemoveContainer" containerID="88bfe776a1fd558423ef36cb38306302c772f10688a8c666d523e33c02f30d3c" Mar 18 08:27:11 crc kubenswrapper[4917]: I0318 08:27:11.485159 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerStarted","Data":"a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7"} Mar 18 08:27:12 crc kubenswrapper[4917]: I0318 08:27:12.496112 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerStarted","Data":"abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9"} Mar 18 08:27:13 crc kubenswrapper[4917]: I0318 08:27:13.506959 4917 generic.go:334] "Generic (PLEG): container finished" podID="fd1a219f-72b5-4574-8f02-9a956ee8bb56" containerID="2959e5bcb6a471e3b098a0ef316268eb89e742192ca91632a3d39bf7a9991860" exitCode=0 Mar 18 08:27:13 crc kubenswrapper[4917]: I0318 08:27:13.507100 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fd1a219f-72b5-4574-8f02-9a956ee8bb56","Type":"ContainerDied","Data":"2959e5bcb6a471e3b098a0ef316268eb89e742192ca91632a3d39bf7a9991860"} Mar 18 08:27:13 crc kubenswrapper[4917]: I0318 08:27:13.514602 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerStarted","Data":"e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec"} Mar 18 08:27:14 crc kubenswrapper[4917]: I0318 08:27:14.528198 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fd1a219f-72b5-4574-8f02-9a956ee8bb56","Type":"ContainerStarted","Data":"c657daebf728d1b852c2a6180a5758c146af109f85cc333ea3c94792736a8c79"} Mar 18 08:27:15 crc kubenswrapper[4917]: I0318 08:27:15.543727 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerStarted","Data":"7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b"} Mar 18 08:27:15 crc kubenswrapper[4917]: I0318 08:27:15.544383 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 08:27:17 crc kubenswrapper[4917]: I0318 08:27:17.533097 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-w7m74" 
podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="registry-server" probeResult="failure" output=< Mar 18 08:27:17 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:27:17 crc kubenswrapper[4917]: > Mar 18 08:27:17 crc kubenswrapper[4917]: I0318 08:27:17.571783 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fd1a219f-72b5-4574-8f02-9a956ee8bb56","Type":"ContainerStarted","Data":"5e24511fc12e4ff7c423540c2efe6320573bf5e76ee2db5d02613439bdc61c18"} Mar 18 08:27:17 crc kubenswrapper[4917]: I0318 08:27:17.571843 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"fd1a219f-72b5-4574-8f02-9a956ee8bb56","Type":"ContainerStarted","Data":"4bbba2d8c5648b7bfd3f161e54457a26e9bc3ac2e0df2d817973b8665a66e73e"} Mar 18 08:27:17 crc kubenswrapper[4917]: I0318 08:27:17.615912 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.906660017 podStartE2EDuration="12.615834435s" podCreationTimestamp="2026-03-18 08:27:05 +0000 UTC" firstStartedPulling="2026-03-18 08:27:05.719385349 +0000 UTC m=+6010.660540063" lastFinishedPulling="2026-03-18 08:27:14.428559767 +0000 UTC m=+6019.369714481" observedRunningTime="2026-03-18 08:27:15.583454556 +0000 UTC m=+6020.524609290" watchObservedRunningTime="2026-03-18 08:27:17.615834435 +0000 UTC m=+6022.556989159" Mar 18 08:27:17 crc kubenswrapper[4917]: I0318 08:27:17.626765 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=14.626731429 podStartE2EDuration="14.626731429s" podCreationTimestamp="2026-03-18 08:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:27:17.610179218 +0000 UTC m=+6022.551333972" 
watchObservedRunningTime="2026-03-18 08:27:17.626731429 +0000 UTC m=+6022.567886183" Mar 18 08:27:19 crc kubenswrapper[4917]: I0318 08:27:19.086180 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:19 crc kubenswrapper[4917]: I0318 08:27:19.086231 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:19 crc kubenswrapper[4917]: I0318 08:27:19.093049 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:19 crc kubenswrapper[4917]: I0318 08:27:19.601779 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.194428 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-gz4lz"] Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.196847 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.210643 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-f692-account-create-update-4l7xl"] Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.211851 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.214735 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-gz4lz"] Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.215855 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.244972 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-f692-account-create-update-4l7xl"] Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.348169 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfq5m\" (UniqueName: \"kubernetes.io/projected/eac39fa3-e042-46f8-a673-653191c0b588-kube-api-access-dfq5m\") pod \"aodh-f692-account-create-update-4l7xl\" (UID: \"eac39fa3-e042-46f8-a673-653191c0b588\") " pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.348417 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca31488-cf6b-48a6-94f4-830e7a65fecc-operator-scripts\") pod \"aodh-db-create-gz4lz\" (UID: \"dca31488-cf6b-48a6-94f4-830e7a65fecc\") " pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.348550 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb7zv\" (UniqueName: \"kubernetes.io/projected/dca31488-cf6b-48a6-94f4-830e7a65fecc-kube-api-access-tb7zv\") pod \"aodh-db-create-gz4lz\" (UID: \"dca31488-cf6b-48a6-94f4-830e7a65fecc\") " pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.348776 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/eac39fa3-e042-46f8-a673-653191c0b588-operator-scripts\") pod \"aodh-f692-account-create-update-4l7xl\" (UID: \"eac39fa3-e042-46f8-a673-653191c0b588\") " pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.450826 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfq5m\" (UniqueName: \"kubernetes.io/projected/eac39fa3-e042-46f8-a673-653191c0b588-kube-api-access-dfq5m\") pod \"aodh-f692-account-create-update-4l7xl\" (UID: \"eac39fa3-e042-46f8-a673-653191c0b588\") " pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.450881 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca31488-cf6b-48a6-94f4-830e7a65fecc-operator-scripts\") pod \"aodh-db-create-gz4lz\" (UID: \"dca31488-cf6b-48a6-94f4-830e7a65fecc\") " pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.450946 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb7zv\" (UniqueName: \"kubernetes.io/projected/dca31488-cf6b-48a6-94f4-830e7a65fecc-kube-api-access-tb7zv\") pod \"aodh-db-create-gz4lz\" (UID: \"dca31488-cf6b-48a6-94f4-830e7a65fecc\") " pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.450982 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eac39fa3-e042-46f8-a673-653191c0b588-operator-scripts\") pod \"aodh-f692-account-create-update-4l7xl\" (UID: \"eac39fa3-e042-46f8-a673-653191c0b588\") " pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.451976 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/dca31488-cf6b-48a6-94f4-830e7a65fecc-operator-scripts\") pod \"aodh-db-create-gz4lz\" (UID: \"dca31488-cf6b-48a6-94f4-830e7a65fecc\") " pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.452098 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eac39fa3-e042-46f8-a673-653191c0b588-operator-scripts\") pod \"aodh-f692-account-create-update-4l7xl\" (UID: \"eac39fa3-e042-46f8-a673-653191c0b588\") " pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.476227 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfq5m\" (UniqueName: \"kubernetes.io/projected/eac39fa3-e042-46f8-a673-653191c0b588-kube-api-access-dfq5m\") pod \"aodh-f692-account-create-update-4l7xl\" (UID: \"eac39fa3-e042-46f8-a673-653191c0b588\") " pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.476239 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb7zv\" (UniqueName: \"kubernetes.io/projected/dca31488-cf6b-48a6-94f4-830e7a65fecc-kube-api-access-tb7zv\") pod \"aodh-db-create-gz4lz\" (UID: \"dca31488-cf6b-48a6-94f4-830e7a65fecc\") " pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.536287 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:21 crc kubenswrapper[4917]: I0318 08:27:21.545899 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:22 crc kubenswrapper[4917]: I0318 08:27:22.089924 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-gz4lz"] Mar 18 08:27:22 crc kubenswrapper[4917]: W0318 08:27:22.096552 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddca31488_cf6b_48a6_94f4_830e7a65fecc.slice/crio-7c4426017b6fd0022b336464a267b58f657fa5134c40d2f66bd9a2465b609819 WatchSource:0}: Error finding container 7c4426017b6fd0022b336464a267b58f657fa5134c40d2f66bd9a2465b609819: Status 404 returned error can't find the container with id 7c4426017b6fd0022b336464a267b58f657fa5134c40d2f66bd9a2465b609819 Mar 18 08:27:22 crc kubenswrapper[4917]: I0318 08:27:22.172506 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-f692-account-create-update-4l7xl"] Mar 18 08:27:22 crc kubenswrapper[4917]: W0318 08:27:22.172507 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeac39fa3_e042_46f8_a673_653191c0b588.slice/crio-cef5910ba42ea5a1ccd19a686f9f0399e7664c7f125c430b1449e2cd9f2cbed8 WatchSource:0}: Error finding container cef5910ba42ea5a1ccd19a686f9f0399e7664c7f125c430b1449e2cd9f2cbed8: Status 404 returned error can't find the container with id cef5910ba42ea5a1ccd19a686f9f0399e7664c7f125c430b1449e2cd9f2cbed8 Mar 18 08:27:22 crc kubenswrapper[4917]: I0318 08:27:22.631624 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gz4lz" event={"ID":"dca31488-cf6b-48a6-94f4-830e7a65fecc","Type":"ContainerStarted","Data":"c24c3c39aa319b019ca3da58ed6a1b3f24ef525292b15bf978f76eb634bc8e0c"} Mar 18 08:27:22 crc kubenswrapper[4917]: I0318 08:27:22.631662 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gz4lz" 
event={"ID":"dca31488-cf6b-48a6-94f4-830e7a65fecc","Type":"ContainerStarted","Data":"7c4426017b6fd0022b336464a267b58f657fa5134c40d2f66bd9a2465b609819"} Mar 18 08:27:22 crc kubenswrapper[4917]: I0318 08:27:22.633830 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-f692-account-create-update-4l7xl" event={"ID":"eac39fa3-e042-46f8-a673-653191c0b588","Type":"ContainerStarted","Data":"a77cc6e18818ef3c07a0b9b6677f8ee986aa2713960c8607533e0303a355f71e"} Mar 18 08:27:22 crc kubenswrapper[4917]: I0318 08:27:22.633876 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-f692-account-create-update-4l7xl" event={"ID":"eac39fa3-e042-46f8-a673-653191c0b588","Type":"ContainerStarted","Data":"cef5910ba42ea5a1ccd19a686f9f0399e7664c7f125c430b1449e2cd9f2cbed8"} Mar 18 08:27:22 crc kubenswrapper[4917]: I0318 08:27:22.652918 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-gz4lz" podStartSLOduration=1.652899696 podStartE2EDuration="1.652899696s" podCreationTimestamp="2026-03-18 08:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:27:22.643865467 +0000 UTC m=+6027.585020201" watchObservedRunningTime="2026-03-18 08:27:22.652899696 +0000 UTC m=+6027.594054400" Mar 18 08:27:22 crc kubenswrapper[4917]: I0318 08:27:22.670179 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-f692-account-create-update-4l7xl" podStartSLOduration=1.670161164 podStartE2EDuration="1.670161164s" podCreationTimestamp="2026-03-18 08:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:27:22.66253287 +0000 UTC m=+6027.603687594" watchObservedRunningTime="2026-03-18 08:27:22.670161164 +0000 UTC m=+6027.611315878" Mar 18 08:27:23 crc kubenswrapper[4917]: I0318 
08:27:23.646650 4917 generic.go:334] "Generic (PLEG): container finished" podID="dca31488-cf6b-48a6-94f4-830e7a65fecc" containerID="c24c3c39aa319b019ca3da58ed6a1b3f24ef525292b15bf978f76eb634bc8e0c" exitCode=0 Mar 18 08:27:23 crc kubenswrapper[4917]: I0318 08:27:23.646705 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gz4lz" event={"ID":"dca31488-cf6b-48a6-94f4-830e7a65fecc","Type":"ContainerDied","Data":"c24c3c39aa319b019ca3da58ed6a1b3f24ef525292b15bf978f76eb634bc8e0c"} Mar 18 08:27:23 crc kubenswrapper[4917]: I0318 08:27:23.648861 4917 generic.go:334] "Generic (PLEG): container finished" podID="eac39fa3-e042-46f8-a673-653191c0b588" containerID="a77cc6e18818ef3c07a0b9b6677f8ee986aa2713960c8607533e0303a355f71e" exitCode=0 Mar 18 08:27:23 crc kubenswrapper[4917]: I0318 08:27:23.648913 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-f692-account-create-update-4l7xl" event={"ID":"eac39fa3-e042-46f8-a673-653191c0b588","Type":"ContainerDied","Data":"a77cc6e18818ef3c07a0b9b6677f8ee986aa2713960c8607533e0303a355f71e"} Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.127172 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.136101 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.230743 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca31488-cf6b-48a6-94f4-830e7a65fecc-operator-scripts\") pod \"dca31488-cf6b-48a6-94f4-830e7a65fecc\" (UID: \"dca31488-cf6b-48a6-94f4-830e7a65fecc\") " Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.230872 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eac39fa3-e042-46f8-a673-653191c0b588-operator-scripts\") pod \"eac39fa3-e042-46f8-a673-653191c0b588\" (UID: \"eac39fa3-e042-46f8-a673-653191c0b588\") " Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.231103 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfq5m\" (UniqueName: \"kubernetes.io/projected/eac39fa3-e042-46f8-a673-653191c0b588-kube-api-access-dfq5m\") pod \"eac39fa3-e042-46f8-a673-653191c0b588\" (UID: \"eac39fa3-e042-46f8-a673-653191c0b588\") " Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.231164 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb7zv\" (UniqueName: \"kubernetes.io/projected/dca31488-cf6b-48a6-94f4-830e7a65fecc-kube-api-access-tb7zv\") pod \"dca31488-cf6b-48a6-94f4-830e7a65fecc\" (UID: \"dca31488-cf6b-48a6-94f4-830e7a65fecc\") " Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.231706 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dca31488-cf6b-48a6-94f4-830e7a65fecc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dca31488-cf6b-48a6-94f4-830e7a65fecc" (UID: "dca31488-cf6b-48a6-94f4-830e7a65fecc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.231717 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eac39fa3-e042-46f8-a673-653191c0b588-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eac39fa3-e042-46f8-a673-653191c0b588" (UID: "eac39fa3-e042-46f8-a673-653191c0b588"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.232085 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dca31488-cf6b-48a6-94f4-830e7a65fecc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.232119 4917 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eac39fa3-e042-46f8-a673-653191c0b588-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.247544 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eac39fa3-e042-46f8-a673-653191c0b588-kube-api-access-dfq5m" (OuterVolumeSpecName: "kube-api-access-dfq5m") pod "eac39fa3-e042-46f8-a673-653191c0b588" (UID: "eac39fa3-e042-46f8-a673-653191c0b588"). InnerVolumeSpecName "kube-api-access-dfq5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.247624 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dca31488-cf6b-48a6-94f4-830e7a65fecc-kube-api-access-tb7zv" (OuterVolumeSpecName: "kube-api-access-tb7zv") pod "dca31488-cf6b-48a6-94f4-830e7a65fecc" (UID: "dca31488-cf6b-48a6-94f4-830e7a65fecc"). InnerVolumeSpecName "kube-api-access-tb7zv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.334939 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfq5m\" (UniqueName: \"kubernetes.io/projected/eac39fa3-e042-46f8-a673-653191c0b588-kube-api-access-dfq5m\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.334997 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb7zv\" (UniqueName: \"kubernetes.io/projected/dca31488-cf6b-48a6-94f4-830e7a65fecc-kube-api-access-tb7zv\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.681199 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-gz4lz" event={"ID":"dca31488-cf6b-48a6-94f4-830e7a65fecc","Type":"ContainerDied","Data":"7c4426017b6fd0022b336464a267b58f657fa5134c40d2f66bd9a2465b609819"} Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.681241 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c4426017b6fd0022b336464a267b58f657fa5134c40d2f66bd9a2465b609819" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.681270 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-gz4lz" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.683899 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-f692-account-create-update-4l7xl" event={"ID":"eac39fa3-e042-46f8-a673-653191c0b588","Type":"ContainerDied","Data":"cef5910ba42ea5a1ccd19a686f9f0399e7664c7f125c430b1449e2cd9f2cbed8"} Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.683919 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cef5910ba42ea5a1ccd19a686f9f0399e7664c7f125c430b1449e2cd9f2cbed8" Mar 18 08:27:25 crc kubenswrapper[4917]: I0318 08:27:25.684040 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-f692-account-create-update-4l7xl" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.061518 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-wbgr8"] Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.073548 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-wbgr8"] Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.476995 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-pjhws"] Mar 18 08:27:26 crc kubenswrapper[4917]: E0318 08:27:26.477409 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dca31488-cf6b-48a6-94f4-830e7a65fecc" containerName="mariadb-database-create" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.477420 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dca31488-cf6b-48a6-94f4-830e7a65fecc" containerName="mariadb-database-create" Mar 18 08:27:26 crc kubenswrapper[4917]: E0318 08:27:26.477433 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eac39fa3-e042-46f8-a673-653191c0b588" containerName="mariadb-account-create-update" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.477440 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="eac39fa3-e042-46f8-a673-653191c0b588" containerName="mariadb-account-create-update" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.477625 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dca31488-cf6b-48a6-94f4-830e7a65fecc" containerName="mariadb-database-create" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.477642 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="eac39fa3-e042-46f8-a673-653191c0b588" containerName="mariadb-account-create-update" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.478269 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.481817 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.482804 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.482870 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.491619 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-xwrh9" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.491734 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-pjhws"] Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.562905 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-combined-ca-bundle\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.563051 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5tqz\" (UniqueName: \"kubernetes.io/projected/f3609520-f250-40d8-8444-360d760da047-kube-api-access-r5tqz\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.563085 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-config-data\") pod \"aodh-db-sync-pjhws\" (UID: 
\"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.563117 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-scripts\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.569216 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-w7m74" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.612674 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-w7m74" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.665168 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-config-data\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.665242 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-scripts\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.665318 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-combined-ca-bundle\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.665431 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5tqz\" (UniqueName: \"kubernetes.io/projected/f3609520-f250-40d8-8444-360d760da047-kube-api-access-r5tqz\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.670276 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-combined-ca-bundle\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.670270 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-scripts\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.671379 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-config-data\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.679875 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5tqz\" (UniqueName: \"kubernetes.io/projected/f3609520-f250-40d8-8444-360d760da047-kube-api-access-r5tqz\") pod \"aodh-db-sync-pjhws\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:26 crc kubenswrapper[4917]: I0318 08:27:26.833384 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:27 crc kubenswrapper[4917]: I0318 08:27:27.318171 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w7m74"] Mar 18 08:27:27 crc kubenswrapper[4917]: I0318 08:27:27.340671 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-pjhws"] Mar 18 08:27:27 crc kubenswrapper[4917]: I0318 08:27:27.741764 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-w7m74" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="registry-server" containerID="cri-o://ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e" gracePeriod=2 Mar 18 08:27:27 crc kubenswrapper[4917]: I0318 08:27:27.742057 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pjhws" event={"ID":"f3609520-f250-40d8-8444-360d760da047","Type":"ContainerStarted","Data":"e7c7a55fe3f727c46fba963bee7ddcce0fb52853979cadccb7de764a389dcb48"} Mar 18 08:27:27 crc kubenswrapper[4917]: I0318 08:27:27.788251 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a76d7db-f61e-45b4-b91d-9f81fb1e20b5" path="/var/lib/kubelet/pods/8a76d7db-f61e-45b4-b91d-9f81fb1e20b5/volumes" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.210519 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w7m74" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.302233 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-catalog-content\") pod \"21f0511a-4483-45ae-820c-36161e39a281\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.302288 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-utilities\") pod \"21f0511a-4483-45ae-820c-36161e39a281\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.302326 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lns7p\" (UniqueName: \"kubernetes.io/projected/21f0511a-4483-45ae-820c-36161e39a281-kube-api-access-lns7p\") pod \"21f0511a-4483-45ae-820c-36161e39a281\" (UID: \"21f0511a-4483-45ae-820c-36161e39a281\") " Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.303094 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-utilities" (OuterVolumeSpecName: "utilities") pod "21f0511a-4483-45ae-820c-36161e39a281" (UID: "21f0511a-4483-45ae-820c-36161e39a281"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.307099 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f0511a-4483-45ae-820c-36161e39a281-kube-api-access-lns7p" (OuterVolumeSpecName: "kube-api-access-lns7p") pod "21f0511a-4483-45ae-820c-36161e39a281" (UID: "21f0511a-4483-45ae-820c-36161e39a281"). InnerVolumeSpecName "kube-api-access-lns7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.405166 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.405195 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lns7p\" (UniqueName: \"kubernetes.io/projected/21f0511a-4483-45ae-820c-36161e39a281-kube-api-access-lns7p\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.425265 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21f0511a-4483-45ae-820c-36161e39a281" (UID: "21f0511a-4483-45ae-820c-36161e39a281"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.507730 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21f0511a-4483-45ae-820c-36161e39a281-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.755417 4917 generic.go:334] "Generic (PLEG): container finished" podID="21f0511a-4483-45ae-820c-36161e39a281" containerID="ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e" exitCode=0 Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.755450 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-w7m74" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.755475 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7m74" event={"ID":"21f0511a-4483-45ae-820c-36161e39a281","Type":"ContainerDied","Data":"ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e"} Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.755536 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-w7m74" event={"ID":"21f0511a-4483-45ae-820c-36161e39a281","Type":"ContainerDied","Data":"67fc15bc5cf203f876568205ff3796ab9c18e2ce00f4cb607e4ea8c5a2ab0244"} Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.755561 4917 scope.go:117] "RemoveContainer" containerID="ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.796285 4917 scope.go:117] "RemoveContainer" containerID="281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.808115 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-w7m74"] Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.825268 4917 scope.go:117] "RemoveContainer" containerID="95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.829012 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-w7m74"] Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.857287 4917 scope.go:117] "RemoveContainer" containerID="ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e" Mar 18 08:27:28 crc kubenswrapper[4917]: E0318 08:27:28.857766 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e\": container with ID starting with ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e not found: ID does not exist" containerID="ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.857812 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e"} err="failed to get container status \"ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e\": rpc error: code = NotFound desc = could not find container \"ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e\": container with ID starting with ef215904f2762733220441053fd4e7931838bc5a7ffa73bb3471a0cae0cf1d0e not found: ID does not exist" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.857835 4917 scope.go:117] "RemoveContainer" containerID="281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b" Mar 18 08:27:28 crc kubenswrapper[4917]: E0318 08:27:28.858508 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b\": container with ID starting with 281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b not found: ID does not exist" containerID="281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.858555 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b"} err="failed to get container status \"281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b\": rpc error: code = NotFound desc = could not find container \"281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b\": container with ID 
starting with 281ec8829d0fdc63f399d243a65a75fe39941e52145c84163dbe4a085063a91b not found: ID does not exist" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.858605 4917 scope.go:117] "RemoveContainer" containerID="95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae" Mar 18 08:27:28 crc kubenswrapper[4917]: E0318 08:27:28.858982 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae\": container with ID starting with 95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae not found: ID does not exist" containerID="95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae" Mar 18 08:27:28 crc kubenswrapper[4917]: I0318 08:27:28.859026 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae"} err="failed to get container status \"95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae\": rpc error: code = NotFound desc = could not find container \"95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae\": container with ID starting with 95121a93c4e5393c904e8f65bfe09d86025b2525c0df92919cc7bf34d9aed6ae not found: ID does not exist" Mar 18 08:27:29 crc kubenswrapper[4917]: I0318 08:27:29.788958 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f0511a-4483-45ae-820c-36161e39a281" path="/var/lib/kubelet/pods/21f0511a-4483-45ae-820c-36161e39a281/volumes" Mar 18 08:27:32 crc kubenswrapper[4917]: I0318 08:27:32.812092 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pjhws" event={"ID":"f3609520-f250-40d8-8444-360d760da047","Type":"ContainerStarted","Data":"741caf756728c20fa11c8bd13e56644318a18214db39bb7cf7939ba51a7d80af"} Mar 18 08:27:32 crc kubenswrapper[4917]: I0318 08:27:32.834346 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-pjhws" podStartSLOduration=2.031375128 podStartE2EDuration="6.83432126s" podCreationTimestamp="2026-03-18 08:27:26 +0000 UTC" firstStartedPulling="2026-03-18 08:27:27.349298505 +0000 UTC m=+6032.290453209" lastFinishedPulling="2026-03-18 08:27:32.152244627 +0000 UTC m=+6037.093399341" observedRunningTime="2026-03-18 08:27:32.829633236 +0000 UTC m=+6037.770788000" watchObservedRunningTime="2026-03-18 08:27:32.83432126 +0000 UTC m=+6037.775476014" Mar 18 08:27:32 crc kubenswrapper[4917]: I0318 08:27:32.929125 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:27:32 crc kubenswrapper[4917]: I0318 08:27:32.929171 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:27:34 crc kubenswrapper[4917]: I0318 08:27:34.839309 4917 generic.go:334] "Generic (PLEG): container finished" podID="f3609520-f250-40d8-8444-360d760da047" containerID="741caf756728c20fa11c8bd13e56644318a18214db39bb7cf7939ba51a7d80af" exitCode=0 Mar 18 08:27:34 crc kubenswrapper[4917]: I0318 08:27:34.839466 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pjhws" event={"ID":"f3609520-f250-40d8-8444-360d760da047","Type":"ContainerDied","Data":"741caf756728c20fa11c8bd13e56644318a18214db39bb7cf7939ba51a7d80af"} Mar 18 08:27:35 crc kubenswrapper[4917]: I0318 08:27:35.398561 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ceilometer-0" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.283047 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.368521 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-scripts\") pod \"f3609520-f250-40d8-8444-360d760da047\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.368801 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5tqz\" (UniqueName: \"kubernetes.io/projected/f3609520-f250-40d8-8444-360d760da047-kube-api-access-r5tqz\") pod \"f3609520-f250-40d8-8444-360d760da047\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.368908 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-combined-ca-bundle\") pod \"f3609520-f250-40d8-8444-360d760da047\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.368931 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-config-data\") pod \"f3609520-f250-40d8-8444-360d760da047\" (UID: \"f3609520-f250-40d8-8444-360d760da047\") " Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.374072 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3609520-f250-40d8-8444-360d760da047-kube-api-access-r5tqz" (OuterVolumeSpecName: "kube-api-access-r5tqz") pod "f3609520-f250-40d8-8444-360d760da047" (UID: "f3609520-f250-40d8-8444-360d760da047"). 
InnerVolumeSpecName "kube-api-access-r5tqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.374103 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-scripts" (OuterVolumeSpecName: "scripts") pod "f3609520-f250-40d8-8444-360d760da047" (UID: "f3609520-f250-40d8-8444-360d760da047"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.396339 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-config-data" (OuterVolumeSpecName: "config-data") pod "f3609520-f250-40d8-8444-360d760da047" (UID: "f3609520-f250-40d8-8444-360d760da047"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.402885 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3609520-f250-40d8-8444-360d760da047" (UID: "f3609520-f250-40d8-8444-360d760da047"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.470733 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5tqz\" (UniqueName: \"kubernetes.io/projected/f3609520-f250-40d8-8444-360d760da047-kube-api-access-r5tqz\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.470795 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.470808 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.470820 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3609520-f250-40d8-8444-360d760da047-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.860066 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-pjhws" event={"ID":"f3609520-f250-40d8-8444-360d760da047","Type":"ContainerDied","Data":"e7c7a55fe3f727c46fba963bee7ddcce0fb52853979cadccb7de764a389dcb48"} Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.860691 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c7a55fe3f727c46fba963bee7ddcce0fb52853979cadccb7de764a389dcb48" Mar 18 08:27:36 crc kubenswrapper[4917]: I0318 08:27:36.860516 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-pjhws" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.253054 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.253619 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c" containerName="kube-state-metrics" containerID="cri-o://498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4" gracePeriod=30 Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.719520 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.742449 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2hsb\" (UniqueName: \"kubernetes.io/projected/d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c-kube-api-access-c2hsb\") pod \"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c\" (UID: \"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c\") " Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.749405 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c-kube-api-access-c2hsb" (OuterVolumeSpecName: "kube-api-access-c2hsb") pod "d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c" (UID: "d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c"). InnerVolumeSpecName "kube-api-access-c2hsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.846639 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2hsb\" (UniqueName: \"kubernetes.io/projected/d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c-kube-api-access-c2hsb\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.888148 4917 generic.go:334] "Generic (PLEG): container finished" podID="d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c" containerID="498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4" exitCode=2 Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.888194 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c","Type":"ContainerDied","Data":"498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4"} Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.888225 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c","Type":"ContainerDied","Data":"6f1162f90257725167a737fbfa67bc17f17cafe9198f4f7ce624b9c3bb86cdf9"} Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.888246 4917 scope.go:117] "RemoveContainer" containerID="498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.888255 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.920356 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.933008 4917 scope.go:117] "RemoveContainer" containerID="498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4" Mar 18 08:27:39 crc kubenswrapper[4917]: E0318 08:27:39.933343 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4\": container with ID starting with 498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4 not found: ID does not exist" containerID="498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.933384 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4"} err="failed to get container status \"498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4\": rpc error: code = NotFound desc = could not find container \"498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4\": container with ID starting with 498accb3f4a02ae0b39946107c7c241d045f391106af24b20ed996eba9a84fc4 not found: ID does not exist" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.934650 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.947701 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:27:39 crc kubenswrapper[4917]: E0318 08:27:39.948222 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="registry-server" Mar 18 08:27:39 crc 
kubenswrapper[4917]: I0318 08:27:39.948246 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="registry-server" Mar 18 08:27:39 crc kubenswrapper[4917]: E0318 08:27:39.948271 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="extract-utilities" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.948281 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="extract-utilities" Mar 18 08:27:39 crc kubenswrapper[4917]: E0318 08:27:39.948299 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c" containerName="kube-state-metrics" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.948308 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c" containerName="kube-state-metrics" Mar 18 08:27:39 crc kubenswrapper[4917]: E0318 08:27:39.948326 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3609520-f250-40d8-8444-360d760da047" containerName="aodh-db-sync" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.948334 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3609520-f250-40d8-8444-360d760da047" containerName="aodh-db-sync" Mar 18 08:27:39 crc kubenswrapper[4917]: E0318 08:27:39.948358 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="extract-content" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.948366 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="extract-content" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.948624 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3609520-f250-40d8-8444-360d760da047" containerName="aodh-db-sync" Mar 18 08:27:39 crc 
kubenswrapper[4917]: I0318 08:27:39.948645 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f0511a-4483-45ae-820c-36161e39a281" containerName="registry-server" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.948665 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c" containerName="kube-state-metrics" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.949483 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.951744 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.952100 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 18 08:27:39 crc kubenswrapper[4917]: I0318 08:27:39.963405 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.049955 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.050077 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.050117 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr7xv\" (UniqueName: \"kubernetes.io/projected/6639b737-2219-4fdb-9a93-4e0155460477-kube-api-access-lr7xv\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.050135 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.153280 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.154016 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.154205 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr7xv\" (UniqueName: \"kubernetes.io/projected/6639b737-2219-4fdb-9a93-4e0155460477-kube-api-access-lr7xv\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.154283 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.166381 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.167194 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.168933 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6639b737-2219-4fdb-9a93-4e0155460477-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.176984 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr7xv\" (UniqueName: \"kubernetes.io/projected/6639b737-2219-4fdb-9a93-4e0155460477-kube-api-access-lr7xv\") pod \"kube-state-metrics-0\" (UID: \"6639b737-2219-4fdb-9a93-4e0155460477\") " pod="openstack/kube-state-metrics-0" Mar 18 08:27:40 crc kubenswrapper[4917]: I0318 08:27:40.272066 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:40.763753 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:40.899861 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6639b737-2219-4fdb-9a93-4e0155460477","Type":"ContainerStarted","Data":"571f3fe973c4f077005da0ff2980c49f02cd9c870c63b46028beab2dbb6a6b0e"} Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.131527 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.131859 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="ceilometer-central-agent" containerID="cri-o://a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7" gracePeriod=30 Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.131929 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="sg-core" containerID="cri-o://e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec" gracePeriod=30 Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.131928 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="proxy-httpd" containerID="cri-o://7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b" gracePeriod=30 Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.132014 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="ceilometer-notification-agent" 
containerID="cri-o://abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9" gracePeriod=30 Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.230610 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.236848 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.254150 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.254327 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.254446 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-xwrh9" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.279768 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-combined-ca-bundle\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.279829 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-scripts\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.279862 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnsh6\" (UniqueName: \"kubernetes.io/projected/38df44bc-4081-4ed3-bee0-1b9b284a9986-kube-api-access-wnsh6\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0" 
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.279903 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-config-data\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.287730 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.381677 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-config-data\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.381967 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-combined-ca-bundle\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.382049 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-scripts\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.382119 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnsh6\" (UniqueName: \"kubernetes.io/projected/38df44bc-4081-4ed3-bee0-1b9b284a9986-kube-api-access-wnsh6\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0" Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.389548 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-scripts\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0"
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.389570 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-combined-ca-bundle\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0"
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.389701 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-config-data\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0"
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.405274 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnsh6\" (UniqueName: \"kubernetes.io/projected/38df44bc-4081-4ed3-bee0-1b9b284a9986-kube-api-access-wnsh6\") pod \"aodh-0\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") " pod="openstack/aodh-0"
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.657531 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.783540 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c" path="/var/lib/kubelet/pods/d3b63b27-15a8-4ee8-b66c-4cd6fcbeae4c/volumes"
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.915358 4917 generic.go:334] "Generic (PLEG): container finished" podID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerID="7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b" exitCode=0
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.915385 4917 generic.go:334] "Generic (PLEG): container finished" podID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerID="e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec" exitCode=2
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.915393 4917 generic.go:334] "Generic (PLEG): container finished" podID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerID="a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7" exitCode=0
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.915426 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerDied","Data":"7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b"}
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.915455 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerDied","Data":"e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec"}
Mar 18 08:27:41 crc kubenswrapper[4917]: I0318 08:27:41.915464 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerDied","Data":"a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7"}
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.147902 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 18 08:27:42 crc kubenswrapper[4917]: W0318 08:27:42.162735 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38df44bc_4081_4ed3_bee0_1b9b284a9986.slice/crio-a2f418e68a40547bd4b340175b6b68c7d67c5c65cc1e31a0621d280722d28f7c WatchSource:0}: Error finding container a2f418e68a40547bd4b340175b6b68c7d67c5c65cc1e31a0621d280722d28f7c: Status 404 returned error can't find the container with id a2f418e68a40547bd4b340175b6b68c7d67c5c65cc1e31a0621d280722d28f7c
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.882455 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.919064 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-sg-core-conf-yaml\") pod \"74564dae-2d5c-4b3e-9132-eb991eedb26c\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") "
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.919166 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-scripts\") pod \"74564dae-2d5c-4b3e-9132-eb991eedb26c\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") "
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.919189 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-log-httpd\") pod \"74564dae-2d5c-4b3e-9132-eb991eedb26c\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") "
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.919235 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-config-data\") pod \"74564dae-2d5c-4b3e-9132-eb991eedb26c\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") "
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.919264 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-combined-ca-bundle\") pod \"74564dae-2d5c-4b3e-9132-eb991eedb26c\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") "
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.919294 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-run-httpd\") pod \"74564dae-2d5c-4b3e-9132-eb991eedb26c\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") "
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.919350 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxw4d\" (UniqueName: \"kubernetes.io/projected/74564dae-2d5c-4b3e-9132-eb991eedb26c-kube-api-access-rxw4d\") pod \"74564dae-2d5c-4b3e-9132-eb991eedb26c\" (UID: \"74564dae-2d5c-4b3e-9132-eb991eedb26c\") "
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.921574 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "74564dae-2d5c-4b3e-9132-eb991eedb26c" (UID: "74564dae-2d5c-4b3e-9132-eb991eedb26c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.928016 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "74564dae-2d5c-4b3e-9132-eb991eedb26c" (UID: "74564dae-2d5c-4b3e-9132-eb991eedb26c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.932837 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74564dae-2d5c-4b3e-9132-eb991eedb26c-kube-api-access-rxw4d" (OuterVolumeSpecName: "kube-api-access-rxw4d") pod "74564dae-2d5c-4b3e-9132-eb991eedb26c" (UID: "74564dae-2d5c-4b3e-9132-eb991eedb26c"). InnerVolumeSpecName "kube-api-access-rxw4d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.933005 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-scripts" (OuterVolumeSpecName: "scripts") pod "74564dae-2d5c-4b3e-9132-eb991eedb26c" (UID: "74564dae-2d5c-4b3e-9132-eb991eedb26c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.978970 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerStarted","Data":"d35e6786f42bb4a27b7741d5b2ab997f955c5b396f03a7d57bef38ffdbd7d5b9"}
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.979023 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerStarted","Data":"a2f418e68a40547bd4b340175b6b68c7d67c5c65cc1e31a0621d280722d28f7c"}
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.994779 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "74564dae-2d5c-4b3e-9132-eb991eedb26c" (UID: "74564dae-2d5c-4b3e-9132-eb991eedb26c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:42 crc kubenswrapper[4917]: I0318 08:27:42.995274 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6639b737-2219-4fdb-9a93-4e0155460477","Type":"ContainerStarted","Data":"822412e5f52aa104dd2dae6221d9d4f56c4de8e76ab471675da27ef6f587261e"}
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.002787 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.016782 4917 generic.go:334] "Generic (PLEG): container finished" podID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerID="abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9" exitCode=0
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.016850 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerDied","Data":"abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9"}
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.016905 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"74564dae-2d5c-4b3e-9132-eb991eedb26c","Type":"ContainerDied","Data":"d34a30d52ddc34ec42759db254e32b916c0284ab143e339c49ae1f2be22c3976"}
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.016924 4917 scope.go:117] "RemoveContainer" containerID="7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.016976 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.046009 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.046047 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxw4d\" (UniqueName: \"kubernetes.io/projected/74564dae-2d5c-4b3e-9132-eb991eedb26c-kube-api-access-rxw4d\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.046062 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.046074 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.046084 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/74564dae-2d5c-4b3e-9132-eb991eedb26c-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.050400 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.8149924410000002 podStartE2EDuration="4.050376213s" podCreationTimestamp="2026-03-18 08:27:39 +0000 UTC" firstStartedPulling="2026-03-18 08:27:40.769762513 +0000 UTC m=+6045.710917227" lastFinishedPulling="2026-03-18 08:27:42.005146285 +0000 UTC m=+6046.946300999" observedRunningTime="2026-03-18 08:27:43.031335711 +0000 UTC m=+6047.972490435" watchObservedRunningTime="2026-03-18 08:27:43.050376213 +0000 UTC m=+6047.991530947"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.106841 4917 scope.go:117] "RemoveContainer" containerID="e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.130510 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-config-data" (OuterVolumeSpecName: "config-data") pod "74564dae-2d5c-4b3e-9132-eb991eedb26c" (UID: "74564dae-2d5c-4b3e-9132-eb991eedb26c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.152620 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.159830 4917 scope.go:117] "RemoveContainer" containerID="abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.190309 4917 scope.go:117] "RemoveContainer" containerID="a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.200638 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74564dae-2d5c-4b3e-9132-eb991eedb26c" (UID: "74564dae-2d5c-4b3e-9132-eb991eedb26c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.217753 4917 scope.go:117] "RemoveContainer" containerID="7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b"
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.220187 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b\": container with ID starting with 7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b not found: ID does not exist" containerID="7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.220251 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b"} err="failed to get container status \"7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b\": rpc error: code = NotFound desc = could not find container \"7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b\": container with ID starting with 7fd9c5e295405a143fcdee9866e73fba6d0c610bc0fede5e17ccfdc5bfd0fe4b not found: ID does not exist"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.220280 4917 scope.go:117] "RemoveContainer" containerID="e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec"
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.221066 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec\": container with ID starting with e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec not found: ID does not exist" containerID="e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.221100 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec"} err="failed to get container status \"e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec\": rpc error: code = NotFound desc = could not find container \"e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec\": container with ID starting with e1f0cb3ee80fa26fefb31e9a59d4077b4fc42877138003b20b0339fc8b2988ec not found: ID does not exist"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.221134 4917 scope.go:117] "RemoveContainer" containerID="abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9"
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.222038 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9\": container with ID starting with abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9 not found: ID does not exist" containerID="abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.222077 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9"} err="failed to get container status \"abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9\": rpc error: code = NotFound desc = could not find container \"abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9\": container with ID starting with abf27eeafa8b0278b09058822f4fb860f1d860e7cb97c8be1a4e1237669e8ff9 not found: ID does not exist"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.222108 4917 scope.go:117] "RemoveContainer" containerID="a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7"
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.222442 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7\": container with ID starting with a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7 not found: ID does not exist" containerID="a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.222460 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7"} err="failed to get container status \"a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7\": rpc error: code = NotFound desc = could not find container \"a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7\": container with ID starting with a89e5735c5745f2bb359d70ce8ca5a1ac0848ba9eb7acfd18b604884b9b900f7 not found: ID does not exist"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.254410 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74564dae-2d5c-4b3e-9132-eb991eedb26c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.398408 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.407368 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.424496 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.424864 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="proxy-httpd"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.424881 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="proxy-httpd"
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.424903 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="ceilometer-notification-agent"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.424910 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="ceilometer-notification-agent"
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.424935 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="ceilometer-central-agent"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.424940 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="ceilometer-central-agent"
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.424959 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="sg-core"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.424965 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="sg-core"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.425126 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="ceilometer-central-agent"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.425142 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="sg-core"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.425163 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="ceilometer-notification-agent"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.425174 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" containerName="proxy-httpd"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.427154 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.430601 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.430835 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.430954 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.441321 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.464115 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.464173 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-run-httpd\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.464203 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.464235 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-config-data\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.464284 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-log-httpd\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.464320 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.464347 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl66q\" (UniqueName: \"kubernetes.io/projected/6ecd3e21-afc0-4e90-b397-a6639a01d23a-kube-api-access-tl66q\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.464393 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-scripts\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.516597 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:43 crc kubenswrapper[4917]: E0318 08:27:43.517303 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[ceilometer-tls-certs combined-ca-bundle config-data kube-api-access-tl66q log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="6ecd3e21-afc0-4e90-b397-a6639a01d23a"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.566652 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl66q\" (UniqueName: \"kubernetes.io/projected/6ecd3e21-afc0-4e90-b397-a6639a01d23a-kube-api-access-tl66q\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.566753 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-scripts\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.566820 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.566851 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-run-httpd\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.566880 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.566963 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-config-data\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.567023 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-log-httpd\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.567085 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.567490 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-run-httpd\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.572192 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.572494 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-config-data\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.572699 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-log-httpd\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.575294 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.575350 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-scripts\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.578423 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.591894 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl66q\" (UniqueName: \"kubernetes.io/projected/6ecd3e21-afc0-4e90-b397-a6639a01d23a-kube-api-access-tl66q\") pod \"ceilometer-0\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") " pod="openstack/ceilometer-0"
Mar 18 08:27:43 crc kubenswrapper[4917]: I0318 08:27:43.795874 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74564dae-2d5c-4b3e-9132-eb991eedb26c" path="/var/lib/kubelet/pods/74564dae-2d5c-4b3e-9132-eb991eedb26c/volumes"
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.027779 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.134309 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178182 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-ceilometer-tls-certs\") pod \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") "
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178317 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl66q\" (UniqueName: \"kubernetes.io/projected/6ecd3e21-afc0-4e90-b397-a6639a01d23a-kube-api-access-tl66q\") pod \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") "
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178410 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-combined-ca-bundle\") pod \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") "
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178444 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-sg-core-conf-yaml\") pod \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") "
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178502 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-log-httpd\") pod \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") "
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178529 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-config-data\") pod \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") "
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178619 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-scripts\") pod \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") "
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178711 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-run-httpd\") pod \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\" (UID: \"6ecd3e21-afc0-4e90-b397-a6639a01d23a\") "
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.178886 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6ecd3e21-afc0-4e90-b397-a6639a01d23a" (UID: "6ecd3e21-afc0-4e90-b397-a6639a01d23a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.179255 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6ecd3e21-afc0-4e90-b397-a6639a01d23a" (UID: "6ecd3e21-afc0-4e90-b397-a6639a01d23a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.179844 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.179865 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6ecd3e21-afc0-4e90-b397-a6639a01d23a-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.183056 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-scripts" (OuterVolumeSpecName: "scripts") pod "6ecd3e21-afc0-4e90-b397-a6639a01d23a" (UID: "6ecd3e21-afc0-4e90-b397-a6639a01d23a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.184642 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-config-data" (OuterVolumeSpecName: "config-data") pod "6ecd3e21-afc0-4e90-b397-a6639a01d23a" (UID: "6ecd3e21-afc0-4e90-b397-a6639a01d23a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.185577 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ecd3e21-afc0-4e90-b397-a6639a01d23a" (UID: "6ecd3e21-afc0-4e90-b397-a6639a01d23a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.185906 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6ecd3e21-afc0-4e90-b397-a6639a01d23a" (UID: "6ecd3e21-afc0-4e90-b397-a6639a01d23a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.186975 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ecd3e21-afc0-4e90-b397-a6639a01d23a-kube-api-access-tl66q" (OuterVolumeSpecName: "kube-api-access-tl66q") pod "6ecd3e21-afc0-4e90-b397-a6639a01d23a" (UID: "6ecd3e21-afc0-4e90-b397-a6639a01d23a"). InnerVolumeSpecName "kube-api-access-tl66q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.187457 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6ecd3e21-afc0-4e90-b397-a6639a01d23a" (UID: "6ecd3e21-afc0-4e90-b397-a6639a01d23a"). InnerVolumeSpecName "ceilometer-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.282320 4917 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.282356 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl66q\" (UniqueName: \"kubernetes.io/projected/6ecd3e21-afc0-4e90-b397-a6639a01d23a-kube-api-access-tl66q\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.282370 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.282381 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.282392 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.282405 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ecd3e21-afc0-4e90-b397-a6639a01d23a-scripts\") on node \"crc\" DevicePath \"\"" Mar 18 08:27:44 crc kubenswrapper[4917]: I0318 08:27:44.639490 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.038327 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.044517 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerStarted","Data":"6d4a8fca29be64e35229c02b032652aeb43e7760f7d32ab675ef7c8c0a55fe67"} Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.097652 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.107981 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.133939 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.136316 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.138237 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.139300 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.139528 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.144098 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.201178 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 
08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.201219 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zlt9\" (UniqueName: \"kubernetes.io/projected/33c87d6a-3e90-4559-9734-994c36e5f7cd-kube-api-access-5zlt9\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.201279 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-config-data\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.201326 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.201345 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.201393 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-run-httpd\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.201430 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-log-httpd\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.201453 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-scripts\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.303898 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.303960 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zlt9\" (UniqueName: \"kubernetes.io/projected/33c87d6a-3e90-4559-9734-994c36e5f7cd-kube-api-access-5zlt9\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.304052 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-config-data\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.304125 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-ceilometer-tls-certs\") pod \"ceilometer-0\" 
(UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.304156 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.304207 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-run-httpd\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.304247 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-log-httpd\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.304267 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-scripts\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.305853 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-run-httpd\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.307440 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-log-httpd\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.309239 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-scripts\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.309274 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.311142 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.312646 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-config-data\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.314944 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.324871 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zlt9\" (UniqueName: \"kubernetes.io/projected/33c87d6a-3e90-4559-9734-994c36e5f7cd-kube-api-access-5zlt9\") pod \"ceilometer-0\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.468955 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.783498 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ecd3e21-afc0-4e90-b397-a6639a01d23a" path="/var/lib/kubelet/pods/6ecd3e21-afc0-4e90-b397-a6639a01d23a/volumes" Mar 18 08:27:45 crc kubenswrapper[4917]: I0318 08:27:45.937125 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:46 crc kubenswrapper[4917]: I0318 08:27:46.048139 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerStarted","Data":"b1bbaeab0f0df305c4a845f1cf801ae1486abfa8c14cba72057a2d05a00ff40f"} Mar 18 08:27:46 crc kubenswrapper[4917]: I0318 08:27:46.050067 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerStarted","Data":"7d41a8ec8c1079edb9822c880a124f350f18026018420a1583725b683e91fa8e"} Mar 18 08:27:47 crc kubenswrapper[4917]: I0318 08:27:47.079985 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerStarted","Data":"aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608"} Mar 18 08:27:47 crc kubenswrapper[4917]: I0318 08:27:47.085043 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerStarted","Data":"152f2c582262d7f7b036b17ab0254a156f5810cb001f3a650bc11b1ae9a149e2"} Mar 18 08:27:47 crc kubenswrapper[4917]: I0318 08:27:47.085287 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-api" containerID="cri-o://d35e6786f42bb4a27b7741d5b2ab997f955c5b396f03a7d57bef38ffdbd7d5b9" gracePeriod=30 Mar 18 08:27:47 crc kubenswrapper[4917]: I0318 08:27:47.085709 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-listener" containerID="cri-o://152f2c582262d7f7b036b17ab0254a156f5810cb001f3a650bc11b1ae9a149e2" gracePeriod=30 Mar 18 08:27:47 crc kubenswrapper[4917]: I0318 08:27:47.085731 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-evaluator" containerID="cri-o://6d4a8fca29be64e35229c02b032652aeb43e7760f7d32ab675ef7c8c0a55fe67" gracePeriod=30 Mar 18 08:27:47 crc kubenswrapper[4917]: I0318 08:27:47.085731 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-notifier" containerID="cri-o://b1bbaeab0f0df305c4a845f1cf801ae1486abfa8c14cba72057a2d05a00ff40f" gracePeriod=30 Mar 18 08:27:47 crc kubenswrapper[4917]: I0318 08:27:47.115280 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.561613739 podStartE2EDuration="6.115202975s" podCreationTimestamp="2026-03-18 08:27:41 +0000 UTC" firstStartedPulling="2026-03-18 08:27:42.167207704 +0000 UTC m=+6047.108362418" lastFinishedPulling="2026-03-18 08:27:46.72079694 +0000 UTC m=+6051.661951654" observedRunningTime="2026-03-18 08:27:47.108098403 +0000 
UTC m=+6052.049253127" watchObservedRunningTime="2026-03-18 08:27:47.115202975 +0000 UTC m=+6052.056357699" Mar 18 08:27:47 crc kubenswrapper[4917]: I0318 08:27:47.399682 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 18 08:27:48 crc kubenswrapper[4917]: I0318 08:27:48.119013 4917 generic.go:334] "Generic (PLEG): container finished" podID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerID="b1bbaeab0f0df305c4a845f1cf801ae1486abfa8c14cba72057a2d05a00ff40f" exitCode=0 Mar 18 08:27:48 crc kubenswrapper[4917]: I0318 08:27:48.119318 4917 generic.go:334] "Generic (PLEG): container finished" podID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerID="6d4a8fca29be64e35229c02b032652aeb43e7760f7d32ab675ef7c8c0a55fe67" exitCode=0 Mar 18 08:27:48 crc kubenswrapper[4917]: I0318 08:27:48.119330 4917 generic.go:334] "Generic (PLEG): container finished" podID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerID="d35e6786f42bb4a27b7741d5b2ab997f955c5b396f03a7d57bef38ffdbd7d5b9" exitCode=0 Mar 18 08:27:48 crc kubenswrapper[4917]: I0318 08:27:48.119219 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerDied","Data":"b1bbaeab0f0df305c4a845f1cf801ae1486abfa8c14cba72057a2d05a00ff40f"} Mar 18 08:27:48 crc kubenswrapper[4917]: I0318 08:27:48.119427 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerDied","Data":"6d4a8fca29be64e35229c02b032652aeb43e7760f7d32ab675ef7c8c0a55fe67"} Mar 18 08:27:48 crc kubenswrapper[4917]: I0318 08:27:48.119445 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerDied","Data":"d35e6786f42bb4a27b7741d5b2ab997f955c5b396f03a7d57bef38ffdbd7d5b9"} Mar 18 08:27:48 crc kubenswrapper[4917]: I0318 08:27:48.135263 4917 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerStarted","Data":"ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c"} Mar 18 08:27:49 crc kubenswrapper[4917]: I0318 08:27:49.146830 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerStarted","Data":"eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff"} Mar 18 08:27:50 crc kubenswrapper[4917]: I0318 08:27:50.293127 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 18 08:27:51 crc kubenswrapper[4917]: I0318 08:27:51.164725 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerStarted","Data":"151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf"} Mar 18 08:27:51 crc kubenswrapper[4917]: I0318 08:27:51.165183 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 18 08:27:51 crc kubenswrapper[4917]: I0318 08:27:51.164938 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="sg-core" containerID="cri-o://eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff" gracePeriod=30 Mar 18 08:27:51 crc kubenswrapper[4917]: I0318 08:27:51.164901 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="ceilometer-central-agent" containerID="cri-o://aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608" gracePeriod=30 Mar 18 08:27:51 crc kubenswrapper[4917]: I0318 08:27:51.164955 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="proxy-httpd" containerID="cri-o://151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf" gracePeriod=30 Mar 18 08:27:51 crc kubenswrapper[4917]: I0318 08:27:51.164974 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="ceilometer-notification-agent" containerID="cri-o://ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c" gracePeriod=30 Mar 18 08:27:51 crc kubenswrapper[4917]: I0318 08:27:51.192424 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.173689364 podStartE2EDuration="6.192402877s" podCreationTimestamp="2026-03-18 08:27:45 +0000 UTC" firstStartedPulling="2026-03-18 08:27:45.942749041 +0000 UTC m=+6050.883903755" lastFinishedPulling="2026-03-18 08:27:49.961462544 +0000 UTC m=+6054.902617268" observedRunningTime="2026-03-18 08:27:51.184829164 +0000 UTC m=+6056.125983898" watchObservedRunningTime="2026-03-18 08:27:51.192402877 +0000 UTC m=+6056.133557591" Mar 18 08:27:52 crc kubenswrapper[4917]: I0318 08:27:52.177873 4917 generic.go:334] "Generic (PLEG): container finished" podID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerID="151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf" exitCode=0 Mar 18 08:27:52 crc kubenswrapper[4917]: I0318 08:27:52.177908 4917 generic.go:334] "Generic (PLEG): container finished" podID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerID="eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff" exitCode=2 Mar 18 08:27:52 crc kubenswrapper[4917]: I0318 08:27:52.177916 4917 generic.go:334] "Generic (PLEG): container finished" podID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerID="ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c" exitCode=0 Mar 18 08:27:52 crc kubenswrapper[4917]: I0318 08:27:52.177937 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerDied","Data":"151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf"} Mar 18 08:27:52 crc kubenswrapper[4917]: I0318 08:27:52.177964 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerDied","Data":"eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff"} Mar 18 08:27:52 crc kubenswrapper[4917]: I0318 08:27:52.177976 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerDied","Data":"ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c"} Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.712459 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.806376 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-scripts\") pod \"33c87d6a-3e90-4559-9734-994c36e5f7cd\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.806487 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-ceilometer-tls-certs\") pod \"33c87d6a-3e90-4559-9734-994c36e5f7cd\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.806564 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-sg-core-conf-yaml\") pod \"33c87d6a-3e90-4559-9734-994c36e5f7cd\" (UID: 
\"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.806608 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-combined-ca-bundle\") pod \"33c87d6a-3e90-4559-9734-994c36e5f7cd\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.806647 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-log-httpd\") pod \"33c87d6a-3e90-4559-9734-994c36e5f7cd\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.806700 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-run-httpd\") pod \"33c87d6a-3e90-4559-9734-994c36e5f7cd\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.806779 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zlt9\" (UniqueName: \"kubernetes.io/projected/33c87d6a-3e90-4559-9734-994c36e5f7cd-kube-api-access-5zlt9\") pod \"33c87d6a-3e90-4559-9734-994c36e5f7cd\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.806808 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-config-data\") pod \"33c87d6a-3e90-4559-9734-994c36e5f7cd\" (UID: \"33c87d6a-3e90-4559-9734-994c36e5f7cd\") " Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.808801 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "33c87d6a-3e90-4559-9734-994c36e5f7cd" (UID: "33c87d6a-3e90-4559-9734-994c36e5f7cd"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.809723 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "33c87d6a-3e90-4559-9734-994c36e5f7cd" (UID: "33c87d6a-3e90-4559-9734-994c36e5f7cd"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.812792 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c87d6a-3e90-4559-9734-994c36e5f7cd-kube-api-access-5zlt9" (OuterVolumeSpecName: "kube-api-access-5zlt9") pod "33c87d6a-3e90-4559-9734-994c36e5f7cd" (UID: "33c87d6a-3e90-4559-9734-994c36e5f7cd"). InnerVolumeSpecName "kube-api-access-5zlt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.865381 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-scripts" (OuterVolumeSpecName: "scripts") pod "33c87d6a-3e90-4559-9734-994c36e5f7cd" (UID: "33c87d6a-3e90-4559-9734-994c36e5f7cd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.884265 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "33c87d6a-3e90-4559-9734-994c36e5f7cd" (UID: "33c87d6a-3e90-4559-9734-994c36e5f7cd"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.895578 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "33c87d6a-3e90-4559-9734-994c36e5f7cd" (UID: "33c87d6a-3e90-4559-9734-994c36e5f7cd"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.909271 4917 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.909299 4917 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.909310 4917 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.909319 4917 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/33c87d6a-3e90-4559-9734-994c36e5f7cd-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.909329 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zlt9\" (UniqueName: \"kubernetes.io/projected/33c87d6a-3e90-4559-9734-994c36e5f7cd-kube-api-access-5zlt9\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.909338 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.920052 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "33c87d6a-3e90-4559-9734-994c36e5f7cd" (UID: "33c87d6a-3e90-4559-9734-994c36e5f7cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:54 crc kubenswrapper[4917]: I0318 08:27:54.953347 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-config-data" (OuterVolumeSpecName: "config-data") pod "33c87d6a-3e90-4559-9734-994c36e5f7cd" (UID: "33c87d6a-3e90-4559-9734-994c36e5f7cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.010987 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.011199 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33c87d6a-3e90-4559-9734-994c36e5f7cd-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.210362 4917 generic.go:334] "Generic (PLEG): container finished" podID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerID="aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608" exitCode=0
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.210403 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerDied","Data":"aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608"}
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.210427 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"33c87d6a-3e90-4559-9734-994c36e5f7cd","Type":"ContainerDied","Data":"7d41a8ec8c1079edb9822c880a124f350f18026018420a1583725b683e91fa8e"}
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.210445 4917 scope.go:117] "RemoveContainer" containerID="151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.210439 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.243681 4917 scope.go:117] "RemoveContainer" containerID="eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.250733 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.262037 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.278709 4917 scope.go:117] "RemoveContainer" containerID="ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.293474 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:55 crc kubenswrapper[4917]: E0318 08:27:55.294128 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="proxy-httpd"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.294159 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="proxy-httpd"
Mar 18 08:27:55 crc kubenswrapper[4917]: E0318 08:27:55.294189 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="ceilometer-notification-agent"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.294200 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="ceilometer-notification-agent"
Mar 18 08:27:55 crc kubenswrapper[4917]: E0318 08:27:55.294233 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="ceilometer-central-agent"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.294245 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="ceilometer-central-agent"
Mar 18 08:27:55 crc kubenswrapper[4917]: E0318 08:27:55.294310 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="sg-core"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.294322 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="sg-core"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.294677 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="proxy-httpd"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.294707 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="sg-core"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.294731 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="ceilometer-central-agent"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.294751 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" containerName="ceilometer-notification-agent"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.300493 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.301606 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.304518 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.304623 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.304751 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.334787 4917 scope.go:117] "RemoveContainer" containerID="aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.382698 4917 scope.go:117] "RemoveContainer" containerID="151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf"
Mar 18 08:27:55 crc kubenswrapper[4917]: E0318 08:27:55.383170 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf\": container with ID starting with 151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf not found: ID does not exist" containerID="151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.383197 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf"} err="failed to get container status \"151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf\": rpc error: code = NotFound desc = could not find container \"151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf\": container with ID starting with 151ccb4d1cec002947971713a4d6011f02d38d210b0b31bc688e50c1d7694bdf not found: ID does not exist"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.383217 4917 scope.go:117] "RemoveContainer" containerID="eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff"
Mar 18 08:27:55 crc kubenswrapper[4917]: E0318 08:27:55.383520 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff\": container with ID starting with eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff not found: ID does not exist" containerID="eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.383538 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff"} err="failed to get container status \"eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff\": rpc error: code = NotFound desc = could not find container \"eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff\": container with ID starting with eba718a6c658b6d6e4d86d2382683fc273ecb581e3d2a8eb9bec8e743a8f95ff not found: ID does not exist"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.383551 4917 scope.go:117] "RemoveContainer" containerID="ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c"
Mar 18 08:27:55 crc kubenswrapper[4917]: E0318 08:27:55.383915 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c\": container with ID starting with ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c not found: ID does not exist" containerID="ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.383934 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c"} err="failed to get container status \"ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c\": rpc error: code = NotFound desc = could not find container \"ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c\": container with ID starting with ad2c216b5afde1ed73f1692b53eaa069f7ace55513798583d476b95a63e5b78c not found: ID does not exist"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.383945 4917 scope.go:117] "RemoveContainer" containerID="aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608"
Mar 18 08:27:55 crc kubenswrapper[4917]: E0318 08:27:55.384161 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608\": container with ID starting with aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608 not found: ID does not exist" containerID="aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.384182 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608"} err="failed to get container status \"aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608\": rpc error: code = NotFound desc = could not find container \"aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608\": container with ID starting with aede4a8287b19cb74fadce5f61d0abfe07c28b99a55f5f65b46414bf979c2608 not found: ID does not exist"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.418209 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2ff8\" (UniqueName: \"kubernetes.io/projected/93f15a4a-a697-4645-b0fb-1954e6f028c2-kube-api-access-b2ff8\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.418253 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f15a4a-a697-4645-b0fb-1954e6f028c2-log-httpd\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.418278 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.418313 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-config-data\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.418355 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f15a4a-a697-4645-b0fb-1954e6f028c2-run-httpd\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.418383 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.418404 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-scripts\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.418466 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.520118 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2ff8\" (UniqueName: \"kubernetes.io/projected/93f15a4a-a697-4645-b0fb-1954e6f028c2-kube-api-access-b2ff8\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.520182 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f15a4a-a697-4645-b0fb-1954e6f028c2-log-httpd\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.520686 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.520649 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f15a4a-a697-4645-b0fb-1954e6f028c2-log-httpd\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.520786 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-config-data\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.521209 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f15a4a-a697-4645-b0fb-1954e6f028c2-run-httpd\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.521290 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.521645 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93f15a4a-a697-4645-b0fb-1954e6f028c2-run-httpd\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.521795 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-scripts\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.521892 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.532962 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.532980 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.532963 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-scripts\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.532980 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.534063 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f15a4a-a697-4645-b0fb-1954e6f028c2-config-data\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.536663 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2ff8\" (UniqueName: \"kubernetes.io/projected/93f15a4a-a697-4645-b0fb-1954e6f028c2-kube-api-access-b2ff8\") pod \"ceilometer-0\" (UID: \"93f15a4a-a697-4645-b0fb-1954e6f028c2\") " pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.679364 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 18 08:27:55 crc kubenswrapper[4917]: I0318 08:27:55.788724 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c87d6a-3e90-4559-9734-994c36e5f7cd" path="/var/lib/kubelet/pods/33c87d6a-3e90-4559-9734-994c36e5f7cd/volumes"
Mar 18 08:27:56 crc kubenswrapper[4917]: I0318 08:27:56.171755 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 18 08:27:56 crc kubenswrapper[4917]: I0318 08:27:56.221557 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f15a4a-a697-4645-b0fb-1954e6f028c2","Type":"ContainerStarted","Data":"3a0f0f220dd60597f84ff7cc8cceba29bd86572183df2be8e5d432eb6eb05cf5"}
Mar 18 08:27:57 crc kubenswrapper[4917]: I0318 08:27:57.236375 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f15a4a-a697-4645-b0fb-1954e6f028c2","Type":"ContainerStarted","Data":"7a0bf1817177e39d2f3e53a737668359602fa194217b1da91919176165db398c"}
Mar 18 08:27:57 crc kubenswrapper[4917]: I0318 08:27:57.238246 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f15a4a-a697-4645-b0fb-1954e6f028c2","Type":"ContainerStarted","Data":"72f01880e4fc3bc2c3a306ce669499ba7c95b8c09da3445a94f915a58d80265c"}
Mar 18 08:27:59 crc kubenswrapper[4917]: I0318 08:27:59.274265 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f15a4a-a697-4645-b0fb-1954e6f028c2","Type":"ContainerStarted","Data":"08fd17621672de14ce97a26dcb10c254a3520710d4b6895a1c7b2c401a2d5b07"}
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.183698 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563708-l55kp"]
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.186679 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563708-l55kp"
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.189708 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf"
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.190017 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.190535 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.195478 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563708-l55kp"]
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.231165 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpltd\" (UniqueName: \"kubernetes.io/projected/d8bcfe64-82d3-4024-938b-9ce3dbff7968-kube-api-access-cpltd\") pod \"auto-csr-approver-29563708-l55kp\" (UID: \"d8bcfe64-82d3-4024-938b-9ce3dbff7968\") " pod="openshift-infra/auto-csr-approver-29563708-l55kp"
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.333150 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpltd\" (UniqueName: \"kubernetes.io/projected/d8bcfe64-82d3-4024-938b-9ce3dbff7968-kube-api-access-cpltd\") pod \"auto-csr-approver-29563708-l55kp\" (UID: \"d8bcfe64-82d3-4024-938b-9ce3dbff7968\") " pod="openshift-infra/auto-csr-approver-29563708-l55kp"
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.357010 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpltd\" (UniqueName: \"kubernetes.io/projected/d8bcfe64-82d3-4024-938b-9ce3dbff7968-kube-api-access-cpltd\") pod \"auto-csr-approver-29563708-l55kp\" (UID: \"d8bcfe64-82d3-4024-938b-9ce3dbff7968\") " pod="openshift-infra/auto-csr-approver-29563708-l55kp"
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.507064 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563708-l55kp"
Mar 18 08:28:00 crc kubenswrapper[4917]: I0318 08:28:00.968264 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563708-l55kp"]
Mar 18 08:28:01 crc kubenswrapper[4917]: I0318 08:28:01.307064 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563708-l55kp" event={"ID":"d8bcfe64-82d3-4024-938b-9ce3dbff7968","Type":"ContainerStarted","Data":"bcf74ab1dedb0512e64cf630d8f8184a5e2d467986820b0ee5592d524c692173"}
Mar 18 08:28:02 crc kubenswrapper[4917]: I0318 08:28:02.317125 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563708-l55kp" event={"ID":"d8bcfe64-82d3-4024-938b-9ce3dbff7968","Type":"ContainerStarted","Data":"160ba14f53215355bbdfc69af8fb581cec01e69cf3f4365356f7a919ed91deb3"}
Mar 18 08:28:02 crc kubenswrapper[4917]: I0318 08:28:02.321775 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93f15a4a-a697-4645-b0fb-1954e6f028c2","Type":"ContainerStarted","Data":"8104557d523eaeb69909cffca43eb1635690405e0a72907d341291d02b79a050"}
Mar 18 08:28:02 crc kubenswrapper[4917]: I0318 08:28:02.322499 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 18 08:28:02 crc kubenswrapper[4917]: I0318 08:28:02.335311 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563708-l55kp" podStartSLOduration=1.369368592 podStartE2EDuration="2.335296229s" podCreationTimestamp="2026-03-18 08:28:00 +0000 UTC" firstStartedPulling="2026-03-18 08:28:00.979439185 +0000 UTC m=+6065.920593899" lastFinishedPulling="2026-03-18 08:28:01.945366782 +0000 UTC m=+6066.886521536" observedRunningTime="2026-03-18 08:28:02.333202837 +0000 UTC m=+6067.274357561" watchObservedRunningTime="2026-03-18 08:28:02.335296229 +0000 UTC m=+6067.276450953"
Mar 18 08:28:02 crc kubenswrapper[4917]: I0318 08:28:02.364364 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.056596086 podStartE2EDuration="7.364344172s" podCreationTimestamp="2026-03-18 08:27:55 +0000 UTC" firstStartedPulling="2026-03-18 08:27:56.178283376 +0000 UTC m=+6061.119438090" lastFinishedPulling="2026-03-18 08:28:01.486031462 +0000 UTC m=+6066.427186176" observedRunningTime="2026-03-18 08:28:02.356245957 +0000 UTC m=+6067.297400701" watchObservedRunningTime="2026-03-18 08:28:02.364344172 +0000 UTC m=+6067.305498906"
Mar 18 08:28:02 crc kubenswrapper[4917]: I0318 08:28:02.928656 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 18 08:28:02 crc kubenswrapper[4917]: I0318 08:28:02.928721 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 18 08:28:03 crc kubenswrapper[4917]: I0318 08:28:03.334642 4917 generic.go:334] "Generic (PLEG): container finished" podID="d8bcfe64-82d3-4024-938b-9ce3dbff7968" containerID="160ba14f53215355bbdfc69af8fb581cec01e69cf3f4365356f7a919ed91deb3" exitCode=0
Mar 18 08:28:03 crc kubenswrapper[4917]: I0318 08:28:03.334839 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563708-l55kp" event={"ID":"d8bcfe64-82d3-4024-938b-9ce3dbff7968","Type":"ContainerDied","Data":"160ba14f53215355bbdfc69af8fb581cec01e69cf3f4365356f7a919ed91deb3"}
Mar 18 08:28:04 crc kubenswrapper[4917]: I0318 08:28:04.830327 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563708-l55kp"
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.034247 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpltd\" (UniqueName: \"kubernetes.io/projected/d8bcfe64-82d3-4024-938b-9ce3dbff7968-kube-api-access-cpltd\") pod \"d8bcfe64-82d3-4024-938b-9ce3dbff7968\" (UID: \"d8bcfe64-82d3-4024-938b-9ce3dbff7968\") "
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.040901 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bcfe64-82d3-4024-938b-9ce3dbff7968-kube-api-access-cpltd" (OuterVolumeSpecName: "kube-api-access-cpltd") pod "d8bcfe64-82d3-4024-938b-9ce3dbff7968" (UID: "d8bcfe64-82d3-4024-938b-9ce3dbff7968"). InnerVolumeSpecName "kube-api-access-cpltd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.138685 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpltd\" (UniqueName: \"kubernetes.io/projected/d8bcfe64-82d3-4024-938b-9ce3dbff7968-kube-api-access-cpltd\") on node \"crc\" DevicePath \"\""
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.363963 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563708-l55kp" event={"ID":"d8bcfe64-82d3-4024-938b-9ce3dbff7968","Type":"ContainerDied","Data":"bcf74ab1dedb0512e64cf630d8f8184a5e2d467986820b0ee5592d524c692173"}
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.364028 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcf74ab1dedb0512e64cf630d8f8184a5e2d467986820b0ee5592d524c692173"
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.364107 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563708-l55kp"
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.416695 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563702-ff84h"]
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.428123 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563702-ff84h"]
Mar 18 08:28:05 crc kubenswrapper[4917]: I0318 08:28:05.785635 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee" path="/var/lib/kubelet/pods/9d1371e1-80c1-4eb2-b37a-6f59d5cb56ee/volumes"
Mar 18 08:28:10 crc kubenswrapper[4917]: I0318 08:28:10.987118 4917 scope.go:117] "RemoveContainer" containerID="e0eda9d1182ddb76052050161544ff533285454c7d324252a80e38b3f21a460d"
Mar 18 08:28:11 crc kubenswrapper[4917]: I0318 08:28:11.057952 4917 scope.go:117] "RemoveContainer" containerID="ed10f7eb8c55d5d0c5006599edf62280def30b7222e40e9b7ebd740498928072"
Mar 18 08:28:11 crc kubenswrapper[4917]: I0318 08:28:11.115687 4917 scope.go:117] "RemoveContainer" containerID="7b08ad1249c59b5a7e65fc8ea1aec2131606293b33d09167c0e08b56af3b99a5"
Mar 18 08:28:11 crc kubenswrapper[4917]: I0318 08:28:11.149565 4917 scope.go:117] "RemoveContainer" containerID="a26e01fa5bef84357e1705e752a1986eb3c10aa325805749bd6763853cb7e6f3"
Mar 18 08:28:11 crc kubenswrapper[4917]: I0318 08:28:11.171854 4917 scope.go:117] "RemoveContainer" containerID="1f83e74f415788165f2245ebe9f12b53e625bace09100edd3e46077225459a5f"
Mar 18 08:28:17 crc kubenswrapper[4917]: E0318 08:28:17.416815 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8bcfe64_82d3_4024_938b_9ce3dbff7968.slice/crio-bcf74ab1dedb0512e64cf630d8f8184a5e2d467986820b0ee5592d524c692173\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8bcfe64_82d3_4024_938b_9ce3dbff7968.slice\": RecentStats: unable to find data in memory cache]"
Mar 18 08:28:17 crc kubenswrapper[4917]: I0318 08:28:17.509117 4917 generic.go:334] "Generic (PLEG): container finished" podID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerID="152f2c582262d7f7b036b17ab0254a156f5810cb001f3a650bc11b1ae9a149e2" exitCode=137
Mar 18 08:28:17 crc kubenswrapper[4917]: I0318 08:28:17.509181 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerDied","Data":"152f2c582262d7f7b036b17ab0254a156f5810cb001f3a650bc11b1ae9a149e2"}
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.198431 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.344076 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-combined-ca-bundle\") pod \"38df44bc-4081-4ed3-bee0-1b9b284a9986\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") "
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.344137 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-config-data\") pod \"38df44bc-4081-4ed3-bee0-1b9b284a9986\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") "
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.344374 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-scripts\") pod \"38df44bc-4081-4ed3-bee0-1b9b284a9986\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") "
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.345117 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnsh6\" (UniqueName: \"kubernetes.io/projected/38df44bc-4081-4ed3-bee0-1b9b284a9986-kube-api-access-wnsh6\") pod \"38df44bc-4081-4ed3-bee0-1b9b284a9986\" (UID: \"38df44bc-4081-4ed3-bee0-1b9b284a9986\") "
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.350203 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-scripts" (OuterVolumeSpecName: "scripts") pod "38df44bc-4081-4ed3-bee0-1b9b284a9986" (UID: "38df44bc-4081-4ed3-bee0-1b9b284a9986"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.351376 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38df44bc-4081-4ed3-bee0-1b9b284a9986-kube-api-access-wnsh6" (OuterVolumeSpecName: "kube-api-access-wnsh6") pod "38df44bc-4081-4ed3-bee0-1b9b284a9986" (UID: "38df44bc-4081-4ed3-bee0-1b9b284a9986"). InnerVolumeSpecName "kube-api-access-wnsh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.447810 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnsh6\" (UniqueName: \"kubernetes.io/projected/38df44bc-4081-4ed3-bee0-1b9b284a9986-kube-api-access-wnsh6\") on node \"crc\" DevicePath \"\""
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.447852 4917 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-scripts\") on node \"crc\" DevicePath \"\""
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.457782 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38df44bc-4081-4ed3-bee0-1b9b284a9986" (UID: "38df44bc-4081-4ed3-bee0-1b9b284a9986"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.473949 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-config-data" (OuterVolumeSpecName: "config-data") pod "38df44bc-4081-4ed3-bee0-1b9b284a9986" (UID: "38df44bc-4081-4ed3-bee0-1b9b284a9986"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.536672 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"38df44bc-4081-4ed3-bee0-1b9b284a9986","Type":"ContainerDied","Data":"a2f418e68a40547bd4b340175b6b68c7d67c5c65cc1e31a0621d280722d28f7c"} Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.536743 4917 scope.go:117] "RemoveContainer" containerID="152f2c582262d7f7b036b17ab0254a156f5810cb001f3a650bc11b1ae9a149e2" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.536797 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.549334 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.549385 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38df44bc-4081-4ed3-bee0-1b9b284a9986-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.567308 4917 scope.go:117] "RemoveContainer" containerID="b1bbaeab0f0df305c4a845f1cf801ae1486abfa8c14cba72057a2d05a00ff40f" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.587752 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.597640 4917 scope.go:117] "RemoveContainer" containerID="6d4a8fca29be64e35229c02b032652aeb43e7760f7d32ab675ef7c8c0a55fe67" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.619545 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.621827 4917 scope.go:117] "RemoveContainer" 
containerID="d35e6786f42bb4a27b7741d5b2ab997f955c5b396f03a7d57bef38ffdbd7d5b9" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.636642 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 18 08:28:18 crc kubenswrapper[4917]: E0318 08:28:18.637050 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-api" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637067 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-api" Mar 18 08:28:18 crc kubenswrapper[4917]: E0318 08:28:18.637089 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bcfe64-82d3-4024-938b-9ce3dbff7968" containerName="oc" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637094 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bcfe64-82d3-4024-938b-9ce3dbff7968" containerName="oc" Mar 18 08:28:18 crc kubenswrapper[4917]: E0318 08:28:18.637116 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-listener" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637122 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-listener" Mar 18 08:28:18 crc kubenswrapper[4917]: E0318 08:28:18.637140 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-evaluator" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637365 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-evaluator" Mar 18 08:28:18 crc kubenswrapper[4917]: E0318 08:28:18.637385 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-notifier" Mar 18 08:28:18 crc 
kubenswrapper[4917]: I0318 08:28:18.637391 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-notifier" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637597 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bcfe64-82d3-4024-938b-9ce3dbff7968" containerName="oc" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637621 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-notifier" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637632 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-evaluator" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637642 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-api" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.637654 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" containerName="aodh-listener" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.639535 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.642116 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.642166 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.642247 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-xwrh9" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.642352 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.643789 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.659997 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.753670 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-config-data\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.753774 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-scripts\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.754032 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvw2f\" (UniqueName: 
\"kubernetes.io/projected/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-kube-api-access-tvw2f\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.754142 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-public-tls-certs\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.754633 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.754667 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-internal-tls-certs\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.856669 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.856842 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-internal-tls-certs\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc 
kubenswrapper[4917]: I0318 08:28:18.856960 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-config-data\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.857054 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-scripts\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.857100 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvw2f\" (UniqueName: \"kubernetes.io/projected/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-kube-api-access-tvw2f\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.857161 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-public-tls-certs\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.861818 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-scripts\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.861912 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-public-tls-certs\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " 
pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.862496 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-config-data\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.862615 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-internal-tls-certs\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.862923 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-combined-ca-bundle\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.878294 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvw2f\" (UniqueName: \"kubernetes.io/projected/47cdaf9a-44d8-4517-9e86-2e855a5dbfb4-kube-api-access-tvw2f\") pod \"aodh-0\" (UID: \"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4\") " pod="openstack/aodh-0" Mar 18 08:28:18 crc kubenswrapper[4917]: I0318 08:28:18.958531 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 18 08:28:19 crc kubenswrapper[4917]: I0318 08:28:19.468531 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 18 08:28:19 crc kubenswrapper[4917]: I0318 08:28:19.549233 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4","Type":"ContainerStarted","Data":"2d3ae7cb0c3ecf1d91016cb2da2f04580b095a4378a087e7d58962091eda5211"} Mar 18 08:28:19 crc kubenswrapper[4917]: I0318 08:28:19.794561 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38df44bc-4081-4ed3-bee0-1b9b284a9986" path="/var/lib/kubelet/pods/38df44bc-4081-4ed3-bee0-1b9b284a9986/volumes" Mar 18 08:28:20 crc kubenswrapper[4917]: I0318 08:28:20.562306 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4","Type":"ContainerStarted","Data":"8f44960bda0a20c048c1c7206ef8aa179ca3ec4fcf13c7f92c6798d29791066c"} Mar 18 08:28:20 crc kubenswrapper[4917]: I0318 08:28:20.562701 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4","Type":"ContainerStarted","Data":"eae6424719f8aafd77fe25d1a5afd9e71f9ae4c75819ac463e0f845f6543e041"} Mar 18 08:28:21 crc kubenswrapper[4917]: I0318 08:28:21.578348 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4","Type":"ContainerStarted","Data":"b19030b50220fd116cd0202997996b22635763a1ac529489b812f742f0a95de5"} Mar 18 08:28:21 crc kubenswrapper[4917]: I0318 08:28:21.578780 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"47cdaf9a-44d8-4517-9e86-2e855a5dbfb4","Type":"ContainerStarted","Data":"db6f26ebb430193ea93179dd5960ca3f053c40db209765cc01d605063c510135"} Mar 18 08:28:21 crc kubenswrapper[4917]: I0318 08:28:21.634264 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.295956498 podStartE2EDuration="3.634237814s" podCreationTimestamp="2026-03-18 08:28:18 +0000 UTC" firstStartedPulling="2026-03-18 08:28:19.47922576 +0000 UTC m=+6084.420380474" lastFinishedPulling="2026-03-18 08:28:20.817507076 +0000 UTC m=+6085.758661790" observedRunningTime="2026-03-18 08:28:21.609685439 +0000 UTC m=+6086.550840193" watchObservedRunningTime="2026-03-18 08:28:21.634237814 +0000 UTC m=+6086.575392568" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.558152 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dcd485b5c-dbd4p"] Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.561331 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.564163 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.569393 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dcd485b5c-dbd4p"] Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.695079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-openstack-cell1\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.695145 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " 
pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.695173 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-dns-svc\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.695316 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.695377 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-config\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.695656 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9vp8\" (UniqueName: \"kubernetes.io/projected/9481fc03-81d6-4d7e-92ce-d2854306754f-kube-api-access-v9vp8\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.740514 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dcd485b5c-dbd4p"] Mar 18 08:28:23 crc kubenswrapper[4917]: E0318 08:28:23.741438 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc 
kube-api-access-v9vp8 openstack-cell1 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" podUID="9481fc03-81d6-4d7e-92ce-d2854306754f" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.772107 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68c69f87f-zcd25"] Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.774384 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.792570 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68c69f87f-zcd25"] Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.800883 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.800939 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-dns-svc\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.800999 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.801036 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-config\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.801108 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9vp8\" (UniqueName: \"kubernetes.io/projected/9481fc03-81d6-4d7e-92ce-d2854306754f-kube-api-access-v9vp8\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.801160 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-openstack-cell1\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.802101 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-openstack-cell1\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.802361 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-sb\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.802696 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-config\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.803169 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-nb\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.803767 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-dns-svc\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.819961 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9vp8\" (UniqueName: \"kubernetes.io/projected/9481fc03-81d6-4d7e-92ce-d2854306754f-kube-api-access-v9vp8\") pod \"dnsmasq-dns-5dcd485b5c-dbd4p\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.902931 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-ovsdbserver-nb\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.903025 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-config\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.903256 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdtz\" (UniqueName: \"kubernetes.io/projected/3a7032b3-8006-4159-ad39-67fd51cbda21-kube-api-access-2pdtz\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.903361 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-dns-svc\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.903428 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-openstack-cell1\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:23 crc kubenswrapper[4917]: I0318 08:28:23.903464 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-ovsdbserver-sb\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.005860 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-ovsdbserver-nb\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.006233 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-config\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.006389 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdtz\" (UniqueName: \"kubernetes.io/projected/3a7032b3-8006-4159-ad39-67fd51cbda21-kube-api-access-2pdtz\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.006463 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-dns-svc\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.006517 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-openstack-cell1\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.006549 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-ovsdbserver-sb\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.007424 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-ovsdbserver-nb\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.007565 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-ovsdbserver-sb\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.009248 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-dns-svc\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.010696 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-config\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.010926 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3a7032b3-8006-4159-ad39-67fd51cbda21-openstack-cell1\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: 
\"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.033863 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdtz\" (UniqueName: \"kubernetes.io/projected/3a7032b3-8006-4159-ad39-67fd51cbda21-kube-api-access-2pdtz\") pod \"dnsmasq-dns-68c69f87f-zcd25\" (UID: \"3a7032b3-8006-4159-ad39-67fd51cbda21\") " pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.099270 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.392392 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl"] Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.394052 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.403358 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.403489 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.404024 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.404185 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.416928 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl"] Mar 18 08:28:24 crc 
kubenswrapper[4917]: I0318 08:28:24.514016 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dgvv\" (UniqueName: \"kubernetes.io/projected/44b6d485-9af3-465d-82a5-076b82497187-kube-api-access-2dgvv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.514139 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.514178 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.514244 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc 
kubenswrapper[4917]: I0318 08:28:24.615788 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.618738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.619000 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.619240 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dgvv\" (UniqueName: \"kubernetes.io/projected/44b6d485-9af3-465d-82a5-076b82497187-kube-api-access-2dgvv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: W0318 08:28:24.619242 4917 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a7032b3_8006_4159_ad39_67fd51cbda21.slice/crio-b006f0344ee64c3bf032e606af6f25b4b7658507f10b26fbf46bec3ecb0c984a WatchSource:0}: Error finding container b006f0344ee64c3bf032e606af6f25b4b7658507f10b26fbf46bec3ecb0c984a: Status 404 returned error can't find the container with id b006f0344ee64c3bf032e606af6f25b4b7658507f10b26fbf46bec3ecb0c984a Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.624971 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.625058 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.626490 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.627793 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.628402 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68c69f87f-zcd25"] Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.639980 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dgvv\" (UniqueName: \"kubernetes.io/projected/44b6d485-9af3-465d-82a5-076b82497187-kube-api-access-2dgvv\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.718112 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.752718 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.825475 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-sb\") pod \"9481fc03-81d6-4d7e-92ce-d2854306754f\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.827158 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9vp8\" (UniqueName: \"kubernetes.io/projected/9481fc03-81d6-4d7e-92ce-d2854306754f-kube-api-access-v9vp8\") pod \"9481fc03-81d6-4d7e-92ce-d2854306754f\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.827190 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-openstack-cell1\") pod \"9481fc03-81d6-4d7e-92ce-d2854306754f\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.827249 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-dns-svc\") pod \"9481fc03-81d6-4d7e-92ce-d2854306754f\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.827274 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-nb\") pod \"9481fc03-81d6-4d7e-92ce-d2854306754f\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.827340 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-config\") pod \"9481fc03-81d6-4d7e-92ce-d2854306754f\" (UID: \"9481fc03-81d6-4d7e-92ce-d2854306754f\") " Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.825999 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9481fc03-81d6-4d7e-92ce-d2854306754f" (UID: "9481fc03-81d6-4d7e-92ce-d2854306754f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.830797 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-config" (OuterVolumeSpecName: "config") pod "9481fc03-81d6-4d7e-92ce-d2854306754f" (UID: "9481fc03-81d6-4d7e-92ce-d2854306754f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.831405 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9481fc03-81d6-4d7e-92ce-d2854306754f" (UID: "9481fc03-81d6-4d7e-92ce-d2854306754f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.831675 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9481fc03-81d6-4d7e-92ce-d2854306754f" (UID: "9481fc03-81d6-4d7e-92ce-d2854306754f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.831681 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "9481fc03-81d6-4d7e-92ce-d2854306754f" (UID: "9481fc03-81d6-4d7e-92ce-d2854306754f"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.834960 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9481fc03-81d6-4d7e-92ce-d2854306754f-kube-api-access-v9vp8" (OuterVolumeSpecName: "kube-api-access-v9vp8") pod "9481fc03-81d6-4d7e-92ce-d2854306754f" (UID: "9481fc03-81d6-4d7e-92ce-d2854306754f"). InnerVolumeSpecName "kube-api-access-v9vp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.931794 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.931828 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9vp8\" (UniqueName: \"kubernetes.io/projected/9481fc03-81d6-4d7e-92ce-d2854306754f-kube-api-access-v9vp8\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.931841 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.931856 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.931866 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:24 crc kubenswrapper[4917]: I0318 08:28:24.931877 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9481fc03-81d6-4d7e-92ce-d2854306754f-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:25 crc kubenswrapper[4917]: I0318 08:28:25.281881 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl"] Mar 18 08:28:25 crc kubenswrapper[4917]: W0318 08:28:25.286438 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44b6d485_9af3_465d_82a5_076b82497187.slice/crio-8b145431e15c9715a887e2aa9633f9953d3367f004f3afc85f1d9d1dcce46046 WatchSource:0}: Error finding container 8b145431e15c9715a887e2aa9633f9953d3367f004f3afc85f1d9d1dcce46046: Status 404 returned error can't find the container with id 8b145431e15c9715a887e2aa9633f9953d3367f004f3afc85f1d9d1dcce46046 Mar 18 08:28:25 crc kubenswrapper[4917]: I0318 08:28:25.646153 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" event={"ID":"44b6d485-9af3-465d-82a5-076b82497187","Type":"ContainerStarted","Data":"8b145431e15c9715a887e2aa9633f9953d3367f004f3afc85f1d9d1dcce46046"} Mar 18 08:28:25 crc kubenswrapper[4917]: I0318 08:28:25.648537 4917 generic.go:334] "Generic (PLEG): container finished" podID="3a7032b3-8006-4159-ad39-67fd51cbda21" containerID="14e39bdd88db8ce4580ad2144192ffe92897707ee4c653fb1c5480af7dddce31" exitCode=0 Mar 18 08:28:25 crc 
kubenswrapper[4917]: I0318 08:28:25.648635 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c69f87f-zcd25" event={"ID":"3a7032b3-8006-4159-ad39-67fd51cbda21","Type":"ContainerDied","Data":"14e39bdd88db8ce4580ad2144192ffe92897707ee4c653fb1c5480af7dddce31"} Mar 18 08:28:25 crc kubenswrapper[4917]: I0318 08:28:25.648680 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c69f87f-zcd25" event={"ID":"3a7032b3-8006-4159-ad39-67fd51cbda21","Type":"ContainerStarted","Data":"b006f0344ee64c3bf032e606af6f25b4b7658507f10b26fbf46bec3ecb0c984a"} Mar 18 08:28:25 crc kubenswrapper[4917]: I0318 08:28:25.648694 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dcd485b5c-dbd4p" Mar 18 08:28:25 crc kubenswrapper[4917]: I0318 08:28:25.698058 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 18 08:28:26 crc kubenswrapper[4917]: I0318 08:28:26.004967 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dcd485b5c-dbd4p"] Mar 18 08:28:26 crc kubenswrapper[4917]: I0318 08:28:26.020678 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dcd485b5c-dbd4p"] Mar 18 08:28:26 crc kubenswrapper[4917]: I0318 08:28:26.661204 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68c69f87f-zcd25" event={"ID":"3a7032b3-8006-4159-ad39-67fd51cbda21","Type":"ContainerStarted","Data":"423b1a97d2c9de1223a5a61a538fa8db9ceb632aa12252c8fb5255ac23e2bfd9"} Mar 18 08:28:26 crc kubenswrapper[4917]: I0318 08:28:26.661458 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:26 crc kubenswrapper[4917]: I0318 08:28:26.688887 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68c69f87f-zcd25" podStartSLOduration=3.688865 
podStartE2EDuration="3.688865s" podCreationTimestamp="2026-03-18 08:28:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 08:28:26.679784411 +0000 UTC m=+6091.620939155" watchObservedRunningTime="2026-03-18 08:28:26.688865 +0000 UTC m=+6091.630019724" Mar 18 08:28:27 crc kubenswrapper[4917]: I0318 08:28:27.793215 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9481fc03-81d6-4d7e-92ce-d2854306754f" path="/var/lib/kubelet/pods/9481fc03-81d6-4d7e-92ce-d2854306754f/volumes" Mar 18 08:28:28 crc kubenswrapper[4917]: I0318 08:28:28.060324 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-e690-account-create-update-xnq4k"] Mar 18 08:28:28 crc kubenswrapper[4917]: I0318 08:28:28.070966 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gm7kh"] Mar 18 08:28:28 crc kubenswrapper[4917]: I0318 08:28:28.079402 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-e690-account-create-update-xnq4k"] Mar 18 08:28:28 crc kubenswrapper[4917]: I0318 08:28:28.086546 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gm7kh"] Mar 18 08:28:29 crc kubenswrapper[4917]: I0318 08:28:29.783397 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1d2452-5873-42b8-9c28-a00cb39bc927" path="/var/lib/kubelet/pods/0b1d2452-5873-42b8-9c28-a00cb39bc927/volumes" Mar 18 08:28:29 crc kubenswrapper[4917]: I0318 08:28:29.784076 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82e0e08f-d0a3-4279-b848-c9a7e6c1e62b" path="/var/lib/kubelet/pods/82e0e08f-d0a3-4279-b848-c9a7e6c1e62b/volumes" Mar 18 08:28:32 crc kubenswrapper[4917]: I0318 08:28:32.928748 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:28:32 crc kubenswrapper[4917]: I0318 08:28:32.929229 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:28:32 crc kubenswrapper[4917]: I0318 08:28:32.929272 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:28:32 crc kubenswrapper[4917]: I0318 08:28:32.930018 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:28:32 crc kubenswrapper[4917]: I0318 08:28:32.930070 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" gracePeriod=600 Mar 18 08:28:33 crc kubenswrapper[4917]: I0318 08:28:33.739074 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" exitCode=0 Mar 18 08:28:33 crc kubenswrapper[4917]: I0318 08:28:33.739121 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b"} Mar 18 08:28:33 crc kubenswrapper[4917]: I0318 08:28:33.739178 4917 scope.go:117] "RemoveContainer" containerID="2d945d8f3fb288f4abf13f7d56da1feb3070f1b5588474b04c510533002bdbef" Mar 18 08:28:34 crc kubenswrapper[4917]: I0318 08:28:34.100730 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68c69f87f-zcd25" Mar 18 08:28:34 crc kubenswrapper[4917]: I0318 08:28:34.165708 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5645c74dc7-9k229"] Mar 18 08:28:34 crc kubenswrapper[4917]: I0318 08:28:34.166241 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" podUID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" containerName="dnsmasq-dns" containerID="cri-o://451583e8f44362961f425c6156cd9956e9f550da9110cb946c3301a38ed2924b" gracePeriod=10 Mar 18 08:28:34 crc kubenswrapper[4917]: I0318 08:28:34.750250 4917 generic.go:334] "Generic (PLEG): container finished" podID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" containerID="451583e8f44362961f425c6156cd9956e9f550da9110cb946c3301a38ed2924b" exitCode=0 Mar 18 08:28:34 crc kubenswrapper[4917]: I0318 08:28:34.750293 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" event={"ID":"a87fcd6c-0fa1-46c8-b3e4-460366737a7e","Type":"ContainerDied","Data":"451583e8f44362961f425c6156cd9956e9f550da9110cb946c3301a38ed2924b"} Mar 18 08:28:35 crc kubenswrapper[4917]: E0318 08:28:35.599316 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:28:35 crc kubenswrapper[4917]: I0318 08:28:35.785460 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:28:35 crc kubenswrapper[4917]: E0318 08:28:35.786070 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.026105 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.064675 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-config\") pod \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.064747 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-sb\") pod \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.064856 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-dns-svc\") pod \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\" (UID: 
\"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.064891 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz2mc\" (UniqueName: \"kubernetes.io/projected/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-kube-api-access-cz2mc\") pod \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.064920 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-nb\") pod \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\" (UID: \"a87fcd6c-0fa1-46c8-b3e4-460366737a7e\") " Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.073840 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-kube-api-access-cz2mc" (OuterVolumeSpecName: "kube-api-access-cz2mc") pod "a87fcd6c-0fa1-46c8-b3e4-460366737a7e" (UID: "a87fcd6c-0fa1-46c8-b3e4-460366737a7e"). InnerVolumeSpecName "kube-api-access-cz2mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.128891 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a87fcd6c-0fa1-46c8-b3e4-460366737a7e" (UID: "a87fcd6c-0fa1-46c8-b3e4-460366737a7e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.137661 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a87fcd6c-0fa1-46c8-b3e4-460366737a7e" (UID: "a87fcd6c-0fa1-46c8-b3e4-460366737a7e"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.141468 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-config" (OuterVolumeSpecName: "config") pod "a87fcd6c-0fa1-46c8-b3e4-460366737a7e" (UID: "a87fcd6c-0fa1-46c8-b3e4-460366737a7e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.164415 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a87fcd6c-0fa1-46c8-b3e4-460366737a7e" (UID: "a87fcd6c-0fa1-46c8-b3e4-460366737a7e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.167060 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.167081 4917 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.167092 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz2mc\" (UniqueName: \"kubernetes.io/projected/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-kube-api-access-cz2mc\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.167100 4917 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-ovsdbserver-nb\") on 
node \"crc\" DevicePath \"\"" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.167109 4917 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a87fcd6c-0fa1-46c8-b3e4-460366737a7e-config\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.805199 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" event={"ID":"44b6d485-9af3-465d-82a5-076b82497187","Type":"ContainerStarted","Data":"7b4e55c50e3470c1e1a680be819362f953643cd7f08f6bfbcbafd8b229142e43"} Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.808242 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" event={"ID":"a87fcd6c-0fa1-46c8-b3e4-460366737a7e","Type":"ContainerDied","Data":"a8d7ac0c39c08add89fc47cac6a1f0f3582313f1c07bbcc905b8acd8d14c30ac"} Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.808310 4917 scope.go:117] "RemoveContainer" containerID="451583e8f44362961f425c6156cd9956e9f550da9110cb946c3301a38ed2924b" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.808515 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5645c74dc7-9k229" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.828883 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" podStartSLOduration=2.413513681 podStartE2EDuration="12.828867809s" podCreationTimestamp="2026-03-18 08:28:24 +0000 UTC" firstStartedPulling="2026-03-18 08:28:25.288931978 +0000 UTC m=+6090.230086692" lastFinishedPulling="2026-03-18 08:28:35.704286106 +0000 UTC m=+6100.645440820" observedRunningTime="2026-03-18 08:28:36.821696885 +0000 UTC m=+6101.762851679" watchObservedRunningTime="2026-03-18 08:28:36.828867809 +0000 UTC m=+6101.770022523" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.854008 4917 scope.go:117] "RemoveContainer" containerID="daa4227352c09a3b53e7569f0ab7e8e74ce833b89b2ccf1c13c729d4285c5b8c" Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.863953 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5645c74dc7-9k229"] Mar 18 08:28:36 crc kubenswrapper[4917]: I0318 08:28:36.874865 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5645c74dc7-9k229"] Mar 18 08:28:37 crc kubenswrapper[4917]: I0318 08:28:37.792388 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" path="/var/lib/kubelet/pods/a87fcd6c-0fa1-46c8-b3e4-460366737a7e/volumes" Mar 18 08:28:47 crc kubenswrapper[4917]: I0318 08:28:47.773258 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:28:47 crc kubenswrapper[4917]: E0318 08:28:47.774197 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:28:49 crc kubenswrapper[4917]: I0318 08:28:49.984506 4917 generic.go:334] "Generic (PLEG): container finished" podID="44b6d485-9af3-465d-82a5-076b82497187" containerID="7b4e55c50e3470c1e1a680be819362f953643cd7f08f6bfbcbafd8b229142e43" exitCode=0 Mar 18 08:28:49 crc kubenswrapper[4917]: I0318 08:28:49.984577 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" event={"ID":"44b6d485-9af3-465d-82a5-076b82497187","Type":"ContainerDied","Data":"7b4e55c50e3470c1e1a680be819362f953643cd7f08f6bfbcbafd8b229142e43"} Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.446786 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.518339 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-pre-adoption-validation-combined-ca-bundle\") pod \"44b6d485-9af3-465d-82a5-076b82497187\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.518447 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-inventory\") pod \"44b6d485-9af3-465d-82a5-076b82497187\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.518511 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dgvv\" (UniqueName: 
\"kubernetes.io/projected/44b6d485-9af3-465d-82a5-076b82497187-kube-api-access-2dgvv\") pod \"44b6d485-9af3-465d-82a5-076b82497187\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.518538 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-ssh-key-openstack-cell1\") pod \"44b6d485-9af3-465d-82a5-076b82497187\" (UID: \"44b6d485-9af3-465d-82a5-076b82497187\") " Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.523731 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "44b6d485-9af3-465d-82a5-076b82497187" (UID: "44b6d485-9af3-465d-82a5-076b82497187"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.524449 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b6d485-9af3-465d-82a5-076b82497187-kube-api-access-2dgvv" (OuterVolumeSpecName: "kube-api-access-2dgvv") pod "44b6d485-9af3-465d-82a5-076b82497187" (UID: "44b6d485-9af3-465d-82a5-076b82497187"). InnerVolumeSpecName "kube-api-access-2dgvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.549334 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-inventory" (OuterVolumeSpecName: "inventory") pod "44b6d485-9af3-465d-82a5-076b82497187" (UID: "44b6d485-9af3-465d-82a5-076b82497187"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.549881 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "44b6d485-9af3-465d-82a5-076b82497187" (UID: "44b6d485-9af3-465d-82a5-076b82497187"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.621431 4917 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.621464 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.621475 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dgvv\" (UniqueName: \"kubernetes.io/projected/44b6d485-9af3-465d-82a5-076b82497187-kube-api-access-2dgvv\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:51 crc kubenswrapper[4917]: I0318 08:28:51.621485 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/44b6d485-9af3-465d-82a5-076b82497187-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:28:52 crc kubenswrapper[4917]: I0318 08:28:52.013258 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" 
event={"ID":"44b6d485-9af3-465d-82a5-076b82497187","Type":"ContainerDied","Data":"8b145431e15c9715a887e2aa9633f9953d3367f004f3afc85f1d9d1dcce46046"} Mar 18 08:28:52 crc kubenswrapper[4917]: I0318 08:28:52.013318 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b145431e15c9715a887e2aa9633f9953d3367f004f3afc85f1d9d1dcce46046" Mar 18 08:28:52 crc kubenswrapper[4917]: I0318 08:28:52.013396 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.292733 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj"] Mar 18 08:28:56 crc kubenswrapper[4917]: E0318 08:28:56.293883 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b6d485-9af3-465d-82a5-076b82497187" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.293904 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b6d485-9af3-465d-82a5-076b82497187" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 18 08:28:56 crc kubenswrapper[4917]: E0318 08:28:56.293951 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" containerName="init" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.293961 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" containerName="init" Mar 18 08:28:56 crc kubenswrapper[4917]: E0318 08:28:56.293973 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" containerName="dnsmasq-dns" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.293980 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" containerName="dnsmasq-dns" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.294201 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87fcd6c-0fa1-46c8-b3e4-460366737a7e" containerName="dnsmasq-dns" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.294230 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b6d485-9af3-465d-82a5-076b82497187" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.295065 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.297062 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.297788 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.298120 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.298608 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.306573 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj"] Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.422447 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: 
\"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.422545 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.422661 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2cc7\" (UniqueName: \"kubernetes.io/projected/afc5f89c-2aa9-4a99-8fda-794ab379f98a-kube-api-access-d2cc7\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.422702 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.524939 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc 
kubenswrapper[4917]: I0318 08:28:56.525119 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.525215 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2cc7\" (UniqueName: \"kubernetes.io/projected/afc5f89c-2aa9-4a99-8fda-794ab379f98a-kube-api-access-d2cc7\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.525391 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.539259 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.539513 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-ssh-key-openstack-cell1\") pod 
\"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.540191 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.542622 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2cc7\" (UniqueName: \"kubernetes.io/projected/afc5f89c-2aa9-4a99-8fda-794ab379f98a-kube-api-access-d2cc7\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:56 crc kubenswrapper[4917]: I0318 08:28:56.624172 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:28:57 crc kubenswrapper[4917]: I0318 08:28:57.219391 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj"] Mar 18 08:28:58 crc kubenswrapper[4917]: I0318 08:28:58.052639 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-k7f54"] Mar 18 08:28:58 crc kubenswrapper[4917]: I0318 08:28:58.063099 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-k7f54"] Mar 18 08:28:58 crc kubenswrapper[4917]: I0318 08:28:58.103166 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" event={"ID":"afc5f89c-2aa9-4a99-8fda-794ab379f98a","Type":"ContainerStarted","Data":"71e5c818b04e35a627659c7237cf34f3f4b23bc99404b2fc84e0d59f20beb4df"} Mar 18 08:28:58 crc kubenswrapper[4917]: I0318 08:28:58.103215 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" event={"ID":"afc5f89c-2aa9-4a99-8fda-794ab379f98a","Type":"ContainerStarted","Data":"e285b92613bef04d979b0fa882b982ccd7fdbddba44b6c4669788980dafc2a9c"} Mar 18 08:28:58 crc kubenswrapper[4917]: I0318 08:28:58.134407 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" podStartSLOduration=1.71007836 podStartE2EDuration="2.13438686s" podCreationTimestamp="2026-03-18 08:28:56 +0000 UTC" firstStartedPulling="2026-03-18 08:28:57.233754108 +0000 UTC m=+6122.174908842" lastFinishedPulling="2026-03-18 08:28:57.658062628 +0000 UTC m=+6122.599217342" observedRunningTime="2026-03-18 08:28:58.125103155 +0000 UTC m=+6123.066257889" watchObservedRunningTime="2026-03-18 08:28:58.13438686 +0000 UTC m=+6123.075541584" Mar 18 08:28:58 crc kubenswrapper[4917]: I0318 08:28:58.773314 4917 
scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:28:58 crc kubenswrapper[4917]: E0318 08:28:58.774290 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:28:59 crc kubenswrapper[4917]: I0318 08:28:59.788361 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c6bdbaf-275f-43d6-b502-37e885457649" path="/var/lib/kubelet/pods/3c6bdbaf-275f-43d6-b502-37e885457649/volumes" Mar 18 08:29:10 crc kubenswrapper[4917]: I0318 08:29:10.773013 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:29:10 crc kubenswrapper[4917]: E0318 08:29:10.773955 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:29:11 crc kubenswrapper[4917]: I0318 08:29:11.366657 4917 scope.go:117] "RemoveContainer" containerID="09387f4abe131d4c4d1ea2f47aa2b51b59892523e42bc89acbb47cfc7db3d531" Mar 18 08:29:11 crc kubenswrapper[4917]: I0318 08:29:11.388414 4917 scope.go:117] "RemoveContainer" containerID="8aeb64520d48c0a4cc6d7c6710c1fc281898f64f7a729e8b6529cb874e37ffa0" Mar 18 08:29:11 crc kubenswrapper[4917]: I0318 08:29:11.485769 4917 scope.go:117] "RemoveContainer" 
containerID="656c593bede78f589afef3d6f8137261b7ec3a8c8d9d0914356baf2172bd0b58" Mar 18 08:29:22 crc kubenswrapper[4917]: I0318 08:29:22.772903 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:29:22 crc kubenswrapper[4917]: E0318 08:29:22.775133 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:29:30 crc kubenswrapper[4917]: I0318 08:29:30.056474 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-vmff7"] Mar 18 08:29:30 crc kubenswrapper[4917]: I0318 08:29:30.065392 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3ec6-account-create-update-mg7k5"] Mar 18 08:29:30 crc kubenswrapper[4917]: I0318 08:29:30.073310 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-vmff7"] Mar 18 08:29:30 crc kubenswrapper[4917]: I0318 08:29:30.081347 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3ec6-account-create-update-mg7k5"] Mar 18 08:29:31 crc kubenswrapper[4917]: I0318 08:29:31.791030 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0464b93f-ba11-4e92-8b47-5eac3f0e08c6" path="/var/lib/kubelet/pods/0464b93f-ba11-4e92-8b47-5eac3f0e08c6/volumes" Mar 18 08:29:31 crc kubenswrapper[4917]: I0318 08:29:31.792113 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba4b545-6574-461b-88fb-11801374835a" path="/var/lib/kubelet/pods/6ba4b545-6574-461b-88fb-11801374835a/volumes" Mar 18 08:29:37 crc kubenswrapper[4917]: I0318 08:29:37.773313 4917 
scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:29:37 crc kubenswrapper[4917]: E0318 08:29:37.775491 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:29:40 crc kubenswrapper[4917]: I0318 08:29:40.046578 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-22dvx"] Mar 18 08:29:40 crc kubenswrapper[4917]: I0318 08:29:40.058566 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-22dvx"] Mar 18 08:29:41 crc kubenswrapper[4917]: I0318 08:29:41.788809 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db65ec91-ca07-49f0-8863-4c1a210f92b4" path="/var/lib/kubelet/pods/db65ec91-ca07-49f0-8863-4c1a210f92b4/volumes" Mar 18 08:29:48 crc kubenswrapper[4917]: I0318 08:29:48.772849 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:29:48 crc kubenswrapper[4917]: E0318 08:29:48.774428 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.176685 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563710-vtsvw"] Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.179598 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563710-vtsvw" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.183542 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.183759 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.183859 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.190669 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp"] Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.192286 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.197068 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.197215 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.219631 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563710-vtsvw"] Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.248675 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp"] Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.348786 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b9f4ce5-4103-41a4-8874-39c1d4697c45-config-volume\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.348845 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppzfd\" (UniqueName: \"kubernetes.io/projected/4b9f4ce5-4103-41a4-8874-39c1d4697c45-kube-api-access-ppzfd\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.348903 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmtmp\" (UniqueName: 
\"kubernetes.io/projected/ba68bd60-c600-4887-a9fd-ba58eea4b3f0-kube-api-access-fmtmp\") pod \"auto-csr-approver-29563710-vtsvw\" (UID: \"ba68bd60-c600-4887-a9fd-ba58eea4b3f0\") " pod="openshift-infra/auto-csr-approver-29563710-vtsvw" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.348931 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b9f4ce5-4103-41a4-8874-39c1d4697c45-secret-volume\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.451031 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b9f4ce5-4103-41a4-8874-39c1d4697c45-config-volume\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.451103 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppzfd\" (UniqueName: \"kubernetes.io/projected/4b9f4ce5-4103-41a4-8874-39c1d4697c45-kube-api-access-ppzfd\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.451189 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmtmp\" (UniqueName: \"kubernetes.io/projected/ba68bd60-c600-4887-a9fd-ba58eea4b3f0-kube-api-access-fmtmp\") pod \"auto-csr-approver-29563710-vtsvw\" (UID: \"ba68bd60-c600-4887-a9fd-ba58eea4b3f0\") " pod="openshift-infra/auto-csr-approver-29563710-vtsvw" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 
08:30:00.451236 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b9f4ce5-4103-41a4-8874-39c1d4697c45-secret-volume\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.452131 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b9f4ce5-4103-41a4-8874-39c1d4697c45-config-volume\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.469590 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b9f4ce5-4103-41a4-8874-39c1d4697c45-secret-volume\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.475354 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppzfd\" (UniqueName: \"kubernetes.io/projected/4b9f4ce5-4103-41a4-8874-39c1d4697c45-kube-api-access-ppzfd\") pod \"collect-profiles-29563710-lfvrp\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.477655 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmtmp\" (UniqueName: \"kubernetes.io/projected/ba68bd60-c600-4887-a9fd-ba58eea4b3f0-kube-api-access-fmtmp\") pod \"auto-csr-approver-29563710-vtsvw\" (UID: \"ba68bd60-c600-4887-a9fd-ba58eea4b3f0\") " 
pod="openshift-infra/auto-csr-approver-29563710-vtsvw" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.510632 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563710-vtsvw" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.531900 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:00 crc kubenswrapper[4917]: I0318 08:30:00.977632 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563710-vtsvw"] Mar 18 08:30:00 crc kubenswrapper[4917]: W0318 08:30:00.979884 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba68bd60_c600_4887_a9fd_ba58eea4b3f0.slice/crio-a36add9be560ae36454e2613fbe946f65c408de17eefd5cf4b3a1e65f954d510 WatchSource:0}: Error finding container a36add9be560ae36454e2613fbe946f65c408de17eefd5cf4b3a1e65f954d510: Status 404 returned error can't find the container with id a36add9be560ae36454e2613fbe946f65c408de17eefd5cf4b3a1e65f954d510 Mar 18 08:30:01 crc kubenswrapper[4917]: I0318 08:30:01.076265 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp"] Mar 18 08:30:01 crc kubenswrapper[4917]: I0318 08:30:01.773596 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:30:01 crc kubenswrapper[4917]: E0318 08:30:01.775075 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:30:01 crc kubenswrapper[4917]: I0318 08:30:01.931171 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563710-vtsvw" event={"ID":"ba68bd60-c600-4887-a9fd-ba58eea4b3f0","Type":"ContainerStarted","Data":"a36add9be560ae36454e2613fbe946f65c408de17eefd5cf4b3a1e65f954d510"} Mar 18 08:30:01 crc kubenswrapper[4917]: I0318 08:30:01.933542 4917 generic.go:334] "Generic (PLEG): container finished" podID="4b9f4ce5-4103-41a4-8874-39c1d4697c45" containerID="436adbffa75c926bea21413d97049e7a68a9cc3d5ebec46fd13a9b904ad67375" exitCode=0 Mar 18 08:30:01 crc kubenswrapper[4917]: I0318 08:30:01.933617 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" event={"ID":"4b9f4ce5-4103-41a4-8874-39c1d4697c45","Type":"ContainerDied","Data":"436adbffa75c926bea21413d97049e7a68a9cc3d5ebec46fd13a9b904ad67375"} Mar 18 08:30:01 crc kubenswrapper[4917]: I0318 08:30:01.933651 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" event={"ID":"4b9f4ce5-4103-41a4-8874-39c1d4697c45","Type":"ContainerStarted","Data":"8dad4ad29d397e1c9fe4976c6f892beba6faef3d5d27a9f28149e5b6cf1cade5"} Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.291940 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.414343 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppzfd\" (UniqueName: \"kubernetes.io/projected/4b9f4ce5-4103-41a4-8874-39c1d4697c45-kube-api-access-ppzfd\") pod \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.414462 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b9f4ce5-4103-41a4-8874-39c1d4697c45-config-volume\") pod \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.414689 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b9f4ce5-4103-41a4-8874-39c1d4697c45-secret-volume\") pod \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\" (UID: \"4b9f4ce5-4103-41a4-8874-39c1d4697c45\") " Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.416366 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b9f4ce5-4103-41a4-8874-39c1d4697c45-config-volume" (OuterVolumeSpecName: "config-volume") pod "4b9f4ce5-4103-41a4-8874-39c1d4697c45" (UID: "4b9f4ce5-4103-41a4-8874-39c1d4697c45"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.419756 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b9f4ce5-4103-41a4-8874-39c1d4697c45-kube-api-access-ppzfd" (OuterVolumeSpecName: "kube-api-access-ppzfd") pod "4b9f4ce5-4103-41a4-8874-39c1d4697c45" (UID: "4b9f4ce5-4103-41a4-8874-39c1d4697c45"). 
InnerVolumeSpecName "kube-api-access-ppzfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.420817 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9f4ce5-4103-41a4-8874-39c1d4697c45-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4b9f4ce5-4103-41a4-8874-39c1d4697c45" (UID: "4b9f4ce5-4103-41a4-8874-39c1d4697c45"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.517085 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b9f4ce5-4103-41a4-8874-39c1d4697c45-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.517565 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppzfd\" (UniqueName: \"kubernetes.io/projected/4b9f4ce5-4103-41a4-8874-39c1d4697c45-kube-api-access-ppzfd\") on node \"crc\" DevicePath \"\"" Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.517601 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b9f4ce5-4103-41a4-8874-39c1d4697c45-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.964378 4917 generic.go:334] "Generic (PLEG): container finished" podID="ba68bd60-c600-4887-a9fd-ba58eea4b3f0" containerID="ed8a3cc7d33c0f6d756c193594c6355a91f476282cb6db875f7cf119466eebba" exitCode=0 Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.964711 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563710-vtsvw" event={"ID":"ba68bd60-c600-4887-a9fd-ba58eea4b3f0","Type":"ContainerDied","Data":"ed8a3cc7d33c0f6d756c193594c6355a91f476282cb6db875f7cf119466eebba"} Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.967880 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" event={"ID":"4b9f4ce5-4103-41a4-8874-39c1d4697c45","Type":"ContainerDied","Data":"8dad4ad29d397e1c9fe4976c6f892beba6faef3d5d27a9f28149e5b6cf1cade5"} Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.967928 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dad4ad29d397e1c9fe4976c6f892beba6faef3d5d27a9f28149e5b6cf1cade5" Mar 18 08:30:03 crc kubenswrapper[4917]: I0318 08:30:03.968002 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp" Mar 18 08:30:04 crc kubenswrapper[4917]: I0318 08:30:04.400768 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h"] Mar 18 08:30:04 crc kubenswrapper[4917]: I0318 08:30:04.409693 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563665-dr46h"] Mar 18 08:30:05 crc kubenswrapper[4917]: I0318 08:30:05.398138 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563710-vtsvw" Mar 18 08:30:05 crc kubenswrapper[4917]: I0318 08:30:05.458032 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmtmp\" (UniqueName: \"kubernetes.io/projected/ba68bd60-c600-4887-a9fd-ba58eea4b3f0-kube-api-access-fmtmp\") pod \"ba68bd60-c600-4887-a9fd-ba58eea4b3f0\" (UID: \"ba68bd60-c600-4887-a9fd-ba58eea4b3f0\") " Mar 18 08:30:05 crc kubenswrapper[4917]: I0318 08:30:05.466231 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba68bd60-c600-4887-a9fd-ba58eea4b3f0-kube-api-access-fmtmp" (OuterVolumeSpecName: "kube-api-access-fmtmp") pod "ba68bd60-c600-4887-a9fd-ba58eea4b3f0" (UID: "ba68bd60-c600-4887-a9fd-ba58eea4b3f0"). InnerVolumeSpecName "kube-api-access-fmtmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:30:05 crc kubenswrapper[4917]: I0318 08:30:05.560449 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmtmp\" (UniqueName: \"kubernetes.io/projected/ba68bd60-c600-4887-a9fd-ba58eea4b3f0-kube-api-access-fmtmp\") on node \"crc\" DevicePath \"\"" Mar 18 08:30:05 crc kubenswrapper[4917]: I0318 08:30:05.790360 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9563789-4f20-4653-a02c-3b1aeb3b7cd7" path="/var/lib/kubelet/pods/c9563789-4f20-4653-a02c-3b1aeb3b7cd7/volumes" Mar 18 08:30:05 crc kubenswrapper[4917]: I0318 08:30:05.996511 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563710-vtsvw" event={"ID":"ba68bd60-c600-4887-a9fd-ba58eea4b3f0","Type":"ContainerDied","Data":"a36add9be560ae36454e2613fbe946f65c408de17eefd5cf4b3a1e65f954d510"} Mar 18 08:30:05 crc kubenswrapper[4917]: I0318 08:30:05.996548 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a36add9be560ae36454e2613fbe946f65c408de17eefd5cf4b3a1e65f954d510" Mar 18 08:30:05 
crc kubenswrapper[4917]: I0318 08:30:05.996677 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563710-vtsvw" Mar 18 08:30:06 crc kubenswrapper[4917]: I0318 08:30:06.464685 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563704-b6dn6"] Mar 18 08:30:06 crc kubenswrapper[4917]: I0318 08:30:06.473198 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563704-b6dn6"] Mar 18 08:30:07 crc kubenswrapper[4917]: I0318 08:30:07.788706 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="733cf5ce-2b5c-49fb-a482-1594f3c449d8" path="/var/lib/kubelet/pods/733cf5ce-2b5c-49fb-a482-1594f3c449d8/volumes" Mar 18 08:30:11 crc kubenswrapper[4917]: I0318 08:30:11.668662 4917 scope.go:117] "RemoveContainer" containerID="c87c6fbdf41abeeb527bd0ee80a5037f1b0ea28c1e70a89a4e2e716e92cf1154" Mar 18 08:30:11 crc kubenswrapper[4917]: I0318 08:30:11.707169 4917 scope.go:117] "RemoveContainer" containerID="d9e0d6c79443b937bb561dffbc9d2f8eae57ec7e81fc0055e4c49e59ba417e12" Mar 18 08:30:11 crc kubenswrapper[4917]: I0318 08:30:11.775550 4917 scope.go:117] "RemoveContainer" containerID="26d4875fd22bf23619e35eb98d60c380db5075bdbd1e5fcb71b11be4033bfdeb" Mar 18 08:30:11 crc kubenswrapper[4917]: I0318 08:30:11.816805 4917 scope.go:117] "RemoveContainer" containerID="a9e70d4d7a0a28fe98fa44ac445d4c17303344845d458cbb1adf656bf91c940d" Mar 18 08:30:11 crc kubenswrapper[4917]: I0318 08:30:11.879023 4917 scope.go:117] "RemoveContainer" containerID="2a468d7267b1c821da59bbc01f67fa4a484b687a8fa6268586999baa993da4e7" Mar 18 08:30:11 crc kubenswrapper[4917]: I0318 08:30:11.917041 4917 scope.go:117] "RemoveContainer" containerID="5a5e2d6de9601a4b348e325efe8c643983d152225f4d268ae22fc57374091857" Mar 18 08:30:12 crc kubenswrapper[4917]: I0318 08:30:12.157392 4917 scope.go:117] "RemoveContainer" 
containerID="71084d56800a918dadab70c67f49ab25742f94e671e9b330a83e074d8835e906" Mar 18 08:30:14 crc kubenswrapper[4917]: I0318 08:30:14.785711 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:30:14 crc kubenswrapper[4917]: E0318 08:30:14.786522 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:30:28 crc kubenswrapper[4917]: I0318 08:30:28.773376 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:30:28 crc kubenswrapper[4917]: E0318 08:30:28.774139 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:30:38 crc kubenswrapper[4917]: I0318 08:30:38.050219 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-fkll6"] Mar 18 08:30:38 crc kubenswrapper[4917]: I0318 08:30:38.064767 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-fkll6"] Mar 18 08:30:38 crc kubenswrapper[4917]: I0318 08:30:38.081378 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-25be-account-create-update-xsm5n"] Mar 18 08:30:38 crc kubenswrapper[4917]: I0318 08:30:38.090153 4917 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-25be-account-create-update-xsm5n"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.067163 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-xfjbc"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.087148 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a0e3-account-create-update-wvbdw"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.107857 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nj8qx"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.126047 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nj8qx"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.138553 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-xfjbc"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.149412 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-d6b7-account-create-update-c8b9k"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.158643 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a0e3-account-create-update-wvbdw"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.166256 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-d6b7-account-create-update-c8b9k"] Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.772908 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:30:39 crc kubenswrapper[4917]: E0318 08:30:39.773643 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.784690 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001a88e1-5ca3-42d1-a8f9-3f7ff1da3612" path="/var/lib/kubelet/pods/001a88e1-5ca3-42d1-a8f9-3f7ff1da3612/volumes" Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.785243 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0549cd0a-fd3c-401a-8caf-9e0348faccb9" path="/var/lib/kubelet/pods/0549cd0a-fd3c-401a-8caf-9e0348faccb9/volumes" Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.785775 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702e1fec-e89d-4ff3-9e73-25e19d2d1fdd" path="/var/lib/kubelet/pods/702e1fec-e89d-4ff3-9e73-25e19d2d1fdd/volumes" Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.786371 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91286214-725d-4737-b3d0-ddc623a822a5" path="/var/lib/kubelet/pods/91286214-725d-4737-b3d0-ddc623a822a5/volumes" Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.787417 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2741122-0601-450f-832a-ce3d4f5f3d01" path="/var/lib/kubelet/pods/d2741122-0601-450f-832a-ce3d4f5f3d01/volumes" Mar 18 08:30:39 crc kubenswrapper[4917]: I0318 08:30:39.787996 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e472482c-2d27-49bc-816c-29e6cd155b1d" path="/var/lib/kubelet/pods/e472482c-2d27-49bc-816c-29e6cd155b1d/volumes" Mar 18 08:30:54 crc kubenswrapper[4917]: I0318 08:30:54.773262 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:30:54 crc kubenswrapper[4917]: E0318 08:30:54.774200 4917 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:31:01 crc kubenswrapper[4917]: I0318 08:31:01.058108 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrwwh"] Mar 18 08:31:01 crc kubenswrapper[4917]: I0318 08:31:01.068694 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mrwwh"] Mar 18 08:31:01 crc kubenswrapper[4917]: I0318 08:31:01.787427 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed9261d-f69b-4758-97e2-156e4ad13843" path="/var/lib/kubelet/pods/3ed9261d-f69b-4758-97e2-156e4ad13843/volumes" Mar 18 08:31:08 crc kubenswrapper[4917]: I0318 08:31:08.773248 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:31:08 crc kubenswrapper[4917]: E0318 08:31:08.774023 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:31:12 crc kubenswrapper[4917]: I0318 08:31:12.282697 4917 scope.go:117] "RemoveContainer" containerID="5f0678f82975df2dc116603e4b78f74d3b22b82db440e4024b19859d27f1ebf7" Mar 18 08:31:12 crc kubenswrapper[4917]: I0318 08:31:12.314012 4917 scope.go:117] "RemoveContainer" 
containerID="06d880291bfbccd6b0c51cb447fbaddd8bebd8e00c143310d62f7d3c313ac133" Mar 18 08:31:12 crc kubenswrapper[4917]: I0318 08:31:12.383509 4917 scope.go:117] "RemoveContainer" containerID="4f95bf531ae7f8eda3c1949ee691b3a05cc67a9583bc60a9e9b52cefea3d7d4a" Mar 18 08:31:12 crc kubenswrapper[4917]: I0318 08:31:12.434954 4917 scope.go:117] "RemoveContainer" containerID="a07cba8f2bf2f194d04b9cc291b68f59a5f81b62b1efce386af26283985188a0" Mar 18 08:31:12 crc kubenswrapper[4917]: I0318 08:31:12.505793 4917 scope.go:117] "RemoveContainer" containerID="b664062f8079bd223dc49e33078611e48a8a728cc8a32d56308a3c108cf1c6b5" Mar 18 08:31:12 crc kubenswrapper[4917]: I0318 08:31:12.536960 4917 scope.go:117] "RemoveContainer" containerID="a4f79f03aaef63c09b94a5298ce1defc82d65f122022ab9c9ff89ed5d46f2202" Mar 18 08:31:12 crc kubenswrapper[4917]: I0318 08:31:12.567167 4917 scope.go:117] "RemoveContainer" containerID="609ab9fb6d7e0e7ec7909a91693f05d4960d5ad597a2350644394478e2988fa5" Mar 18 08:31:15 crc kubenswrapper[4917]: I0318 08:31:15.038309 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n6nkf"] Mar 18 08:31:15 crc kubenswrapper[4917]: I0318 08:31:15.048722 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-n6nkf"] Mar 18 08:31:15 crc kubenswrapper[4917]: I0318 08:31:15.802168 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d282ff2-1c74-458b-a879-c01075f8f136" path="/var/lib/kubelet/pods/4d282ff2-1c74-458b-a879-c01075f8f136/volumes" Mar 18 08:31:16 crc kubenswrapper[4917]: I0318 08:31:16.034000 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-r4467"] Mar 18 08:31:16 crc kubenswrapper[4917]: I0318 08:31:16.042030 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-r4467"] Mar 18 08:31:17 crc kubenswrapper[4917]: I0318 08:31:17.788558 4917 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="884cd94d-843d-4872-b61a-caa5a9f39d3c" path="/var/lib/kubelet/pods/884cd94d-843d-4872-b61a-caa5a9f39d3c/volumes" Mar 18 08:31:19 crc kubenswrapper[4917]: I0318 08:31:19.772920 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:31:19 crc kubenswrapper[4917]: E0318 08:31:19.773518 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:31:34 crc kubenswrapper[4917]: I0318 08:31:34.773091 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:31:34 crc kubenswrapper[4917]: E0318 08:31:34.774523 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:31:45 crc kubenswrapper[4917]: I0318 08:31:45.778542 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:31:45 crc kubenswrapper[4917]: E0318 08:31:45.779576 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:31:56 crc kubenswrapper[4917]: I0318 08:31:56.829259 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:31:56 crc kubenswrapper[4917]: E0318 08:31:56.830898 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.156893 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563712-zmcdj"] Mar 18 08:32:00 crc kubenswrapper[4917]: E0318 08:32:00.157695 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9f4ce5-4103-41a4-8874-39c1d4697c45" containerName="collect-profiles" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.157706 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9f4ce5-4103-41a4-8874-39c1d4697c45" containerName="collect-profiles" Mar 18 08:32:00 crc kubenswrapper[4917]: E0318 08:32:00.157730 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba68bd60-c600-4887-a9fd-ba58eea4b3f0" containerName="oc" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.157737 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba68bd60-c600-4887-a9fd-ba58eea4b3f0" containerName="oc" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.157909 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4b9f4ce5-4103-41a4-8874-39c1d4697c45" containerName="collect-profiles" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.157929 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba68bd60-c600-4887-a9fd-ba58eea4b3f0" containerName="oc" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.158551 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563712-zmcdj" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.165120 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.165409 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.165522 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.166802 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563712-zmcdj"] Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.306824 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdqm4\" (UniqueName: \"kubernetes.io/projected/aa089f54-03e3-45d1-9b06-f703a5c95062-kube-api-access-hdqm4\") pod \"auto-csr-approver-29563712-zmcdj\" (UID: \"aa089f54-03e3-45d1-9b06-f703a5c95062\") " pod="openshift-infra/auto-csr-approver-29563712-zmcdj" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.408891 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdqm4\" (UniqueName: \"kubernetes.io/projected/aa089f54-03e3-45d1-9b06-f703a5c95062-kube-api-access-hdqm4\") pod \"auto-csr-approver-29563712-zmcdj\" (UID: \"aa089f54-03e3-45d1-9b06-f703a5c95062\") " 
pod="openshift-infra/auto-csr-approver-29563712-zmcdj" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.442717 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdqm4\" (UniqueName: \"kubernetes.io/projected/aa089f54-03e3-45d1-9b06-f703a5c95062-kube-api-access-hdqm4\") pod \"auto-csr-approver-29563712-zmcdj\" (UID: \"aa089f54-03e3-45d1-9b06-f703a5c95062\") " pod="openshift-infra/auto-csr-approver-29563712-zmcdj" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.477461 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563712-zmcdj" Mar 18 08:32:00 crc kubenswrapper[4917]: I0318 08:32:00.992468 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563712-zmcdj"] Mar 18 08:32:01 crc kubenswrapper[4917]: W0318 08:32:01.016852 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa089f54_03e3_45d1_9b06_f703a5c95062.slice/crio-bfa57b31697129107ec4b9a9eca0c933c5ce03ff7c6add71f320cbb4884c456d WatchSource:0}: Error finding container bfa57b31697129107ec4b9a9eca0c933c5ce03ff7c6add71f320cbb4884c456d: Status 404 returned error can't find the container with id bfa57b31697129107ec4b9a9eca0c933c5ce03ff7c6add71f320cbb4884c456d Mar 18 08:32:01 crc kubenswrapper[4917]: I0318 08:32:01.020473 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:32:01 crc kubenswrapper[4917]: I0318 08:32:01.299344 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563712-zmcdj" event={"ID":"aa089f54-03e3-45d1-9b06-f703a5c95062","Type":"ContainerStarted","Data":"bfa57b31697129107ec4b9a9eca0c933c5ce03ff7c6add71f320cbb4884c456d"} Mar 18 08:32:02 crc kubenswrapper[4917]: I0318 08:32:02.053877 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-5r96w"] Mar 18 08:32:02 crc kubenswrapper[4917]: I0318 08:32:02.067755 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5r96w"] Mar 18 08:32:03 crc kubenswrapper[4917]: I0318 08:32:03.322362 4917 generic.go:334] "Generic (PLEG): container finished" podID="aa089f54-03e3-45d1-9b06-f703a5c95062" containerID="55fd0823b2fcc11378ca57c8fc4624d7b03dce26497cdb0478f2b860b815b0c6" exitCode=0 Mar 18 08:32:03 crc kubenswrapper[4917]: I0318 08:32:03.322442 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563712-zmcdj" event={"ID":"aa089f54-03e3-45d1-9b06-f703a5c95062","Type":"ContainerDied","Data":"55fd0823b2fcc11378ca57c8fc4624d7b03dce26497cdb0478f2b860b815b0c6"} Mar 18 08:32:03 crc kubenswrapper[4917]: E0318 08:32:03.406725 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa089f54_03e3_45d1_9b06_f703a5c95062.slice/crio-conmon-55fd0823b2fcc11378ca57c8fc4624d7b03dce26497cdb0478f2b860b815b0c6.scope\": RecentStats: unable to find data in memory cache]" Mar 18 08:32:03 crc kubenswrapper[4917]: I0318 08:32:03.789791 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7150ade9-aa2b-4bed-89b7-eee73cc12aec" path="/var/lib/kubelet/pods/7150ade9-aa2b-4bed-89b7-eee73cc12aec/volumes" Mar 18 08:32:04 crc kubenswrapper[4917]: I0318 08:32:04.772305 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563712-zmcdj" Mar 18 08:32:04 crc kubenswrapper[4917]: I0318 08:32:04.818947 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdqm4\" (UniqueName: \"kubernetes.io/projected/aa089f54-03e3-45d1-9b06-f703a5c95062-kube-api-access-hdqm4\") pod \"aa089f54-03e3-45d1-9b06-f703a5c95062\" (UID: \"aa089f54-03e3-45d1-9b06-f703a5c95062\") " Mar 18 08:32:04 crc kubenswrapper[4917]: I0318 08:32:04.836027 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa089f54-03e3-45d1-9b06-f703a5c95062-kube-api-access-hdqm4" (OuterVolumeSpecName: "kube-api-access-hdqm4") pod "aa089f54-03e3-45d1-9b06-f703a5c95062" (UID: "aa089f54-03e3-45d1-9b06-f703a5c95062"). InnerVolumeSpecName "kube-api-access-hdqm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:32:04 crc kubenswrapper[4917]: I0318 08:32:04.922028 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdqm4\" (UniqueName: \"kubernetes.io/projected/aa089f54-03e3-45d1-9b06-f703a5c95062-kube-api-access-hdqm4\") on node \"crc\" DevicePath \"\"" Mar 18 08:32:05 crc kubenswrapper[4917]: I0318 08:32:05.350356 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563712-zmcdj" event={"ID":"aa089f54-03e3-45d1-9b06-f703a5c95062","Type":"ContainerDied","Data":"bfa57b31697129107ec4b9a9eca0c933c5ce03ff7c6add71f320cbb4884c456d"} Mar 18 08:32:05 crc kubenswrapper[4917]: I0318 08:32:05.350425 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa57b31697129107ec4b9a9eca0c933c5ce03ff7c6add71f320cbb4884c456d" Mar 18 08:32:05 crc kubenswrapper[4917]: I0318 08:32:05.350497 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563712-zmcdj" Mar 18 08:32:05 crc kubenswrapper[4917]: I0318 08:32:05.855273 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563706-pg6sc"] Mar 18 08:32:05 crc kubenswrapper[4917]: I0318 08:32:05.868656 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563706-pg6sc"] Mar 18 08:32:07 crc kubenswrapper[4917]: I0318 08:32:07.785419 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c0b0cea-1278-459b-917b-63e0cbd85682" path="/var/lib/kubelet/pods/6c0b0cea-1278-459b-917b-63e0cbd85682/volumes" Mar 18 08:32:12 crc kubenswrapper[4917]: I0318 08:32:12.756214 4917 scope.go:117] "RemoveContainer" containerID="d297982df4c019212acf7285b5d98c16566978de2a7f101dc7f43fcbd6e698f9" Mar 18 08:32:12 crc kubenswrapper[4917]: I0318 08:32:12.774206 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:32:12 crc kubenswrapper[4917]: E0318 08:32:12.775344 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:32:12 crc kubenswrapper[4917]: I0318 08:32:12.833527 4917 scope.go:117] "RemoveContainer" containerID="bae5b3dfbf128b43befe51d9cd05ba1f2ba369e66724cc6a761a17534c62a93f" Mar 18 08:32:12 crc kubenswrapper[4917]: I0318 08:32:12.875226 4917 scope.go:117] "RemoveContainer" containerID="b02c5707f7015d86527767982899b91f560279436ab036838438cb5612a49bab" Mar 18 08:32:12 crc kubenswrapper[4917]: I0318 08:32:12.931066 4917 scope.go:117] "RemoveContainer" 
containerID="aa4c8355032f08349a9fb5c70ed5784082d5e7beae1e22d888e720ec008cd596" Mar 18 08:32:24 crc kubenswrapper[4917]: I0318 08:32:24.773611 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:32:24 crc kubenswrapper[4917]: E0318 08:32:24.774553 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:32:36 crc kubenswrapper[4917]: I0318 08:32:36.773247 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:32:36 crc kubenswrapper[4917]: E0318 08:32:36.774185 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:32:49 crc kubenswrapper[4917]: I0318 08:32:49.773905 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:32:49 crc kubenswrapper[4917]: E0318 08:32:49.775075 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:33:02 crc kubenswrapper[4917]: I0318 08:33:02.773938 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:33:02 crc kubenswrapper[4917]: E0318 08:33:02.774954 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.681629 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xvqv8"] Mar 18 08:33:08 crc kubenswrapper[4917]: E0318 08:33:08.682783 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa089f54-03e3-45d1-9b06-f703a5c95062" containerName="oc" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.682802 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa089f54-03e3-45d1-9b06-f703a5c95062" containerName="oc" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.683089 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa089f54-03e3-45d1-9b06-f703a5c95062" containerName="oc" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.685207 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.695835 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvqv8"] Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.806285 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-utilities\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.806494 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs44j\" (UniqueName: \"kubernetes.io/projected/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-kube-api-access-qs44j\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.806752 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-catalog-content\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.908777 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-utilities\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.908876 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qs44j\" (UniqueName: \"kubernetes.io/projected/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-kube-api-access-qs44j\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.908975 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-catalog-content\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.910190 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-utilities\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.910487 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-catalog-content\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:08 crc kubenswrapper[4917]: I0318 08:33:08.944983 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs44j\" (UniqueName: \"kubernetes.io/projected/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-kube-api-access-qs44j\") pod \"certified-operators-xvqv8\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:09 crc kubenswrapper[4917]: I0318 08:33:09.015712 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:09 crc kubenswrapper[4917]: I0318 08:33:09.539717 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvqv8"] Mar 18 08:33:10 crc kubenswrapper[4917]: I0318 08:33:10.100133 4917 generic.go:334] "Generic (PLEG): container finished" podID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerID="cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406" exitCode=0 Mar 18 08:33:10 crc kubenswrapper[4917]: I0318 08:33:10.100190 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvqv8" event={"ID":"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a","Type":"ContainerDied","Data":"cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406"} Mar 18 08:33:10 crc kubenswrapper[4917]: I0318 08:33:10.100224 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvqv8" event={"ID":"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a","Type":"ContainerStarted","Data":"b70e0244ba2beb4dd6bafc57ec17b520d3074e447d6f702b9ff78c092550c132"} Mar 18 08:33:12 crc kubenswrapper[4917]: I0318 08:33:12.126648 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvqv8" event={"ID":"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a","Type":"ContainerStarted","Data":"2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054"} Mar 18 08:33:13 crc kubenswrapper[4917]: I0318 08:33:13.139047 4917 generic.go:334] "Generic (PLEG): container finished" podID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerID="2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054" exitCode=0 Mar 18 08:33:13 crc kubenswrapper[4917]: I0318 08:33:13.139172 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvqv8" 
event={"ID":"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a","Type":"ContainerDied","Data":"2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054"} Mar 18 08:33:14 crc kubenswrapper[4917]: I0318 08:33:14.152438 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvqv8" event={"ID":"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a","Type":"ContainerStarted","Data":"95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29"} Mar 18 08:33:14 crc kubenswrapper[4917]: I0318 08:33:14.175695 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xvqv8" podStartSLOduration=2.421562461 podStartE2EDuration="6.175677427s" podCreationTimestamp="2026-03-18 08:33:08 +0000 UTC" firstStartedPulling="2026-03-18 08:33:10.10361339 +0000 UTC m=+6375.044768104" lastFinishedPulling="2026-03-18 08:33:13.857728326 +0000 UTC m=+6378.798883070" observedRunningTime="2026-03-18 08:33:14.167409246 +0000 UTC m=+6379.108563980" watchObservedRunningTime="2026-03-18 08:33:14.175677427 +0000 UTC m=+6379.116832151" Mar 18 08:33:14 crc kubenswrapper[4917]: I0318 08:33:14.772770 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:33:14 crc kubenswrapper[4917]: E0318 08:33:14.773471 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:33:19 crc kubenswrapper[4917]: I0318 08:33:19.017413 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:19 crc 
kubenswrapper[4917]: I0318 08:33:19.017991 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:19 crc kubenswrapper[4917]: I0318 08:33:19.076041 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:19 crc kubenswrapper[4917]: I0318 08:33:19.274261 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:19 crc kubenswrapper[4917]: I0318 08:33:19.465123 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvqv8"] Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.233051 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xvqv8" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerName="registry-server" containerID="cri-o://95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29" gracePeriod=2 Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.814189 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.914113 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs44j\" (UniqueName: \"kubernetes.io/projected/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-kube-api-access-qs44j\") pod \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.914493 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-utilities\") pod \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.914688 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-catalog-content\") pod \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\" (UID: \"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a\") " Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.915141 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-utilities" (OuterVolumeSpecName: "utilities") pod "a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" (UID: "a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.915643 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.920701 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-kube-api-access-qs44j" (OuterVolumeSpecName: "kube-api-access-qs44j") pod "a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" (UID: "a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a"). InnerVolumeSpecName "kube-api-access-qs44j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:33:21 crc kubenswrapper[4917]: I0318 08:33:21.976179 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" (UID: "a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.018252 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.018307 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs44j\" (UniqueName: \"kubernetes.io/projected/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a-kube-api-access-qs44j\") on node \"crc\" DevicePath \"\"" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.245884 4917 generic.go:334] "Generic (PLEG): container finished" podID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerID="95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29" exitCode=0 Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.245933 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvqv8" event={"ID":"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a","Type":"ContainerDied","Data":"95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29"} Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.245971 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvqv8" event={"ID":"a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a","Type":"ContainerDied","Data":"b70e0244ba2beb4dd6bafc57ec17b520d3074e447d6f702b9ff78c092550c132"} Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.245994 4917 scope.go:117] "RemoveContainer" containerID="95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.245987 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xvqv8" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.296573 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvqv8"] Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.302825 4917 scope.go:117] "RemoveContainer" containerID="2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.306449 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xvqv8"] Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.338543 4917 scope.go:117] "RemoveContainer" containerID="cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.392733 4917 scope.go:117] "RemoveContainer" containerID="95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29" Mar 18 08:33:22 crc kubenswrapper[4917]: E0318 08:33:22.395221 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29\": container with ID starting with 95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29 not found: ID does not exist" containerID="95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.395356 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29"} err="failed to get container status \"95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29\": rpc error: code = NotFound desc = could not find container \"95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29\": container with ID starting with 95c542950a07f3bf047bcb1bc65394ae272d15d729f401b0fe890772c693fb29 not 
found: ID does not exist" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.395469 4917 scope.go:117] "RemoveContainer" containerID="2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054" Mar 18 08:33:22 crc kubenswrapper[4917]: E0318 08:33:22.395846 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054\": container with ID starting with 2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054 not found: ID does not exist" containerID="2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.395948 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054"} err="failed to get container status \"2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054\": rpc error: code = NotFound desc = could not find container \"2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054\": container with ID starting with 2bfc75cb14255e5225ac99b041f6ff25ff568e6bd576f89b49efaca9b6113054 not found: ID does not exist" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.395981 4917 scope.go:117] "RemoveContainer" containerID="cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406" Mar 18 08:33:22 crc kubenswrapper[4917]: E0318 08:33:22.396377 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406\": container with ID starting with cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406 not found: ID does not exist" containerID="cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406" Mar 18 08:33:22 crc kubenswrapper[4917]: I0318 08:33:22.396433 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406"} err="failed to get container status \"cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406\": rpc error: code = NotFound desc = could not find container \"cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406\": container with ID starting with cf3a763ac471b71505e841f75dc4447f457a958d6446c0828d60eb9c092bf406 not found: ID does not exist" Mar 18 08:33:23 crc kubenswrapper[4917]: I0318 08:33:23.793809 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" path="/var/lib/kubelet/pods/a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a/volumes" Mar 18 08:33:25 crc kubenswrapper[4917]: I0318 08:33:25.783125 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:33:25 crc kubenswrapper[4917]: E0318 08:33:25.783880 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:33:39 crc kubenswrapper[4917]: I0318 08:33:39.775016 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:33:40 crc kubenswrapper[4917]: I0318 08:33:40.467739 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"845a384dbbd152118e1d1f211f7ebf1b3f540d7bed6649dd90db902b2bd0a837"} Mar 18 08:34:00 crc 
kubenswrapper[4917]: I0318 08:34:00.168410 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563714-8nt7f"] Mar 18 08:34:00 crc kubenswrapper[4917]: E0318 08:34:00.169511 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerName="extract-content" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.169528 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerName="extract-content" Mar 18 08:34:00 crc kubenswrapper[4917]: E0318 08:34:00.169565 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerName="extract-utilities" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.169574 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerName="extract-utilities" Mar 18 08:34:00 crc kubenswrapper[4917]: E0318 08:34:00.169685 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerName="registry-server" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.169695 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerName="registry-server" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.169944 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5cba5f6-5109-4c6e-b5b4-76aa8224ad0a" containerName="registry-server" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.170864 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.173673 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.176014 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.179276 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563714-8nt7f"] Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.179855 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.298456 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/89f42d14-be42-4a2b-945d-d768f6232280-kube-api-access-7ds2n\") pod \"auto-csr-approver-29563714-8nt7f\" (UID: \"89f42d14-be42-4a2b-945d-d768f6232280\") " pod="openshift-infra/auto-csr-approver-29563714-8nt7f" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.400813 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/89f42d14-be42-4a2b-945d-d768f6232280-kube-api-access-7ds2n\") pod \"auto-csr-approver-29563714-8nt7f\" (UID: \"89f42d14-be42-4a2b-945d-d768f6232280\") " pod="openshift-infra/auto-csr-approver-29563714-8nt7f" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.421777 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/89f42d14-be42-4a2b-945d-d768f6232280-kube-api-access-7ds2n\") pod \"auto-csr-approver-29563714-8nt7f\" (UID: \"89f42d14-be42-4a2b-945d-d768f6232280\") " 
pod="openshift-infra/auto-csr-approver-29563714-8nt7f" Mar 18 08:34:00 crc kubenswrapper[4917]: I0318 08:34:00.490520 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" Mar 18 08:34:01 crc kubenswrapper[4917]: I0318 08:34:01.106086 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563714-8nt7f"] Mar 18 08:34:01 crc kubenswrapper[4917]: I0318 08:34:01.689189 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" event={"ID":"89f42d14-be42-4a2b-945d-d768f6232280","Type":"ContainerStarted","Data":"00e0c27f383fac6cfe4d37a3ded2961cab962ff988377fe048bec5bba7b4c97b"} Mar 18 08:34:02 crc kubenswrapper[4917]: I0318 08:34:02.708687 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" event={"ID":"89f42d14-be42-4a2b-945d-d768f6232280","Type":"ContainerStarted","Data":"72da59dd220a82ba05a8c9953a9195c199ca4e26cb47d5940265beff3651b234"} Mar 18 08:34:02 crc kubenswrapper[4917]: I0318 08:34:02.742288 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" podStartSLOduration=1.508363994 podStartE2EDuration="2.742265392s" podCreationTimestamp="2026-03-18 08:34:00 +0000 UTC" firstStartedPulling="2026-03-18 08:34:01.108305592 +0000 UTC m=+6426.049460306" lastFinishedPulling="2026-03-18 08:34:02.34220699 +0000 UTC m=+6427.283361704" observedRunningTime="2026-03-18 08:34:02.728348124 +0000 UTC m=+6427.669502878" watchObservedRunningTime="2026-03-18 08:34:02.742265392 +0000 UTC m=+6427.683420136" Mar 18 08:34:03 crc kubenswrapper[4917]: I0318 08:34:03.724869 4917 generic.go:334] "Generic (PLEG): container finished" podID="89f42d14-be42-4a2b-945d-d768f6232280" containerID="72da59dd220a82ba05a8c9953a9195c199ca4e26cb47d5940265beff3651b234" exitCode=0 Mar 18 08:34:03 crc 
kubenswrapper[4917]: I0318 08:34:03.724944 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" event={"ID":"89f42d14-be42-4a2b-945d-d768f6232280","Type":"ContainerDied","Data":"72da59dd220a82ba05a8c9953a9195c199ca4e26cb47d5940265beff3651b234"} Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.158349 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.337451 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/89f42d14-be42-4a2b-945d-d768f6232280-kube-api-access-7ds2n\") pod \"89f42d14-be42-4a2b-945d-d768f6232280\" (UID: \"89f42d14-be42-4a2b-945d-d768f6232280\") " Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.343558 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f42d14-be42-4a2b-945d-d768f6232280-kube-api-access-7ds2n" (OuterVolumeSpecName: "kube-api-access-7ds2n") pod "89f42d14-be42-4a2b-945d-d768f6232280" (UID: "89f42d14-be42-4a2b-945d-d768f6232280"). InnerVolumeSpecName "kube-api-access-7ds2n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.440472 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ds2n\" (UniqueName: \"kubernetes.io/projected/89f42d14-be42-4a2b-945d-d768f6232280-kube-api-access-7ds2n\") on node \"crc\" DevicePath \"\"" Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.767500 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" event={"ID":"89f42d14-be42-4a2b-945d-d768f6232280","Type":"ContainerDied","Data":"00e0c27f383fac6cfe4d37a3ded2961cab962ff988377fe048bec5bba7b4c97b"} Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.767569 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00e0c27f383fac6cfe4d37a3ded2961cab962ff988377fe048bec5bba7b4c97b" Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.767701 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563714-8nt7f" Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.830174 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563708-l55kp"] Mar 18 08:34:05 crc kubenswrapper[4917]: I0318 08:34:05.838177 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563708-l55kp"] Mar 18 08:34:07 crc kubenswrapper[4917]: I0318 08:34:07.812877 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bcfe64-82d3-4024-938b-9ce3dbff7968" path="/var/lib/kubelet/pods/d8bcfe64-82d3-4024-938b-9ce3dbff7968/volumes" Mar 18 08:34:13 crc kubenswrapper[4917]: I0318 08:34:13.102475 4917 scope.go:117] "RemoveContainer" containerID="160ba14f53215355bbdfc69af8fb581cec01e69cf3f4365356f7a919ed91deb3" Mar 18 08:34:41 crc kubenswrapper[4917]: I0318 08:34:41.063120 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/heat-db-create-4jrlf"] Mar 18 08:34:41 crc kubenswrapper[4917]: I0318 08:34:41.073728 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-4jrlf"] Mar 18 08:34:41 crc kubenswrapper[4917]: I0318 08:34:41.083896 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-9c12-account-create-update-2dh7r"] Mar 18 08:34:41 crc kubenswrapper[4917]: I0318 08:34:41.097125 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-9c12-account-create-update-2dh7r"] Mar 18 08:34:41 crc kubenswrapper[4917]: I0318 08:34:41.791547 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305453bf-9adc-46d4-b810-fcd1e8f67a77" path="/var/lib/kubelet/pods/305453bf-9adc-46d4-b810-fcd1e8f67a77/volumes" Mar 18 08:34:41 crc kubenswrapper[4917]: I0318 08:34:41.792798 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f16e6bc3-1886-4de2-83ca-aae1c7f3d44f" path="/var/lib/kubelet/pods/f16e6bc3-1886-4de2-83ca-aae1c7f3d44f/volumes" Mar 18 08:34:55 crc kubenswrapper[4917]: I0318 08:34:55.065349 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-n46nm"] Mar 18 08:34:55 crc kubenswrapper[4917]: I0318 08:34:55.076121 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-n46nm"] Mar 18 08:34:55 crc kubenswrapper[4917]: I0318 08:34:55.807225 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb781c6a-6adb-484c-aba7-f4f894e8f812" path="/var/lib/kubelet/pods/cb781c6a-6adb-484c-aba7-f4f894e8f812/volumes" Mar 18 08:35:13 crc kubenswrapper[4917]: I0318 08:35:13.195117 4917 scope.go:117] "RemoveContainer" containerID="d9e3c6db6cc71c1006b65a7bbb203387622f774bbce48f15cd4e49c3be902f96" Mar 18 08:35:13 crc kubenswrapper[4917]: I0318 08:35:13.241227 4917 scope.go:117] "RemoveContainer" containerID="c08f7ae69d431cb796ae6e5cf42f147051f26eb79004457c6e60321f1ea36c15" Mar 18 08:35:13 crc kubenswrapper[4917]: 
I0318 08:35:13.306270 4917 scope.go:117] "RemoveContainer" containerID="f135ab1cfb701faad5d5609ab0e2fe07366d31cc88055ffbfbbfefcb09efa9d2" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.146497 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563716-djmfw"] Mar 18 08:36:00 crc kubenswrapper[4917]: E0318 08:36:00.149362 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f42d14-be42-4a2b-945d-d768f6232280" containerName="oc" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.149388 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f42d14-be42-4a2b-945d-d768f6232280" containerName="oc" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.149691 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f42d14-be42-4a2b-945d-d768f6232280" containerName="oc" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.150706 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563716-djmfw" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.153116 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.153706 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.154278 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.174594 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563716-djmfw"] Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.320214 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh8dv\" (UniqueName: 
\"kubernetes.io/projected/fce264c0-97a8-4ff0-8c7d-01658b2298c2-kube-api-access-lh8dv\") pod \"auto-csr-approver-29563716-djmfw\" (UID: \"fce264c0-97a8-4ff0-8c7d-01658b2298c2\") " pod="openshift-infra/auto-csr-approver-29563716-djmfw" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.422646 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh8dv\" (UniqueName: \"kubernetes.io/projected/fce264c0-97a8-4ff0-8c7d-01658b2298c2-kube-api-access-lh8dv\") pod \"auto-csr-approver-29563716-djmfw\" (UID: \"fce264c0-97a8-4ff0-8c7d-01658b2298c2\") " pod="openshift-infra/auto-csr-approver-29563716-djmfw" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.451733 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh8dv\" (UniqueName: \"kubernetes.io/projected/fce264c0-97a8-4ff0-8c7d-01658b2298c2-kube-api-access-lh8dv\") pod \"auto-csr-approver-29563716-djmfw\" (UID: \"fce264c0-97a8-4ff0-8c7d-01658b2298c2\") " pod="openshift-infra/auto-csr-approver-29563716-djmfw" Mar 18 08:36:00 crc kubenswrapper[4917]: I0318 08:36:00.471096 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563716-djmfw" Mar 18 08:36:01 crc kubenswrapper[4917]: I0318 08:36:01.009009 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563716-djmfw"] Mar 18 08:36:01 crc kubenswrapper[4917]: I0318 08:36:01.069897 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563716-djmfw" event={"ID":"fce264c0-97a8-4ff0-8c7d-01658b2298c2","Type":"ContainerStarted","Data":"b5a100eff2e064ee0e2e49f09fc813aa29000ed84602a68c23288b338b7906fc"} Mar 18 08:36:02 crc kubenswrapper[4917]: I0318 08:36:02.929342 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:36:02 crc kubenswrapper[4917]: I0318 08:36:02.930006 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:36:03 crc kubenswrapper[4917]: I0318 08:36:03.093149 4917 generic.go:334] "Generic (PLEG): container finished" podID="fce264c0-97a8-4ff0-8c7d-01658b2298c2" containerID="1eec3c20ed5c19ca160c43ef38e8e84942a628030081b12e5f73816bfbacd2e1" exitCode=0 Mar 18 08:36:03 crc kubenswrapper[4917]: I0318 08:36:03.093237 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563716-djmfw" event={"ID":"fce264c0-97a8-4ff0-8c7d-01658b2298c2","Type":"ContainerDied","Data":"1eec3c20ed5c19ca160c43ef38e8e84942a628030081b12e5f73816bfbacd2e1"} Mar 18 08:36:04 crc kubenswrapper[4917]: I0318 08:36:04.525368 4917 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563716-djmfw" Mar 18 08:36:04 crc kubenswrapper[4917]: I0318 08:36:04.616098 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh8dv\" (UniqueName: \"kubernetes.io/projected/fce264c0-97a8-4ff0-8c7d-01658b2298c2-kube-api-access-lh8dv\") pod \"fce264c0-97a8-4ff0-8c7d-01658b2298c2\" (UID: \"fce264c0-97a8-4ff0-8c7d-01658b2298c2\") " Mar 18 08:36:04 crc kubenswrapper[4917]: I0318 08:36:04.627126 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fce264c0-97a8-4ff0-8c7d-01658b2298c2-kube-api-access-lh8dv" (OuterVolumeSpecName: "kube-api-access-lh8dv") pod "fce264c0-97a8-4ff0-8c7d-01658b2298c2" (UID: "fce264c0-97a8-4ff0-8c7d-01658b2298c2"). InnerVolumeSpecName "kube-api-access-lh8dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:36:04 crc kubenswrapper[4917]: I0318 08:36:04.720015 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh8dv\" (UniqueName: \"kubernetes.io/projected/fce264c0-97a8-4ff0-8c7d-01658b2298c2-kube-api-access-lh8dv\") on node \"crc\" DevicePath \"\"" Mar 18 08:36:05 crc kubenswrapper[4917]: I0318 08:36:05.121444 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563716-djmfw" event={"ID":"fce264c0-97a8-4ff0-8c7d-01658b2298c2","Type":"ContainerDied","Data":"b5a100eff2e064ee0e2e49f09fc813aa29000ed84602a68c23288b338b7906fc"} Mar 18 08:36:05 crc kubenswrapper[4917]: I0318 08:36:05.121499 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563716-djmfw" Mar 18 08:36:05 crc kubenswrapper[4917]: I0318 08:36:05.121512 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5a100eff2e064ee0e2e49f09fc813aa29000ed84602a68c23288b338b7906fc" Mar 18 08:36:05 crc kubenswrapper[4917]: I0318 08:36:05.620163 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563710-vtsvw"] Mar 18 08:36:05 crc kubenswrapper[4917]: I0318 08:36:05.635265 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563710-vtsvw"] Mar 18 08:36:05 crc kubenswrapper[4917]: I0318 08:36:05.792192 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba68bd60-c600-4887-a9fd-ba58eea4b3f0" path="/var/lib/kubelet/pods/ba68bd60-c600-4887-a9fd-ba58eea4b3f0/volumes" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.746265 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-smwq6"] Mar 18 08:36:06 crc kubenswrapper[4917]: E0318 08:36:06.747089 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fce264c0-97a8-4ff0-8c7d-01658b2298c2" containerName="oc" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.747105 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="fce264c0-97a8-4ff0-8c7d-01658b2298c2" containerName="oc" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.747390 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="fce264c0-97a8-4ff0-8c7d-01658b2298c2" containerName="oc" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.749282 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.765913 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smwq6"] Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.880515 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-catalog-content\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.880560 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-utilities\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.880606 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rxf6\" (UniqueName: \"kubernetes.io/projected/7e0f4f17-e240-4f50-bd22-30ed034907a6-kube-api-access-7rxf6\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.982338 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-catalog-content\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.982396 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-utilities\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.982423 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rxf6\" (UniqueName: \"kubernetes.io/projected/7e0f4f17-e240-4f50-bd22-30ed034907a6-kube-api-access-7rxf6\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.986733 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-utilities\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:06 crc kubenswrapper[4917]: I0318 08:36:06.986958 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-catalog-content\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:07 crc kubenswrapper[4917]: I0318 08:36:07.001776 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rxf6\" (UniqueName: \"kubernetes.io/projected/7e0f4f17-e240-4f50-bd22-30ed034907a6-kube-api-access-7rxf6\") pod \"community-operators-smwq6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:07 crc kubenswrapper[4917]: I0318 08:36:07.115954 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:07 crc kubenswrapper[4917]: I0318 08:36:07.592724 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-smwq6"] Mar 18 08:36:08 crc kubenswrapper[4917]: I0318 08:36:08.162217 4917 generic.go:334] "Generic (PLEG): container finished" podID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerID="ff71ec2202ffe518ee2517a7cd2c8c03696791c37767ae1cb0c0f41b5b662fb1" exitCode=0 Mar 18 08:36:08 crc kubenswrapper[4917]: I0318 08:36:08.162284 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smwq6" event={"ID":"7e0f4f17-e240-4f50-bd22-30ed034907a6","Type":"ContainerDied","Data":"ff71ec2202ffe518ee2517a7cd2c8c03696791c37767ae1cb0c0f41b5b662fb1"} Mar 18 08:36:08 crc kubenswrapper[4917]: I0318 08:36:08.162706 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smwq6" event={"ID":"7e0f4f17-e240-4f50-bd22-30ed034907a6","Type":"ContainerStarted","Data":"66d83662532f2d22f0e3b77cf2090f17e9d9e1794cc151a881d99a5019b587a3"} Mar 18 08:36:09 crc kubenswrapper[4917]: I0318 08:36:09.177877 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smwq6" event={"ID":"7e0f4f17-e240-4f50-bd22-30ed034907a6","Type":"ContainerStarted","Data":"4be9ac2bb9b366acec13e775f485026ca61c3d350af77761b1316be69bb8c814"} Mar 18 08:36:11 crc kubenswrapper[4917]: I0318 08:36:11.201750 4917 generic.go:334] "Generic (PLEG): container finished" podID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerID="4be9ac2bb9b366acec13e775f485026ca61c3d350af77761b1316be69bb8c814" exitCode=0 Mar 18 08:36:11 crc kubenswrapper[4917]: I0318 08:36:11.202303 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smwq6" 
event={"ID":"7e0f4f17-e240-4f50-bd22-30ed034907a6","Type":"ContainerDied","Data":"4be9ac2bb9b366acec13e775f485026ca61c3d350af77761b1316be69bb8c814"} Mar 18 08:36:12 crc kubenswrapper[4917]: I0318 08:36:12.218706 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smwq6" event={"ID":"7e0f4f17-e240-4f50-bd22-30ed034907a6","Type":"ContainerStarted","Data":"825710ed77ac632dfac349b8caed9f06a8e97e3d20cd503cae750d67ec0e90d9"} Mar 18 08:36:12 crc kubenswrapper[4917]: I0318 08:36:12.262490 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-smwq6" podStartSLOduration=2.782175392 podStartE2EDuration="6.262445366s" podCreationTimestamp="2026-03-18 08:36:06 +0000 UTC" firstStartedPulling="2026-03-18 08:36:08.1659025 +0000 UTC m=+6553.107057214" lastFinishedPulling="2026-03-18 08:36:11.646172464 +0000 UTC m=+6556.587327188" observedRunningTime="2026-03-18 08:36:12.244833179 +0000 UTC m=+6557.185987973" watchObservedRunningTime="2026-03-18 08:36:12.262445366 +0000 UTC m=+6557.203600090" Mar 18 08:36:13 crc kubenswrapper[4917]: I0318 08:36:13.446731 4917 scope.go:117] "RemoveContainer" containerID="ed8a3cc7d33c0f6d756c193594c6355a91f476282cb6db875f7cf119466eebba" Mar 18 08:36:17 crc kubenswrapper[4917]: I0318 08:36:17.116692 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:17 crc kubenswrapper[4917]: I0318 08:36:17.117771 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:17 crc kubenswrapper[4917]: I0318 08:36:17.182182 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:17 crc kubenswrapper[4917]: I0318 08:36:17.346277 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:17 crc kubenswrapper[4917]: I0318 08:36:17.416053 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smwq6"] Mar 18 08:36:19 crc kubenswrapper[4917]: I0318 08:36:19.302424 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-smwq6" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerName="registry-server" containerID="cri-o://825710ed77ac632dfac349b8caed9f06a8e97e3d20cd503cae750d67ec0e90d9" gracePeriod=2 Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.315305 4917 generic.go:334] "Generic (PLEG): container finished" podID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerID="825710ed77ac632dfac349b8caed9f06a8e97e3d20cd503cae750d67ec0e90d9" exitCode=0 Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.315377 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smwq6" event={"ID":"7e0f4f17-e240-4f50-bd22-30ed034907a6","Type":"ContainerDied","Data":"825710ed77ac632dfac349b8caed9f06a8e97e3d20cd503cae750d67ec0e90d9"} Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.315704 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-smwq6" event={"ID":"7e0f4f17-e240-4f50-bd22-30ed034907a6","Type":"ContainerDied","Data":"66d83662532f2d22f0e3b77cf2090f17e9d9e1794cc151a881d99a5019b587a3"} Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.315725 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d83662532f2d22f0e3b77cf2090f17e9d9e1794cc151a881d99a5019b587a3" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.356088 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.415987 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rxf6\" (UniqueName: \"kubernetes.io/projected/7e0f4f17-e240-4f50-bd22-30ed034907a6-kube-api-access-7rxf6\") pod \"7e0f4f17-e240-4f50-bd22-30ed034907a6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.416292 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-catalog-content\") pod \"7e0f4f17-e240-4f50-bd22-30ed034907a6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.416395 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-utilities\") pod \"7e0f4f17-e240-4f50-bd22-30ed034907a6\" (UID: \"7e0f4f17-e240-4f50-bd22-30ed034907a6\") " Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.420506 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-utilities" (OuterVolumeSpecName: "utilities") pod "7e0f4f17-e240-4f50-bd22-30ed034907a6" (UID: "7e0f4f17-e240-4f50-bd22-30ed034907a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.433063 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0f4f17-e240-4f50-bd22-30ed034907a6-kube-api-access-7rxf6" (OuterVolumeSpecName: "kube-api-access-7rxf6") pod "7e0f4f17-e240-4f50-bd22-30ed034907a6" (UID: "7e0f4f17-e240-4f50-bd22-30ed034907a6"). InnerVolumeSpecName "kube-api-access-7rxf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.442446 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j6rqk"] Mar 18 08:36:20 crc kubenswrapper[4917]: E0318 08:36:20.442927 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerName="registry-server" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.442951 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerName="registry-server" Mar 18 08:36:20 crc kubenswrapper[4917]: E0318 08:36:20.442984 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerName="extract-utilities" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.442995 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerName="extract-utilities" Mar 18 08:36:20 crc kubenswrapper[4917]: E0318 08:36:20.443014 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerName="extract-content" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.443026 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerName="extract-content" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.443304 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" containerName="registry-server" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.445271 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.466723 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6rqk"] Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.504127 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e0f4f17-e240-4f50-bd22-30ed034907a6" (UID: "7e0f4f17-e240-4f50-bd22-30ed034907a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.520782 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnsqs\" (UniqueName: \"kubernetes.io/projected/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-kube-api-access-qnsqs\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.520914 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-catalog-content\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.520984 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-utilities\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.521172 
4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rxf6\" (UniqueName: \"kubernetes.io/projected/7e0f4f17-e240-4f50-bd22-30ed034907a6-kube-api-access-7rxf6\") on node \"crc\" DevicePath \"\"" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.521194 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.521207 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e0f4f17-e240-4f50-bd22-30ed034907a6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.623183 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnsqs\" (UniqueName: \"kubernetes.io/projected/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-kube-api-access-qnsqs\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.623260 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-catalog-content\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.623309 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-utilities\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.623853 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-catalog-content\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.623877 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-utilities\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.639433 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnsqs\" (UniqueName: \"kubernetes.io/projected/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-kube-api-access-qnsqs\") pod \"redhat-marketplace-j6rqk\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:20 crc kubenswrapper[4917]: I0318 08:36:20.813787 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:21 crc kubenswrapper[4917]: I0318 08:36:21.259255 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6rqk"] Mar 18 08:36:21 crc kubenswrapper[4917]: W0318 08:36:21.262291 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9c4e97c_8c87_4213_9d0d_fe25eb31093c.slice/crio-3660223172fbfee102b1c41edbce59189772729b9b52faf3e86fd4a3dd76cf60 WatchSource:0}: Error finding container 3660223172fbfee102b1c41edbce59189772729b9b52faf3e86fd4a3dd76cf60: Status 404 returned error can't find the container with id 3660223172fbfee102b1c41edbce59189772729b9b52faf3e86fd4a3dd76cf60 Mar 18 08:36:21 crc kubenswrapper[4917]: I0318 08:36:21.326157 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6rqk" event={"ID":"b9c4e97c-8c87-4213-9d0d-fe25eb31093c","Type":"ContainerStarted","Data":"3660223172fbfee102b1c41edbce59189772729b9b52faf3e86fd4a3dd76cf60"} Mar 18 08:36:21 crc kubenswrapper[4917]: I0318 08:36:21.326189 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-smwq6" Mar 18 08:36:21 crc kubenswrapper[4917]: I0318 08:36:21.370515 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-smwq6"] Mar 18 08:36:21 crc kubenswrapper[4917]: I0318 08:36:21.377627 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-smwq6"] Mar 18 08:36:21 crc kubenswrapper[4917]: I0318 08:36:21.793458 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0f4f17-e240-4f50-bd22-30ed034907a6" path="/var/lib/kubelet/pods/7e0f4f17-e240-4f50-bd22-30ed034907a6/volumes" Mar 18 08:36:22 crc kubenswrapper[4917]: I0318 08:36:22.343399 4917 generic.go:334] "Generic (PLEG): container finished" podID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerID="4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6" exitCode=0 Mar 18 08:36:22 crc kubenswrapper[4917]: I0318 08:36:22.343604 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6rqk" event={"ID":"b9c4e97c-8c87-4213-9d0d-fe25eb31093c","Type":"ContainerDied","Data":"4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6"} Mar 18 08:36:23 crc kubenswrapper[4917]: I0318 08:36:23.361323 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6rqk" event={"ID":"b9c4e97c-8c87-4213-9d0d-fe25eb31093c","Type":"ContainerStarted","Data":"26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832"} Mar 18 08:36:24 crc kubenswrapper[4917]: I0318 08:36:24.374830 4917 generic.go:334] "Generic (PLEG): container finished" podID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerID="26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832" exitCode=0 Mar 18 08:36:24 crc kubenswrapper[4917]: I0318 08:36:24.374932 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-j6rqk" event={"ID":"b9c4e97c-8c87-4213-9d0d-fe25eb31093c","Type":"ContainerDied","Data":"26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832"} Mar 18 08:36:25 crc kubenswrapper[4917]: I0318 08:36:25.388992 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6rqk" event={"ID":"b9c4e97c-8c87-4213-9d0d-fe25eb31093c","Type":"ContainerStarted","Data":"52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6"} Mar 18 08:36:25 crc kubenswrapper[4917]: I0318 08:36:25.418608 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j6rqk" podStartSLOduration=2.765424451 podStartE2EDuration="5.418553684s" podCreationTimestamp="2026-03-18 08:36:20 +0000 UTC" firstStartedPulling="2026-03-18 08:36:22.345740631 +0000 UTC m=+6567.286895355" lastFinishedPulling="2026-03-18 08:36:24.998869854 +0000 UTC m=+6569.940024588" observedRunningTime="2026-03-18 08:36:25.417275503 +0000 UTC m=+6570.358430257" watchObservedRunningTime="2026-03-18 08:36:25.418553684 +0000 UTC m=+6570.359708458" Mar 18 08:36:30 crc kubenswrapper[4917]: I0318 08:36:30.814831 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:30 crc kubenswrapper[4917]: I0318 08:36:30.816746 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:30 crc kubenswrapper[4917]: I0318 08:36:30.865321 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:31 crc kubenswrapper[4917]: I0318 08:36:31.521833 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:31 crc kubenswrapper[4917]: I0318 08:36:31.592277 4917 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6rqk"] Mar 18 08:36:32 crc kubenswrapper[4917]: I0318 08:36:32.929855 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:36:32 crc kubenswrapper[4917]: I0318 08:36:32.929944 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:36:33 crc kubenswrapper[4917]: I0318 08:36:33.488955 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j6rqk" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerName="registry-server" containerID="cri-o://52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6" gracePeriod=2 Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.000889 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.134900 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-utilities\") pod \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.135041 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnsqs\" (UniqueName: \"kubernetes.io/projected/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-kube-api-access-qnsqs\") pod \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.135124 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-catalog-content\") pod \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\" (UID: \"b9c4e97c-8c87-4213-9d0d-fe25eb31093c\") " Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.135905 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-utilities" (OuterVolumeSpecName: "utilities") pod "b9c4e97c-8c87-4213-9d0d-fe25eb31093c" (UID: "b9c4e97c-8c87-4213-9d0d-fe25eb31093c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.150809 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-kube-api-access-qnsqs" (OuterVolumeSpecName: "kube-api-access-qnsqs") pod "b9c4e97c-8c87-4213-9d0d-fe25eb31093c" (UID: "b9c4e97c-8c87-4213-9d0d-fe25eb31093c"). InnerVolumeSpecName "kube-api-access-qnsqs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.178359 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9c4e97c-8c87-4213-9d0d-fe25eb31093c" (UID: "b9c4e97c-8c87-4213-9d0d-fe25eb31093c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.237841 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnsqs\" (UniqueName: \"kubernetes.io/projected/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-kube-api-access-qnsqs\") on node \"crc\" DevicePath \"\"" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.237875 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.237884 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9c4e97c-8c87-4213-9d0d-fe25eb31093c-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.498767 4917 generic.go:334] "Generic (PLEG): container finished" podID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerID="52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6" exitCode=0 Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.498811 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j6rqk" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.498840 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6rqk" event={"ID":"b9c4e97c-8c87-4213-9d0d-fe25eb31093c","Type":"ContainerDied","Data":"52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6"} Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.498900 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j6rqk" event={"ID":"b9c4e97c-8c87-4213-9d0d-fe25eb31093c","Type":"ContainerDied","Data":"3660223172fbfee102b1c41edbce59189772729b9b52faf3e86fd4a3dd76cf60"} Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.498934 4917 scope.go:117] "RemoveContainer" containerID="52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.546722 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6rqk"] Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.551305 4917 scope.go:117] "RemoveContainer" containerID="26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.560027 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j6rqk"] Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.578885 4917 scope.go:117] "RemoveContainer" containerID="4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.626379 4917 scope.go:117] "RemoveContainer" containerID="52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6" Mar 18 08:36:34 crc kubenswrapper[4917]: E0318 08:36:34.626909 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6\": container with ID starting with 52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6 not found: ID does not exist" containerID="52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.626948 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6"} err="failed to get container status \"52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6\": rpc error: code = NotFound desc = could not find container \"52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6\": container with ID starting with 52d75c6712bf972c3ed5b24e07a16d1fa08a26baaa77ffd33b7585e35e30a6e6 not found: ID does not exist" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.626984 4917 scope.go:117] "RemoveContainer" containerID="26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832" Mar 18 08:36:34 crc kubenswrapper[4917]: E0318 08:36:34.627343 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832\": container with ID starting with 26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832 not found: ID does not exist" containerID="26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.627375 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832"} err="failed to get container status \"26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832\": rpc error: code = NotFound desc = could not find container \"26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832\": container with ID 
starting with 26f1d5de2431b17d43b4c4c425be8f8a01019dd1015915e9b9b627268ccd2832 not found: ID does not exist" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.627393 4917 scope.go:117] "RemoveContainer" containerID="4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6" Mar 18 08:36:34 crc kubenswrapper[4917]: E0318 08:36:34.627666 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6\": container with ID starting with 4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6 not found: ID does not exist" containerID="4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6" Mar 18 08:36:34 crc kubenswrapper[4917]: I0318 08:36:34.627692 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6"} err="failed to get container status \"4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6\": rpc error: code = NotFound desc = could not find container \"4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6\": container with ID starting with 4ead82f859ec574f426b87225ef0c6cb2eb34bb63cbc60435c5d2da0e65700e6 not found: ID does not exist" Mar 18 08:36:35 crc kubenswrapper[4917]: I0318 08:36:35.790489 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" path="/var/lib/kubelet/pods/b9c4e97c-8c87-4213-9d0d-fe25eb31093c/volumes" Mar 18 08:37:02 crc kubenswrapper[4917]: I0318 08:37:02.929161 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:37:02 crc kubenswrapper[4917]: I0318 
08:37:02.929912 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:37:02 crc kubenswrapper[4917]: I0318 08:37:02.929976 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:37:02 crc kubenswrapper[4917]: I0318 08:37:02.931106 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"845a384dbbd152118e1d1f211f7ebf1b3f540d7bed6649dd90db902b2bd0a837"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:37:02 crc kubenswrapper[4917]: I0318 08:37:02.931195 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://845a384dbbd152118e1d1f211f7ebf1b3f540d7bed6649dd90db902b2bd0a837" gracePeriod=600 Mar 18 08:37:03 crc kubenswrapper[4917]: I0318 08:37:03.810230 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="845a384dbbd152118e1d1f211f7ebf1b3f540d7bed6649dd90db902b2bd0a837" exitCode=0 Mar 18 08:37:03 crc kubenswrapper[4917]: I0318 08:37:03.810284 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"845a384dbbd152118e1d1f211f7ebf1b3f540d7bed6649dd90db902b2bd0a837"} Mar 18 08:37:03 crc 
kubenswrapper[4917]: I0318 08:37:03.810987 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860"} Mar 18 08:37:03 crc kubenswrapper[4917]: I0318 08:37:03.811021 4917 scope.go:117] "RemoveContainer" containerID="63fd29f58924f451908accc2457127731f14b00272b16db5fb9faf44cccc9a9b" Mar 18 08:37:25 crc kubenswrapper[4917]: I0318 08:37:25.073052 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-f692-account-create-update-4l7xl"] Mar 18 08:37:25 crc kubenswrapper[4917]: I0318 08:37:25.083025 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-gz4lz"] Mar 18 08:37:25 crc kubenswrapper[4917]: I0318 08:37:25.091946 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-f692-account-create-update-4l7xl"] Mar 18 08:37:25 crc kubenswrapper[4917]: I0318 08:37:25.100250 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-gz4lz"] Mar 18 08:37:25 crc kubenswrapper[4917]: I0318 08:37:25.790678 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dca31488-cf6b-48a6-94f4-830e7a65fecc" path="/var/lib/kubelet/pods/dca31488-cf6b-48a6-94f4-830e7a65fecc/volumes" Mar 18 08:37:25 crc kubenswrapper[4917]: I0318 08:37:25.791807 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eac39fa3-e042-46f8-a673-653191c0b588" path="/var/lib/kubelet/pods/eac39fa3-e042-46f8-a673-653191c0b588/volumes" Mar 18 08:37:36 crc kubenswrapper[4917]: I0318 08:37:36.024380 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-pjhws"] Mar 18 08:37:36 crc kubenswrapper[4917]: I0318 08:37:36.032356 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-pjhws"] Mar 18 08:37:37 crc 
kubenswrapper[4917]: I0318 08:37:37.794501 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3609520-f250-40d8-8444-360d760da047" path="/var/lib/kubelet/pods/f3609520-f250-40d8-8444-360d760da047/volumes" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.151212 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563718-q7wfx"] Mar 18 08:38:00 crc kubenswrapper[4917]: E0318 08:38:00.152681 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerName="extract-utilities" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.152699 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerName="extract-utilities" Mar 18 08:38:00 crc kubenswrapper[4917]: E0318 08:38:00.152734 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerName="registry-server" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.152743 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerName="registry-server" Mar 18 08:38:00 crc kubenswrapper[4917]: E0318 08:38:00.152764 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerName="extract-content" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.152771 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerName="extract-content" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.153057 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c4e97c-8c87-4213-9d0d-fe25eb31093c" containerName="registry-server" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.153985 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.156877 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.157205 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.157381 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.166168 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563718-q7wfx"] Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.250313 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j68xr\" (UniqueName: \"kubernetes.io/projected/3765ce07-a7b5-4d0f-8924-e670428ff79e-kube-api-access-j68xr\") pod \"auto-csr-approver-29563718-q7wfx\" (UID: \"3765ce07-a7b5-4d0f-8924-e670428ff79e\") " pod="openshift-infra/auto-csr-approver-29563718-q7wfx" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.353073 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j68xr\" (UniqueName: \"kubernetes.io/projected/3765ce07-a7b5-4d0f-8924-e670428ff79e-kube-api-access-j68xr\") pod \"auto-csr-approver-29563718-q7wfx\" (UID: \"3765ce07-a7b5-4d0f-8924-e670428ff79e\") " pod="openshift-infra/auto-csr-approver-29563718-q7wfx" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.373309 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j68xr\" (UniqueName: \"kubernetes.io/projected/3765ce07-a7b5-4d0f-8924-e670428ff79e-kube-api-access-j68xr\") pod \"auto-csr-approver-29563718-q7wfx\" (UID: \"3765ce07-a7b5-4d0f-8924-e670428ff79e\") " 
pod="openshift-infra/auto-csr-approver-29563718-q7wfx" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.475069 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.939976 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563718-q7wfx"] Mar 18 08:38:00 crc kubenswrapper[4917]: W0318 08:38:00.944493 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3765ce07_a7b5_4d0f_8924_e670428ff79e.slice/crio-4ebfb31c1e46e29fc9f17714f6b4350c6b54d8b5395af9929383498de767b051 WatchSource:0}: Error finding container 4ebfb31c1e46e29fc9f17714f6b4350c6b54d8b5395af9929383498de767b051: Status 404 returned error can't find the container with id 4ebfb31c1e46e29fc9f17714f6b4350c6b54d8b5395af9929383498de767b051 Mar 18 08:38:00 crc kubenswrapper[4917]: I0318 08:38:00.948844 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:38:01 crc kubenswrapper[4917]: I0318 08:38:01.831535 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" event={"ID":"3765ce07-a7b5-4d0f-8924-e670428ff79e","Type":"ContainerStarted","Data":"4ebfb31c1e46e29fc9f17714f6b4350c6b54d8b5395af9929383498de767b051"} Mar 18 08:38:02 crc kubenswrapper[4917]: I0318 08:38:02.843138 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" event={"ID":"3765ce07-a7b5-4d0f-8924-e670428ff79e","Type":"ContainerStarted","Data":"0cdac4b6b4578e1b72cb63e50201589245d63fde5cc6140bf83b1ed716913a8c"} Mar 18 08:38:02 crc kubenswrapper[4917]: I0318 08:38:02.867276 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" 
podStartSLOduration=1.5398964469999998 podStartE2EDuration="2.867249183s" podCreationTimestamp="2026-03-18 08:38:00 +0000 UTC" firstStartedPulling="2026-03-18 08:38:00.948648943 +0000 UTC m=+6665.889803657" lastFinishedPulling="2026-03-18 08:38:02.276001669 +0000 UTC m=+6667.217156393" observedRunningTime="2026-03-18 08:38:02.855617121 +0000 UTC m=+6667.796771845" watchObservedRunningTime="2026-03-18 08:38:02.867249183 +0000 UTC m=+6667.808403907" Mar 18 08:38:03 crc kubenswrapper[4917]: I0318 08:38:03.860105 4917 generic.go:334] "Generic (PLEG): container finished" podID="3765ce07-a7b5-4d0f-8924-e670428ff79e" containerID="0cdac4b6b4578e1b72cb63e50201589245d63fde5cc6140bf83b1ed716913a8c" exitCode=0 Mar 18 08:38:03 crc kubenswrapper[4917]: I0318 08:38:03.860186 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" event={"ID":"3765ce07-a7b5-4d0f-8924-e670428ff79e","Type":"ContainerDied","Data":"0cdac4b6b4578e1b72cb63e50201589245d63fde5cc6140bf83b1ed716913a8c"} Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.396931 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.471908 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j68xr\" (UniqueName: \"kubernetes.io/projected/3765ce07-a7b5-4d0f-8924-e670428ff79e-kube-api-access-j68xr\") pod \"3765ce07-a7b5-4d0f-8924-e670428ff79e\" (UID: \"3765ce07-a7b5-4d0f-8924-e670428ff79e\") " Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.481751 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3765ce07-a7b5-4d0f-8924-e670428ff79e-kube-api-access-j68xr" (OuterVolumeSpecName: "kube-api-access-j68xr") pod "3765ce07-a7b5-4d0f-8924-e670428ff79e" (UID: "3765ce07-a7b5-4d0f-8924-e670428ff79e"). 
InnerVolumeSpecName "kube-api-access-j68xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.576243 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j68xr\" (UniqueName: \"kubernetes.io/projected/3765ce07-a7b5-4d0f-8924-e670428ff79e-kube-api-access-j68xr\") on node \"crc\" DevicePath \"\"" Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.883299 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" event={"ID":"3765ce07-a7b5-4d0f-8924-e670428ff79e","Type":"ContainerDied","Data":"4ebfb31c1e46e29fc9f17714f6b4350c6b54d8b5395af9929383498de767b051"} Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.883342 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ebfb31c1e46e29fc9f17714f6b4350c6b54d8b5395af9929383498de767b051" Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.883397 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563718-q7wfx" Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.942023 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563712-zmcdj"] Mar 18 08:38:05 crc kubenswrapper[4917]: I0318 08:38:05.950847 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563712-zmcdj"] Mar 18 08:38:07 crc kubenswrapper[4917]: I0318 08:38:07.795484 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa089f54-03e3-45d1-9b06-f703a5c95062" path="/var/lib/kubelet/pods/aa089f54-03e3-45d1-9b06-f703a5c95062/volumes" Mar 18 08:38:13 crc kubenswrapper[4917]: I0318 08:38:13.575286 4917 scope.go:117] "RemoveContainer" containerID="741caf756728c20fa11c8bd13e56644318a18214db39bb7cf7939ba51a7d80af" Mar 18 08:38:13 crc kubenswrapper[4917]: I0318 08:38:13.664521 4917 scope.go:117] "RemoveContainer" containerID="55fd0823b2fcc11378ca57c8fc4624d7b03dce26497cdb0478f2b860b815b0c6" Mar 18 08:38:13 crc kubenswrapper[4917]: I0318 08:38:13.701046 4917 scope.go:117] "RemoveContainer" containerID="a77cc6e18818ef3c07a0b9b6677f8ee986aa2713960c8607533e0303a355f71e" Mar 18 08:38:13 crc kubenswrapper[4917]: I0318 08:38:13.743045 4917 scope.go:117] "RemoveContainer" containerID="c24c3c39aa319b019ca3da58ed6a1b3f24ef525292b15bf978f76eb634bc8e0c" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.069590 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hb2w2"] Mar 18 08:38:41 crc kubenswrapper[4917]: E0318 08:38:41.070360 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3765ce07-a7b5-4d0f-8924-e670428ff79e" containerName="oc" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.070374 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3765ce07-a7b5-4d0f-8924-e670428ff79e" containerName="oc" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.070573 4917 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3765ce07-a7b5-4d0f-8924-e670428ff79e" containerName="oc" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.071891 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.089406 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hb2w2"] Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.181779 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-catalog-content\") pod \"redhat-operators-hb2w2\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.181857 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-utilities\") pod \"redhat-operators-hb2w2\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.181927 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpmfz\" (UniqueName: \"kubernetes.io/projected/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-kube-api-access-kpmfz\") pod \"redhat-operators-hb2w2\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.283498 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-catalog-content\") pod 
\"redhat-operators-hb2w2\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.283645 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-utilities\") pod \"redhat-operators-hb2w2\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.284057 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-catalog-content\") pod \"redhat-operators-hb2w2\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.284094 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-utilities\") pod \"redhat-operators-hb2w2\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.284221 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpmfz\" (UniqueName: \"kubernetes.io/projected/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-kube-api-access-kpmfz\") pod \"redhat-operators-hb2w2\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.307368 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpmfz\" (UniqueName: \"kubernetes.io/projected/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-kube-api-access-kpmfz\") pod \"redhat-operators-hb2w2\" (UID: 
\"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.390229 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:41 crc kubenswrapper[4917]: I0318 08:38:41.841144 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hb2w2"] Mar 18 08:38:41 crc kubenswrapper[4917]: W0318 08:38:41.847777 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f0cee0c_9d8e_40e4_b21c_1be0104415e6.slice/crio-46e53fab6babeb0a721189f75435b7e76edf610ec64756ef6a1475b011d92ba5 WatchSource:0}: Error finding container 46e53fab6babeb0a721189f75435b7e76edf610ec64756ef6a1475b011d92ba5: Status 404 returned error can't find the container with id 46e53fab6babeb0a721189f75435b7e76edf610ec64756ef6a1475b011d92ba5 Mar 18 08:38:42 crc kubenswrapper[4917]: I0318 08:38:42.262725 4917 generic.go:334] "Generic (PLEG): container finished" podID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerID="fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce" exitCode=0 Mar 18 08:38:42 crc kubenswrapper[4917]: I0318 08:38:42.262767 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hb2w2" event={"ID":"0f0cee0c-9d8e-40e4-b21c-1be0104415e6","Type":"ContainerDied","Data":"fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce"} Mar 18 08:38:42 crc kubenswrapper[4917]: I0318 08:38:42.262789 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hb2w2" event={"ID":"0f0cee0c-9d8e-40e4-b21c-1be0104415e6","Type":"ContainerStarted","Data":"46e53fab6babeb0a721189f75435b7e76edf610ec64756ef6a1475b011d92ba5"} Mar 18 08:38:43 crc kubenswrapper[4917]: I0318 08:38:43.276863 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-hb2w2" event={"ID":"0f0cee0c-9d8e-40e4-b21c-1be0104415e6","Type":"ContainerStarted","Data":"03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c"} Mar 18 08:38:49 crc kubenswrapper[4917]: I0318 08:38:49.368542 4917 generic.go:334] "Generic (PLEG): container finished" podID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerID="03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c" exitCode=0 Mar 18 08:38:49 crc kubenswrapper[4917]: I0318 08:38:49.368704 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hb2w2" event={"ID":"0f0cee0c-9d8e-40e4-b21c-1be0104415e6","Type":"ContainerDied","Data":"03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c"} Mar 18 08:38:50 crc kubenswrapper[4917]: I0318 08:38:50.383177 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hb2w2" event={"ID":"0f0cee0c-9d8e-40e4-b21c-1be0104415e6","Type":"ContainerStarted","Data":"2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700"} Mar 18 08:38:50 crc kubenswrapper[4917]: I0318 08:38:50.416912 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hb2w2" podStartSLOduration=1.92143395 podStartE2EDuration="9.416891386s" podCreationTimestamp="2026-03-18 08:38:41 +0000 UTC" firstStartedPulling="2026-03-18 08:38:42.264275855 +0000 UTC m=+6707.205430569" lastFinishedPulling="2026-03-18 08:38:49.759733291 +0000 UTC m=+6714.700888005" observedRunningTime="2026-03-18 08:38:50.407442086 +0000 UTC m=+6715.348596810" watchObservedRunningTime="2026-03-18 08:38:50.416891386 +0000 UTC m=+6715.358046110" Mar 18 08:38:51 crc kubenswrapper[4917]: I0318 08:38:51.391646 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:51 crc kubenswrapper[4917]: I0318 08:38:51.392085 4917 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:38:52 crc kubenswrapper[4917]: I0318 08:38:52.448628 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hb2w2" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="registry-server" probeResult="failure" output=< Mar 18 08:38:52 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:38:52 crc kubenswrapper[4917]: > Mar 18 08:39:02 crc kubenswrapper[4917]: I0318 08:39:02.452702 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hb2w2" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="registry-server" probeResult="failure" output=< Mar 18 08:39:02 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:39:02 crc kubenswrapper[4917]: > Mar 18 08:39:09 crc kubenswrapper[4917]: I0318 08:39:09.591075 4917 generic.go:334] "Generic (PLEG): container finished" podID="afc5f89c-2aa9-4a99-8fda-794ab379f98a" containerID="71e5c818b04e35a627659c7237cf34f3f4b23bc99404b2fc84e0d59f20beb4df" exitCode=0 Mar 18 08:39:09 crc kubenswrapper[4917]: I0318 08:39:09.591189 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" event={"ID":"afc5f89c-2aa9-4a99-8fda-794ab379f98a","Type":"ContainerDied","Data":"71e5c818b04e35a627659c7237cf34f3f4b23bc99404b2fc84e0d59f20beb4df"} Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.127290 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.237699 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2cc7\" (UniqueName: \"kubernetes.io/projected/afc5f89c-2aa9-4a99-8fda-794ab379f98a-kube-api-access-d2cc7\") pod \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.237977 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-inventory\") pod \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.238152 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-ssh-key-openstack-cell1\") pod \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.238247 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-tripleo-cleanup-combined-ca-bundle\") pod \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\" (UID: \"afc5f89c-2aa9-4a99-8fda-794ab379f98a\") " Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.245113 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afc5f89c-2aa9-4a99-8fda-794ab379f98a-kube-api-access-d2cc7" (OuterVolumeSpecName: "kube-api-access-d2cc7") pod "afc5f89c-2aa9-4a99-8fda-794ab379f98a" (UID: "afc5f89c-2aa9-4a99-8fda-794ab379f98a"). InnerVolumeSpecName "kube-api-access-d2cc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.248746 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "afc5f89c-2aa9-4a99-8fda-794ab379f98a" (UID: "afc5f89c-2aa9-4a99-8fda-794ab379f98a"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.295245 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "afc5f89c-2aa9-4a99-8fda-794ab379f98a" (UID: "afc5f89c-2aa9-4a99-8fda-794ab379f98a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.295850 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-inventory" (OuterVolumeSpecName: "inventory") pod "afc5f89c-2aa9-4a99-8fda-794ab379f98a" (UID: "afc5f89c-2aa9-4a99-8fda-794ab379f98a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.340403 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.340436 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.340449 4917 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afc5f89c-2aa9-4a99-8fda-794ab379f98a-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.340461 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2cc7\" (UniqueName: \"kubernetes.io/projected/afc5f89c-2aa9-4a99-8fda-794ab379f98a-kube-api-access-d2cc7\") on node \"crc\" DevicePath \"\"" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.443997 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.490864 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.612271 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" event={"ID":"afc5f89c-2aa9-4a99-8fda-794ab379f98a","Type":"ContainerDied","Data":"e285b92613bef04d979b0fa882b982ccd7fdbddba44b6c4669788980dafc2a9c"} Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.612301 4917 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj" Mar 18 08:39:11 crc kubenswrapper[4917]: I0318 08:39:11.612319 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e285b92613bef04d979b0fa882b982ccd7fdbddba44b6c4669788980dafc2a9c" Mar 18 08:39:12 crc kubenswrapper[4917]: I0318 08:39:12.275968 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hb2w2"] Mar 18 08:39:12 crc kubenswrapper[4917]: I0318 08:39:12.625103 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hb2w2" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="registry-server" containerID="cri-o://2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700" gracePeriod=2 Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.104238 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.289731 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-utilities\") pod \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.289973 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpmfz\" (UniqueName: \"kubernetes.io/projected/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-kube-api-access-kpmfz\") pod \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.290044 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-catalog-content\") pod \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\" (UID: \"0f0cee0c-9d8e-40e4-b21c-1be0104415e6\") " Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.291145 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-utilities" (OuterVolumeSpecName: "utilities") pod "0f0cee0c-9d8e-40e4-b21c-1be0104415e6" (UID: "0f0cee0c-9d8e-40e4-b21c-1be0104415e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.299240 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-kube-api-access-kpmfz" (OuterVolumeSpecName: "kube-api-access-kpmfz") pod "0f0cee0c-9d8e-40e4-b21c-1be0104415e6" (UID: "0f0cee0c-9d8e-40e4-b21c-1be0104415e6"). InnerVolumeSpecName "kube-api-access-kpmfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.392818 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpmfz\" (UniqueName: \"kubernetes.io/projected/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-kube-api-access-kpmfz\") on node \"crc\" DevicePath \"\"" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.392856 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.426737 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f0cee0c-9d8e-40e4-b21c-1be0104415e6" (UID: "0f0cee0c-9d8e-40e4-b21c-1be0104415e6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.495166 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f0cee0c-9d8e-40e4-b21c-1be0104415e6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.638173 4917 generic.go:334] "Generic (PLEG): container finished" podID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerID="2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700" exitCode=0 Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.638258 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hb2w2" event={"ID":"0f0cee0c-9d8e-40e4-b21c-1be0104415e6","Type":"ContainerDied","Data":"2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700"} Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.638322 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hb2w2" event={"ID":"0f0cee0c-9d8e-40e4-b21c-1be0104415e6","Type":"ContainerDied","Data":"46e53fab6babeb0a721189f75435b7e76edf610ec64756ef6a1475b011d92ba5"} Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.638320 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hb2w2" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.638342 4917 scope.go:117] "RemoveContainer" containerID="2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.664173 4917 scope.go:117] "RemoveContainer" containerID="03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.686334 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hb2w2"] Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.696765 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hb2w2"] Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.712752 4917 scope.go:117] "RemoveContainer" containerID="fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.751076 4917 scope.go:117] "RemoveContainer" containerID="2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700" Mar 18 08:39:13 crc kubenswrapper[4917]: E0318 08:39:13.751504 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700\": container with ID starting with 2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700 not found: ID does not exist" containerID="2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.751695 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700"} err="failed to get container status \"2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700\": rpc error: code = NotFound desc = could not find container 
\"2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700\": container with ID starting with 2639f73c53071c2426d1bd420d3e6e511e8d7143e54b78a53f8b699f523cd700 not found: ID does not exist" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.751786 4917 scope.go:117] "RemoveContainer" containerID="03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c" Mar 18 08:39:13 crc kubenswrapper[4917]: E0318 08:39:13.752290 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c\": container with ID starting with 03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c not found: ID does not exist" containerID="03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.752345 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c"} err="failed to get container status \"03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c\": rpc error: code = NotFound desc = could not find container \"03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c\": container with ID starting with 03572f623fd7ef8d56dfd615eb9f396665706cdf06ae87979ad48649eab90b4c not found: ID does not exist" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.752378 4917 scope.go:117] "RemoveContainer" containerID="fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce" Mar 18 08:39:13 crc kubenswrapper[4917]: E0318 08:39:13.752747 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce\": container with ID starting with fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce not found: ID does not exist" 
containerID="fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.752859 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce"} err="failed to get container status \"fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce\": rpc error: code = NotFound desc = could not find container \"fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce\": container with ID starting with fdd5c98a3b2d89c94a60cf67368814f6d502f707a885d505bc0933b997ac26ce not found: ID does not exist" Mar 18 08:39:13 crc kubenswrapper[4917]: I0318 08:39:13.787854 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" path="/var/lib/kubelet/pods/0f0cee0c-9d8e-40e4-b21c-1be0104415e6/volumes" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.311078 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-gzhp8"] Mar 18 08:39:25 crc kubenswrapper[4917]: E0318 08:39:25.312376 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="extract-utilities" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.312403 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="extract-utilities" Mar 18 08:39:25 crc kubenswrapper[4917]: E0318 08:39:25.312425 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afc5f89c-2aa9-4a99-8fda-794ab379f98a" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.312438 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="afc5f89c-2aa9-4a99-8fda-794ab379f98a" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 08:39:25 crc kubenswrapper[4917]: 
E0318 08:39:25.312461 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="registry-server" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.312471 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="registry-server" Mar 18 08:39:25 crc kubenswrapper[4917]: E0318 08:39:25.312504 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="extract-content" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.312516 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="extract-content" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.312850 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="afc5f89c-2aa9-4a99-8fda-794ab379f98a" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.312878 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f0cee0c-9d8e-40e4-b21c-1be0104415e6" containerName="registry-server" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.315767 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.319466 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.319980 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.321028 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.323303 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.338692 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-gzhp8"] Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.460054 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvn8b\" (UniqueName: \"kubernetes.io/projected/637c5a5e-39a1-498f-b95f-863f64074f0b-kube-api-access-mvn8b\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.460376 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-inventory\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.460546 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.460674 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.562854 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvn8b\" (UniqueName: \"kubernetes.io/projected/637c5a5e-39a1-498f-b95f-863f64074f0b-kube-api-access-mvn8b\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.562979 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-inventory\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.563043 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.563099 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.573980 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-inventory\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.589672 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.589694 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.596676 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvn8b\" (UniqueName: \"kubernetes.io/projected/637c5a5e-39a1-498f-b95f-863f64074f0b-kube-api-access-mvn8b\") 
pod \"bootstrap-openstack-openstack-cell1-gzhp8\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:25 crc kubenswrapper[4917]: I0318 08:39:25.656229 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:39:26 crc kubenswrapper[4917]: I0318 08:39:26.257781 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-gzhp8"] Mar 18 08:39:26 crc kubenswrapper[4917]: I0318 08:39:26.839084 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" event={"ID":"637c5a5e-39a1-498f-b95f-863f64074f0b","Type":"ContainerStarted","Data":"63885ef29ae48583b1c7fb8a2f05b585e655a09a7affb2d92cb6cda0b92647b2"} Mar 18 08:39:27 crc kubenswrapper[4917]: I0318 08:39:27.850921 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" event={"ID":"637c5a5e-39a1-498f-b95f-863f64074f0b","Type":"ContainerStarted","Data":"92cd1820eb56929b8981fbfb3cfdcf548741c59b43f7acd8994b56c01cf1de1c"} Mar 18 08:39:27 crc kubenswrapper[4917]: I0318 08:39:27.873212 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" podStartSLOduration=2.3681971969999998 podStartE2EDuration="2.873192748s" podCreationTimestamp="2026-03-18 08:39:25 +0000 UTC" firstStartedPulling="2026-03-18 08:39:26.26731505 +0000 UTC m=+6751.208469764" lastFinishedPulling="2026-03-18 08:39:26.772310601 +0000 UTC m=+6751.713465315" observedRunningTime="2026-03-18 08:39:27.869718744 +0000 UTC m=+6752.810873508" watchObservedRunningTime="2026-03-18 08:39:27.873192748 +0000 UTC m=+6752.814347472" Mar 18 08:39:32 crc kubenswrapper[4917]: I0318 08:39:32.929189 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:39:32 crc kubenswrapper[4917]: I0318 08:39:32.929809 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.165473 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563720-l587q"] Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.167723 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563720-l587q" Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.170563 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.170652 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.170973 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.180301 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563720-l587q"] Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.249471 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlvqd\" (UniqueName: \"kubernetes.io/projected/050aec6a-c49d-41d0-b42e-d8f5e4564bde-kube-api-access-dlvqd\") pod \"auto-csr-approver-29563720-l587q\" (UID: 
\"050aec6a-c49d-41d0-b42e-d8f5e4564bde\") " pod="openshift-infra/auto-csr-approver-29563720-l587q" Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.351131 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlvqd\" (UniqueName: \"kubernetes.io/projected/050aec6a-c49d-41d0-b42e-d8f5e4564bde-kube-api-access-dlvqd\") pod \"auto-csr-approver-29563720-l587q\" (UID: \"050aec6a-c49d-41d0-b42e-d8f5e4564bde\") " pod="openshift-infra/auto-csr-approver-29563720-l587q" Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.371384 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlvqd\" (UniqueName: \"kubernetes.io/projected/050aec6a-c49d-41d0-b42e-d8f5e4564bde-kube-api-access-dlvqd\") pod \"auto-csr-approver-29563720-l587q\" (UID: \"050aec6a-c49d-41d0-b42e-d8f5e4564bde\") " pod="openshift-infra/auto-csr-approver-29563720-l587q" Mar 18 08:40:00 crc kubenswrapper[4917]: I0318 08:40:00.492322 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563720-l587q" Mar 18 08:40:01 crc kubenswrapper[4917]: I0318 08:40:01.022512 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563720-l587q"] Mar 18 08:40:01 crc kubenswrapper[4917]: I0318 08:40:01.306872 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563720-l587q" event={"ID":"050aec6a-c49d-41d0-b42e-d8f5e4564bde","Type":"ContainerStarted","Data":"62e142b4a9f1048643e53208b7fa8217ab1d1578aff34bb81d556f160a369ccd"} Mar 18 08:40:02 crc kubenswrapper[4917]: I0318 08:40:02.929569 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:40:02 crc kubenswrapper[4917]: I0318 08:40:02.930060 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:40:03 crc kubenswrapper[4917]: I0318 08:40:03.329449 4917 generic.go:334] "Generic (PLEG): container finished" podID="050aec6a-c49d-41d0-b42e-d8f5e4564bde" containerID="93cb51db0190ff6873f64edad6e487bcd587d3e42e019c7e2e6081b6d02e99c8" exitCode=0 Mar 18 08:40:03 crc kubenswrapper[4917]: I0318 08:40:03.329494 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563720-l587q" event={"ID":"050aec6a-c49d-41d0-b42e-d8f5e4564bde","Type":"ContainerDied","Data":"93cb51db0190ff6873f64edad6e487bcd587d3e42e019c7e2e6081b6d02e99c8"} Mar 18 08:40:04 crc kubenswrapper[4917]: I0318 08:40:04.815755 4917 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563720-l587q" Mar 18 08:40:04 crc kubenswrapper[4917]: I0318 08:40:04.961457 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlvqd\" (UniqueName: \"kubernetes.io/projected/050aec6a-c49d-41d0-b42e-d8f5e4564bde-kube-api-access-dlvqd\") pod \"050aec6a-c49d-41d0-b42e-d8f5e4564bde\" (UID: \"050aec6a-c49d-41d0-b42e-d8f5e4564bde\") " Mar 18 08:40:04 crc kubenswrapper[4917]: I0318 08:40:04.970475 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050aec6a-c49d-41d0-b42e-d8f5e4564bde-kube-api-access-dlvqd" (OuterVolumeSpecName: "kube-api-access-dlvqd") pod "050aec6a-c49d-41d0-b42e-d8f5e4564bde" (UID: "050aec6a-c49d-41d0-b42e-d8f5e4564bde"). InnerVolumeSpecName "kube-api-access-dlvqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:40:05 crc kubenswrapper[4917]: I0318 08:40:05.063797 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlvqd\" (UniqueName: \"kubernetes.io/projected/050aec6a-c49d-41d0-b42e-d8f5e4564bde-kube-api-access-dlvqd\") on node \"crc\" DevicePath \"\"" Mar 18 08:40:05 crc kubenswrapper[4917]: I0318 08:40:05.352880 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563720-l587q" event={"ID":"050aec6a-c49d-41d0-b42e-d8f5e4564bde","Type":"ContainerDied","Data":"62e142b4a9f1048643e53208b7fa8217ab1d1578aff34bb81d556f160a369ccd"} Mar 18 08:40:05 crc kubenswrapper[4917]: I0318 08:40:05.352927 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62e142b4a9f1048643e53208b7fa8217ab1d1578aff34bb81d556f160a369ccd" Mar 18 08:40:05 crc kubenswrapper[4917]: I0318 08:40:05.352939 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563720-l587q" Mar 18 08:40:05 crc kubenswrapper[4917]: I0318 08:40:05.913059 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563714-8nt7f"] Mar 18 08:40:05 crc kubenswrapper[4917]: I0318 08:40:05.927766 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563714-8nt7f"] Mar 18 08:40:07 crc kubenswrapper[4917]: I0318 08:40:07.790409 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f42d14-be42-4a2b-945d-d768f6232280" path="/var/lib/kubelet/pods/89f42d14-be42-4a2b-945d-d768f6232280/volumes" Mar 18 08:40:13 crc kubenswrapper[4917]: I0318 08:40:13.870759 4917 scope.go:117] "RemoveContainer" containerID="72da59dd220a82ba05a8c9953a9195c199ca4e26cb47d5940265beff3651b234" Mar 18 08:40:32 crc kubenswrapper[4917]: I0318 08:40:32.929558 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:40:32 crc kubenswrapper[4917]: I0318 08:40:32.930241 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:40:32 crc kubenswrapper[4917]: I0318 08:40:32.930290 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:40:32 crc kubenswrapper[4917]: I0318 08:40:32.931202 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:40:32 crc kubenswrapper[4917]: I0318 08:40:32.931282 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" gracePeriod=600 Mar 18 08:40:33 crc kubenswrapper[4917]: E0318 08:40:33.060253 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:40:33 crc kubenswrapper[4917]: I0318 08:40:33.695504 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" exitCode=0 Mar 18 08:40:33 crc kubenswrapper[4917]: I0318 08:40:33.695573 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860"} Mar 18 08:40:33 crc kubenswrapper[4917]: I0318 08:40:33.695710 4917 scope.go:117] "RemoveContainer" containerID="845a384dbbd152118e1d1f211f7ebf1b3f540d7bed6649dd90db902b2bd0a837" Mar 18 08:40:33 crc kubenswrapper[4917]: I0318 08:40:33.696800 4917 
scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:40:33 crc kubenswrapper[4917]: E0318 08:40:33.697529 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:40:46 crc kubenswrapper[4917]: I0318 08:40:46.772524 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:40:46 crc kubenswrapper[4917]: E0318 08:40:46.773496 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:40:57 crc kubenswrapper[4917]: I0318 08:40:57.774537 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:40:57 crc kubenswrapper[4917]: E0318 08:40:57.775633 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:41:09 crc kubenswrapper[4917]: I0318 
08:41:09.772403 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:41:09 crc kubenswrapper[4917]: E0318 08:41:09.773228 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:41:20 crc kubenswrapper[4917]: I0318 08:41:20.772478 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:41:20 crc kubenswrapper[4917]: E0318 08:41:20.773146 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:41:32 crc kubenswrapper[4917]: I0318 08:41:32.772873 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:41:32 crc kubenswrapper[4917]: E0318 08:41:32.773660 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:41:46 crc 
kubenswrapper[4917]: I0318 08:41:46.772572 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:41:46 crc kubenswrapper[4917]: E0318 08:41:46.773541 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:41:58 crc kubenswrapper[4917]: I0318 08:41:58.773289 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:41:58 crc kubenswrapper[4917]: E0318 08:41:58.774285 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.137472 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563722-qm2lc"] Mar 18 08:42:00 crc kubenswrapper[4917]: E0318 08:42:00.137904 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050aec6a-c49d-41d0-b42e-d8f5e4564bde" containerName="oc" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.137916 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="050aec6a-c49d-41d0-b42e-d8f5e4564bde" containerName="oc" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.138144 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="050aec6a-c49d-41d0-b42e-d8f5e4564bde" containerName="oc" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.138815 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563722-qm2lc" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.141921 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.142094 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.142224 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.148061 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563722-qm2lc"] Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.162507 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lc2fm\" (UniqueName: \"kubernetes.io/projected/009d965f-f5bc-4113-883e-a45f2a4c4f04-kube-api-access-lc2fm\") pod \"auto-csr-approver-29563722-qm2lc\" (UID: \"009d965f-f5bc-4113-883e-a45f2a4c4f04\") " pod="openshift-infra/auto-csr-approver-29563722-qm2lc" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.265430 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc2fm\" (UniqueName: \"kubernetes.io/projected/009d965f-f5bc-4113-883e-a45f2a4c4f04-kube-api-access-lc2fm\") pod \"auto-csr-approver-29563722-qm2lc\" (UID: \"009d965f-f5bc-4113-883e-a45f2a4c4f04\") " pod="openshift-infra/auto-csr-approver-29563722-qm2lc" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.286303 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc2fm\" (UniqueName: 
\"kubernetes.io/projected/009d965f-f5bc-4113-883e-a45f2a4c4f04-kube-api-access-lc2fm\") pod \"auto-csr-approver-29563722-qm2lc\" (UID: \"009d965f-f5bc-4113-883e-a45f2a4c4f04\") " pod="openshift-infra/auto-csr-approver-29563722-qm2lc" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.457969 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563722-qm2lc" Mar 18 08:42:00 crc kubenswrapper[4917]: I0318 08:42:00.918131 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563722-qm2lc"] Mar 18 08:42:00 crc kubenswrapper[4917]: W0318 08:42:00.919110 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod009d965f_f5bc_4113_883e_a45f2a4c4f04.slice/crio-72f984927162afbbb28b1e832fe34a2552457e6d34c3cba7ab0395108b9614bb WatchSource:0}: Error finding container 72f984927162afbbb28b1e832fe34a2552457e6d34c3cba7ab0395108b9614bb: Status 404 returned error can't find the container with id 72f984927162afbbb28b1e832fe34a2552457e6d34c3cba7ab0395108b9614bb Mar 18 08:42:01 crc kubenswrapper[4917]: I0318 08:42:01.706241 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563722-qm2lc" event={"ID":"009d965f-f5bc-4113-883e-a45f2a4c4f04","Type":"ContainerStarted","Data":"72f984927162afbbb28b1e832fe34a2552457e6d34c3cba7ab0395108b9614bb"} Mar 18 08:42:02 crc kubenswrapper[4917]: I0318 08:42:02.727082 4917 generic.go:334] "Generic (PLEG): container finished" podID="009d965f-f5bc-4113-883e-a45f2a4c4f04" containerID="9aabd2b95bdd2cd16ac6a50dc666d41df78a5f427b03ff833fa0c664190069de" exitCode=0 Mar 18 08:42:02 crc kubenswrapper[4917]: I0318 08:42:02.727537 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563722-qm2lc" 
event={"ID":"009d965f-f5bc-4113-883e-a45f2a4c4f04","Type":"ContainerDied","Data":"9aabd2b95bdd2cd16ac6a50dc666d41df78a5f427b03ff833fa0c664190069de"} Mar 18 08:42:04 crc kubenswrapper[4917]: I0318 08:42:04.185942 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563722-qm2lc" Mar 18 08:42:04 crc kubenswrapper[4917]: I0318 08:42:04.273173 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lc2fm\" (UniqueName: \"kubernetes.io/projected/009d965f-f5bc-4113-883e-a45f2a4c4f04-kube-api-access-lc2fm\") pod \"009d965f-f5bc-4113-883e-a45f2a4c4f04\" (UID: \"009d965f-f5bc-4113-883e-a45f2a4c4f04\") " Mar 18 08:42:04 crc kubenswrapper[4917]: I0318 08:42:04.299639 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009d965f-f5bc-4113-883e-a45f2a4c4f04-kube-api-access-lc2fm" (OuterVolumeSpecName: "kube-api-access-lc2fm") pod "009d965f-f5bc-4113-883e-a45f2a4c4f04" (UID: "009d965f-f5bc-4113-883e-a45f2a4c4f04"). InnerVolumeSpecName "kube-api-access-lc2fm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:42:04 crc kubenswrapper[4917]: I0318 08:42:04.374253 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lc2fm\" (UniqueName: \"kubernetes.io/projected/009d965f-f5bc-4113-883e-a45f2a4c4f04-kube-api-access-lc2fm\") on node \"crc\" DevicePath \"\"" Mar 18 08:42:04 crc kubenswrapper[4917]: I0318 08:42:04.754025 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563722-qm2lc" event={"ID":"009d965f-f5bc-4113-883e-a45f2a4c4f04","Type":"ContainerDied","Data":"72f984927162afbbb28b1e832fe34a2552457e6d34c3cba7ab0395108b9614bb"} Mar 18 08:42:04 crc kubenswrapper[4917]: I0318 08:42:04.754110 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f984927162afbbb28b1e832fe34a2552457e6d34c3cba7ab0395108b9614bb" Mar 18 08:42:04 crc kubenswrapper[4917]: I0318 08:42:04.754067 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563722-qm2lc" Mar 18 08:42:05 crc kubenswrapper[4917]: I0318 08:42:05.277120 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563716-djmfw"] Mar 18 08:42:05 crc kubenswrapper[4917]: I0318 08:42:05.284705 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563716-djmfw"] Mar 18 08:42:05 crc kubenswrapper[4917]: I0318 08:42:05.791940 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fce264c0-97a8-4ff0-8c7d-01658b2298c2" path="/var/lib/kubelet/pods/fce264c0-97a8-4ff0-8c7d-01658b2298c2/volumes" Mar 18 08:42:11 crc kubenswrapper[4917]: I0318 08:42:11.773294 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:42:11 crc kubenswrapper[4917]: E0318 08:42:11.774341 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:42:14 crc kubenswrapper[4917]: I0318 08:42:14.020506 4917 scope.go:117] "RemoveContainer" containerID="ff71ec2202ffe518ee2517a7cd2c8c03696791c37767ae1cb0c0f41b5b662fb1" Mar 18 08:42:14 crc kubenswrapper[4917]: I0318 08:42:14.076551 4917 scope.go:117] "RemoveContainer" containerID="1eec3c20ed5c19ca160c43ef38e8e84942a628030081b12e5f73816bfbacd2e1" Mar 18 08:42:14 crc kubenswrapper[4917]: I0318 08:42:14.186163 4917 scope.go:117] "RemoveContainer" containerID="4be9ac2bb9b366acec13e775f485026ca61c3d350af77761b1316be69bb8c814" Mar 18 08:42:14 crc kubenswrapper[4917]: I0318 08:42:14.212416 4917 scope.go:117] "RemoveContainer" containerID="825710ed77ac632dfac349b8caed9f06a8e97e3d20cd503cae750d67ec0e90d9" Mar 18 08:42:23 crc kubenswrapper[4917]: I0318 08:42:23.773846 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:42:23 crc kubenswrapper[4917]: E0318 08:42:23.774894 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:42:34 crc kubenswrapper[4917]: I0318 08:42:34.080432 4917 generic.go:334] "Generic (PLEG): container finished" podID="637c5a5e-39a1-498f-b95f-863f64074f0b" containerID="92cd1820eb56929b8981fbfb3cfdcf548741c59b43f7acd8994b56c01cf1de1c" 
exitCode=0 Mar 18 08:42:34 crc kubenswrapper[4917]: I0318 08:42:34.080604 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" event={"ID":"637c5a5e-39a1-498f-b95f-863f64074f0b","Type":"ContainerDied","Data":"92cd1820eb56929b8981fbfb3cfdcf548741c59b43f7acd8994b56c01cf1de1c"} Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.667268 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.818909 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-bootstrap-combined-ca-bundle\") pod \"637c5a5e-39a1-498f-b95f-863f64074f0b\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.819075 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvn8b\" (UniqueName: \"kubernetes.io/projected/637c5a5e-39a1-498f-b95f-863f64074f0b-kube-api-access-mvn8b\") pod \"637c5a5e-39a1-498f-b95f-863f64074f0b\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.819199 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-inventory\") pod \"637c5a5e-39a1-498f-b95f-863f64074f0b\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.819270 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-ssh-key-openstack-cell1\") pod \"637c5a5e-39a1-498f-b95f-863f64074f0b\" (UID: \"637c5a5e-39a1-498f-b95f-863f64074f0b\") " Mar 
18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.826013 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "637c5a5e-39a1-498f-b95f-863f64074f0b" (UID: "637c5a5e-39a1-498f-b95f-863f64074f0b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.851355 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637c5a5e-39a1-498f-b95f-863f64074f0b-kube-api-access-mvn8b" (OuterVolumeSpecName: "kube-api-access-mvn8b") pod "637c5a5e-39a1-498f-b95f-863f64074f0b" (UID: "637c5a5e-39a1-498f-b95f-863f64074f0b"). InnerVolumeSpecName "kube-api-access-mvn8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.852775 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "637c5a5e-39a1-498f-b95f-863f64074f0b" (UID: "637c5a5e-39a1-498f-b95f-863f64074f0b"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.855820 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-inventory" (OuterVolumeSpecName: "inventory") pod "637c5a5e-39a1-498f-b95f-863f64074f0b" (UID: "637c5a5e-39a1-498f-b95f-863f64074f0b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.922932 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.922961 4917 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.922971 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvn8b\" (UniqueName: \"kubernetes.io/projected/637c5a5e-39a1-498f-b95f-863f64074f0b-kube-api-access-mvn8b\") on node \"crc\" DevicePath \"\"" Mar 18 08:42:35 crc kubenswrapper[4917]: I0318 08:42:35.922981 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/637c5a5e-39a1-498f-b95f-863f64074f0b-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.105113 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" event={"ID":"637c5a5e-39a1-498f-b95f-863f64074f0b","Type":"ContainerDied","Data":"63885ef29ae48583b1c7fb8a2f05b585e655a09a7affb2d92cb6cda0b92647b2"} Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.105691 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63885ef29ae48583b1c7fb8a2f05b585e655a09a7affb2d92cb6cda0b92647b2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.105176 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-gzhp8" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.216107 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-96ck2"] Mar 18 08:42:36 crc kubenswrapper[4917]: E0318 08:42:36.216671 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637c5a5e-39a1-498f-b95f-863f64074f0b" containerName="bootstrap-openstack-openstack-cell1" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.216694 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="637c5a5e-39a1-498f-b95f-863f64074f0b" containerName="bootstrap-openstack-openstack-cell1" Mar 18 08:42:36 crc kubenswrapper[4917]: E0318 08:42:36.216707 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009d965f-f5bc-4113-883e-a45f2a4c4f04" containerName="oc" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.216717 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="009d965f-f5bc-4113-883e-a45f2a4c4f04" containerName="oc" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.216989 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="009d965f-f5bc-4113-883e-a45f2a4c4f04" containerName="oc" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.217024 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="637c5a5e-39a1-498f-b95f-863f64074f0b" containerName="bootstrap-openstack-openstack-cell1" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.218185 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.221269 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.221511 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.221754 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.221963 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.227769 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-96ck2"] Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.330713 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.331022 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-inventory\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.331110 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-m26qb\" (UniqueName: \"kubernetes.io/projected/56acaff3-9fc1-4c18-b49d-f2a025d5804b-kube-api-access-m26qb\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.433559 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-inventory\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.433642 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m26qb\" (UniqueName: \"kubernetes.io/projected/56acaff3-9fc1-4c18-b49d-f2a025d5804b-kube-api-access-m26qb\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.433765 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.437767 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-inventory\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " 
pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.439534 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.450098 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m26qb\" (UniqueName: \"kubernetes.io/projected/56acaff3-9fc1-4c18-b49d-f2a025d5804b-kube-api-access-m26qb\") pod \"download-cache-openstack-openstack-cell1-96ck2\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.539088 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:42:36 crc kubenswrapper[4917]: I0318 08:42:36.773346 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:42:36 crc kubenswrapper[4917]: E0318 08:42:36.773908 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:42:37 crc kubenswrapper[4917]: I0318 08:42:37.139943 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-96ck2"] Mar 18 08:42:38 crc kubenswrapper[4917]: I0318 08:42:38.127851 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" event={"ID":"56acaff3-9fc1-4c18-b49d-f2a025d5804b","Type":"ContainerStarted","Data":"921abe26d773c3ddc8ad375ba9389d6e03fb023f4ebd09d9fe62f1c7b02a69e2"} Mar 18 08:42:38 crc kubenswrapper[4917]: I0318 08:42:38.128191 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" event={"ID":"56acaff3-9fc1-4c18-b49d-f2a025d5804b","Type":"ContainerStarted","Data":"2188d0ce94668a1f6f541e5c3cae18088edd67a96d196c7d008e68ade48b292c"} Mar 18 08:42:38 crc kubenswrapper[4917]: I0318 08:42:38.152466 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" podStartSLOduration=1.7106914469999999 podStartE2EDuration="2.152448482s" podCreationTimestamp="2026-03-18 08:42:36 +0000 UTC" firstStartedPulling="2026-03-18 08:42:37.14607558 +0000 UTC 
m=+6942.087230294" lastFinishedPulling="2026-03-18 08:42:37.587832615 +0000 UTC m=+6942.528987329" observedRunningTime="2026-03-18 08:42:38.146663142 +0000 UTC m=+6943.087817896" watchObservedRunningTime="2026-03-18 08:42:38.152448482 +0000 UTC m=+6943.093603206" Mar 18 08:42:49 crc kubenswrapper[4917]: I0318 08:42:49.774124 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:42:49 crc kubenswrapper[4917]: E0318 08:42:49.775996 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:43:04 crc kubenswrapper[4917]: I0318 08:43:04.773217 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:43:04 crc kubenswrapper[4917]: E0318 08:43:04.774619 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:43:16 crc kubenswrapper[4917]: I0318 08:43:16.774039 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:43:16 crc kubenswrapper[4917]: E0318 08:43:16.775118 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.564394 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b8w4d"] Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.570279 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.585823 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8w4d"] Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.648701 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4m58\" (UniqueName: \"kubernetes.io/projected/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-kube-api-access-s4m58\") pod \"certified-operators-b8w4d\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.648879 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-utilities\") pod \"certified-operators-b8w4d\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.649019 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-catalog-content\") pod \"certified-operators-b8w4d\" (UID: 
\"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.750855 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-catalog-content\") pod \"certified-operators-b8w4d\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.750969 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4m58\" (UniqueName: \"kubernetes.io/projected/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-kube-api-access-s4m58\") pod \"certified-operators-b8w4d\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.751041 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-utilities\") pod \"certified-operators-b8w4d\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.751506 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-utilities\") pod \"certified-operators-b8w4d\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.751505 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-catalog-content\") pod \"certified-operators-b8w4d\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") 
" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.771623 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4m58\" (UniqueName: \"kubernetes.io/projected/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-kube-api-access-s4m58\") pod \"certified-operators-b8w4d\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:26 crc kubenswrapper[4917]: I0318 08:43:26.914769 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:27 crc kubenswrapper[4917]: I0318 08:43:27.405487 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8w4d"] Mar 18 08:43:27 crc kubenswrapper[4917]: I0318 08:43:27.622469 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8w4d" event={"ID":"ccd338de-f8e2-404c-88e7-c38ed4e8fa55","Type":"ContainerStarted","Data":"b27ece708d445945e694e19352002c1f6b5f17ea9868dc91ff69fa2ed45e5d41"} Mar 18 08:43:27 crc kubenswrapper[4917]: I0318 08:43:27.622770 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8w4d" event={"ID":"ccd338de-f8e2-404c-88e7-c38ed4e8fa55","Type":"ContainerStarted","Data":"f81c89aaef6872c821f44fa6befe45558502247f86aff5fbdb5b70f4b666bba0"} Mar 18 08:43:28 crc kubenswrapper[4917]: I0318 08:43:28.636507 4917 generic.go:334] "Generic (PLEG): container finished" podID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerID="b27ece708d445945e694e19352002c1f6b5f17ea9868dc91ff69fa2ed45e5d41" exitCode=0 Mar 18 08:43:28 crc kubenswrapper[4917]: I0318 08:43:28.636800 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8w4d" 
event={"ID":"ccd338de-f8e2-404c-88e7-c38ed4e8fa55","Type":"ContainerDied","Data":"b27ece708d445945e694e19352002c1f6b5f17ea9868dc91ff69fa2ed45e5d41"} Mar 18 08:43:28 crc kubenswrapper[4917]: I0318 08:43:28.642631 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:43:29 crc kubenswrapper[4917]: I0318 08:43:29.650397 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8w4d" event={"ID":"ccd338de-f8e2-404c-88e7-c38ed4e8fa55","Type":"ContainerStarted","Data":"1bdfe4c035b7d1e669065459ffe0b5a9ff738645c2ebc4cf3b66e447e17911ee"} Mar 18 08:43:29 crc kubenswrapper[4917]: I0318 08:43:29.773834 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:43:29 crc kubenswrapper[4917]: E0318 08:43:29.774166 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:43:31 crc kubenswrapper[4917]: I0318 08:43:31.677694 4917 generic.go:334] "Generic (PLEG): container finished" podID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerID="1bdfe4c035b7d1e669065459ffe0b5a9ff738645c2ebc4cf3b66e447e17911ee" exitCode=0 Mar 18 08:43:31 crc kubenswrapper[4917]: I0318 08:43:31.677779 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8w4d" event={"ID":"ccd338de-f8e2-404c-88e7-c38ed4e8fa55","Type":"ContainerDied","Data":"1bdfe4c035b7d1e669065459ffe0b5a9ff738645c2ebc4cf3b66e447e17911ee"} Mar 18 08:43:32 crc kubenswrapper[4917]: I0318 08:43:32.693445 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-b8w4d" event={"ID":"ccd338de-f8e2-404c-88e7-c38ed4e8fa55","Type":"ContainerStarted","Data":"7fb4c4867e57d1b9965e0b6e278fa80b2365895eab8173182cdde631e4152208"} Mar 18 08:43:32 crc kubenswrapper[4917]: I0318 08:43:32.732265 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b8w4d" podStartSLOduration=3.257160546 podStartE2EDuration="6.732238865s" podCreationTimestamp="2026-03-18 08:43:26 +0000 UTC" firstStartedPulling="2026-03-18 08:43:28.642124494 +0000 UTC m=+6993.583279248" lastFinishedPulling="2026-03-18 08:43:32.117202853 +0000 UTC m=+6997.058357567" observedRunningTime="2026-03-18 08:43:32.724504238 +0000 UTC m=+6997.665659032" watchObservedRunningTime="2026-03-18 08:43:32.732238865 +0000 UTC m=+6997.673393619" Mar 18 08:43:36 crc kubenswrapper[4917]: I0318 08:43:36.914936 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:36 crc kubenswrapper[4917]: I0318 08:43:36.915696 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:36 crc kubenswrapper[4917]: I0318 08:43:36.982519 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:37 crc kubenswrapper[4917]: I0318 08:43:37.844089 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:37 crc kubenswrapper[4917]: I0318 08:43:37.904322 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8w4d"] Mar 18 08:43:39 crc kubenswrapper[4917]: I0318 08:43:39.806253 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b8w4d" 
podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerName="registry-server" containerID="cri-o://7fb4c4867e57d1b9965e0b6e278fa80b2365895eab8173182cdde631e4152208" gracePeriod=2 Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.818783 4917 generic.go:334] "Generic (PLEG): container finished" podID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerID="7fb4c4867e57d1b9965e0b6e278fa80b2365895eab8173182cdde631e4152208" exitCode=0 Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.818968 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8w4d" event={"ID":"ccd338de-f8e2-404c-88e7-c38ed4e8fa55","Type":"ContainerDied","Data":"7fb4c4867e57d1b9965e0b6e278fa80b2365895eab8173182cdde631e4152208"} Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.819147 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8w4d" event={"ID":"ccd338de-f8e2-404c-88e7-c38ed4e8fa55","Type":"ContainerDied","Data":"f81c89aaef6872c821f44fa6befe45558502247f86aff5fbdb5b70f4b666bba0"} Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.819160 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f81c89aaef6872c821f44fa6befe45558502247f86aff5fbdb5b70f4b666bba0" Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.852729 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.915764 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-catalog-content\") pod \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.915956 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4m58\" (UniqueName: \"kubernetes.io/projected/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-kube-api-access-s4m58\") pod \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.917212 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-utilities\") pod \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\" (UID: \"ccd338de-f8e2-404c-88e7-c38ed4e8fa55\") " Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.919491 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-utilities" (OuterVolumeSpecName: "utilities") pod "ccd338de-f8e2-404c-88e7-c38ed4e8fa55" (UID: "ccd338de-f8e2-404c-88e7-c38ed4e8fa55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.925451 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-kube-api-access-s4m58" (OuterVolumeSpecName: "kube-api-access-s4m58") pod "ccd338de-f8e2-404c-88e7-c38ed4e8fa55" (UID: "ccd338de-f8e2-404c-88e7-c38ed4e8fa55"). InnerVolumeSpecName "kube-api-access-s4m58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:43:40 crc kubenswrapper[4917]: I0318 08:43:40.992487 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccd338de-f8e2-404c-88e7-c38ed4e8fa55" (UID: "ccd338de-f8e2-404c-88e7-c38ed4e8fa55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:43:41 crc kubenswrapper[4917]: I0318 08:43:41.020217 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:43:41 crc kubenswrapper[4917]: I0318 08:43:41.020248 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:43:41 crc kubenswrapper[4917]: I0318 08:43:41.020264 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4m58\" (UniqueName: \"kubernetes.io/projected/ccd338de-f8e2-404c-88e7-c38ed4e8fa55-kube-api-access-s4m58\") on node \"crc\" DevicePath \"\"" Mar 18 08:43:41 crc kubenswrapper[4917]: I0318 08:43:41.826282 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8w4d" Mar 18 08:43:41 crc kubenswrapper[4917]: I0318 08:43:41.851152 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8w4d"] Mar 18 08:43:41 crc kubenswrapper[4917]: I0318 08:43:41.860789 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b8w4d"] Mar 18 08:43:42 crc kubenswrapper[4917]: I0318 08:43:42.772304 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:43:42 crc kubenswrapper[4917]: E0318 08:43:42.772923 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:43:44 crc kubenswrapper[4917]: I0318 08:43:43.791099 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" path="/var/lib/kubelet/pods/ccd338de-f8e2-404c-88e7-c38ed4e8fa55/volumes" Mar 18 08:43:54 crc kubenswrapper[4917]: I0318 08:43:54.774776 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:43:54 crc kubenswrapper[4917]: E0318 08:43:54.776923 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.158226 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563724-vw87b"] Mar 18 08:44:00 crc kubenswrapper[4917]: E0318 08:44:00.159649 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerName="extract-content" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.159678 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerName="extract-content" Mar 18 08:44:00 crc kubenswrapper[4917]: E0318 08:44:00.159719 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerName="registry-server" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.159757 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerName="registry-server" Mar 18 08:44:00 crc kubenswrapper[4917]: E0318 08:44:00.159788 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerName="extract-utilities" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.159801 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerName="extract-utilities" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.160135 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccd338de-f8e2-404c-88e7-c38ed4e8fa55" containerName="registry-server" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.161386 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563724-vw87b" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.164969 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.165076 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.172381 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.176725 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563724-vw87b"] Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.256005 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f676s\" (UniqueName: \"kubernetes.io/projected/5a213707-ba5a-4473-8685-0933a627c94e-kube-api-access-f676s\") pod \"auto-csr-approver-29563724-vw87b\" (UID: \"5a213707-ba5a-4473-8685-0933a627c94e\") " pod="openshift-infra/auto-csr-approver-29563724-vw87b" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.359086 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f676s\" (UniqueName: \"kubernetes.io/projected/5a213707-ba5a-4473-8685-0933a627c94e-kube-api-access-f676s\") pod \"auto-csr-approver-29563724-vw87b\" (UID: \"5a213707-ba5a-4473-8685-0933a627c94e\") " pod="openshift-infra/auto-csr-approver-29563724-vw87b" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.385673 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f676s\" (UniqueName: \"kubernetes.io/projected/5a213707-ba5a-4473-8685-0933a627c94e-kube-api-access-f676s\") pod \"auto-csr-approver-29563724-vw87b\" (UID: \"5a213707-ba5a-4473-8685-0933a627c94e\") " 
pod="openshift-infra/auto-csr-approver-29563724-vw87b" Mar 18 08:44:00 crc kubenswrapper[4917]: I0318 08:44:00.491617 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563724-vw87b" Mar 18 08:44:01 crc kubenswrapper[4917]: I0318 08:44:01.016209 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563724-vw87b"] Mar 18 08:44:01 crc kubenswrapper[4917]: I0318 08:44:01.327167 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563724-vw87b" event={"ID":"5a213707-ba5a-4473-8685-0933a627c94e","Type":"ContainerStarted","Data":"e1ebc9d41f289de41206cabc981021ade12e8b99ee4c4e01ff0557ae4887f643"} Mar 18 08:44:04 crc kubenswrapper[4917]: I0318 08:44:04.376722 4917 generic.go:334] "Generic (PLEG): container finished" podID="5a213707-ba5a-4473-8685-0933a627c94e" containerID="951bb3483c1b723c7b3a395cbd51d6abb45ec3bee28c6cb3277a077d23079d9b" exitCode=0 Mar 18 08:44:04 crc kubenswrapper[4917]: I0318 08:44:04.376811 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563724-vw87b" event={"ID":"5a213707-ba5a-4473-8685-0933a627c94e","Type":"ContainerDied","Data":"951bb3483c1b723c7b3a395cbd51d6abb45ec3bee28c6cb3277a077d23079d9b"} Mar 18 08:44:05 crc kubenswrapper[4917]: I0318 08:44:05.817152 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563724-vw87b" Mar 18 08:44:05 crc kubenswrapper[4917]: I0318 08:44:05.987152 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f676s\" (UniqueName: \"kubernetes.io/projected/5a213707-ba5a-4473-8685-0933a627c94e-kube-api-access-f676s\") pod \"5a213707-ba5a-4473-8685-0933a627c94e\" (UID: \"5a213707-ba5a-4473-8685-0933a627c94e\") " Mar 18 08:44:06 crc kubenswrapper[4917]: I0318 08:44:06.031089 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a213707-ba5a-4473-8685-0933a627c94e-kube-api-access-f676s" (OuterVolumeSpecName: "kube-api-access-f676s") pod "5a213707-ba5a-4473-8685-0933a627c94e" (UID: "5a213707-ba5a-4473-8685-0933a627c94e"). InnerVolumeSpecName "kube-api-access-f676s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:44:06 crc kubenswrapper[4917]: I0318 08:44:06.089292 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f676s\" (UniqueName: \"kubernetes.io/projected/5a213707-ba5a-4473-8685-0933a627c94e-kube-api-access-f676s\") on node \"crc\" DevicePath \"\"" Mar 18 08:44:06 crc kubenswrapper[4917]: I0318 08:44:06.409242 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563724-vw87b" event={"ID":"5a213707-ba5a-4473-8685-0933a627c94e","Type":"ContainerDied","Data":"e1ebc9d41f289de41206cabc981021ade12e8b99ee4c4e01ff0557ae4887f643"} Mar 18 08:44:06 crc kubenswrapper[4917]: I0318 08:44:06.409710 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ebc9d41f289de41206cabc981021ade12e8b99ee4c4e01ff0557ae4887f643" Mar 18 08:44:06 crc kubenswrapper[4917]: I0318 08:44:06.409359 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563724-vw87b" Mar 18 08:44:06 crc kubenswrapper[4917]: I0318 08:44:06.892832 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563718-q7wfx"] Mar 18 08:44:06 crc kubenswrapper[4917]: I0318 08:44:06.903145 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563718-q7wfx"] Mar 18 08:44:07 crc kubenswrapper[4917]: I0318 08:44:07.774159 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:44:07 crc kubenswrapper[4917]: E0318 08:44:07.774801 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:44:07 crc kubenswrapper[4917]: I0318 08:44:07.790175 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3765ce07-a7b5-4d0f-8924-e670428ff79e" path="/var/lib/kubelet/pods/3765ce07-a7b5-4d0f-8924-e670428ff79e/volumes" Mar 18 08:44:14 crc kubenswrapper[4917]: I0318 08:44:14.342355 4917 scope.go:117] "RemoveContainer" containerID="0cdac4b6b4578e1b72cb63e50201589245d63fde5cc6140bf83b1ed716913a8c" Mar 18 08:44:22 crc kubenswrapper[4917]: I0318 08:44:22.773415 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:44:22 crc kubenswrapper[4917]: E0318 08:44:22.774143 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:44:35 crc kubenswrapper[4917]: I0318 08:44:35.756196 4917 generic.go:334] "Generic (PLEG): container finished" podID="56acaff3-9fc1-4c18-b49d-f2a025d5804b" containerID="921abe26d773c3ddc8ad375ba9389d6e03fb023f4ebd09d9fe62f1c7b02a69e2" exitCode=0 Mar 18 08:44:35 crc kubenswrapper[4917]: I0318 08:44:35.757685 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" event={"ID":"56acaff3-9fc1-4c18-b49d-f2a025d5804b","Type":"ContainerDied","Data":"921abe26d773c3ddc8ad375ba9389d6e03fb023f4ebd09d9fe62f1c7b02a69e2"} Mar 18 08:44:36 crc kubenswrapper[4917]: I0318 08:44:36.773375 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:44:36 crc kubenswrapper[4917]: E0318 08:44:36.774366 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.283617 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.432482 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m26qb\" (UniqueName: \"kubernetes.io/projected/56acaff3-9fc1-4c18-b49d-f2a025d5804b-kube-api-access-m26qb\") pod \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.432567 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-ssh-key-openstack-cell1\") pod \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.432891 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-inventory\") pod \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\" (UID: \"56acaff3-9fc1-4c18-b49d-f2a025d5804b\") " Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.437519 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56acaff3-9fc1-4c18-b49d-f2a025d5804b-kube-api-access-m26qb" (OuterVolumeSpecName: "kube-api-access-m26qb") pod "56acaff3-9fc1-4c18-b49d-f2a025d5804b" (UID: "56acaff3-9fc1-4c18-b49d-f2a025d5804b"). InnerVolumeSpecName "kube-api-access-m26qb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.460293 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-inventory" (OuterVolumeSpecName: "inventory") pod "56acaff3-9fc1-4c18-b49d-f2a025d5804b" (UID: "56acaff3-9fc1-4c18-b49d-f2a025d5804b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.461240 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "56acaff3-9fc1-4c18-b49d-f2a025d5804b" (UID: "56acaff3-9fc1-4c18-b49d-f2a025d5804b"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.535254 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m26qb\" (UniqueName: \"kubernetes.io/projected/56acaff3-9fc1-4c18-b49d-f2a025d5804b-kube-api-access-m26qb\") on node \"crc\" DevicePath \"\"" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.535304 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.535324 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/56acaff3-9fc1-4c18-b49d-f2a025d5804b-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.783235 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.802218 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-96ck2" event={"ID":"56acaff3-9fc1-4c18-b49d-f2a025d5804b","Type":"ContainerDied","Data":"2188d0ce94668a1f6f541e5c3cae18088edd67a96d196c7d008e68ade48b292c"} Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.802283 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2188d0ce94668a1f6f541e5c3cae18088edd67a96d196c7d008e68ade48b292c" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.886142 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-hngpl"] Mar 18 08:44:37 crc kubenswrapper[4917]: E0318 08:44:37.887141 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56acaff3-9fc1-4c18-b49d-f2a025d5804b" containerName="download-cache-openstack-openstack-cell1" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.887175 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="56acaff3-9fc1-4c18-b49d-f2a025d5804b" containerName="download-cache-openstack-openstack-cell1" Mar 18 08:44:37 crc kubenswrapper[4917]: E0318 08:44:37.887218 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a213707-ba5a-4473-8685-0933a627c94e" containerName="oc" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.887233 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a213707-ba5a-4473-8685-0933a627c94e" containerName="oc" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.887615 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="56acaff3-9fc1-4c18-b49d-f2a025d5804b" containerName="download-cache-openstack-openstack-cell1" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.887661 4917 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5a213707-ba5a-4473-8685-0933a627c94e" containerName="oc" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.890544 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.895732 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.895810 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-hngpl"] Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.896186 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.896213 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.896349 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.945276 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc685\" (UniqueName: \"kubernetes.io/projected/517c4a9b-15b8-42fd-8c75-2e4487db81ab-kube-api-access-kc685\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.945800 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-inventory\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " 
pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:37 crc kubenswrapper[4917]: I0318 08:44:37.945852 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:38 crc kubenswrapper[4917]: I0318 08:44:38.047731 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-inventory\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:38 crc kubenswrapper[4917]: I0318 08:44:38.048061 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:38 crc kubenswrapper[4917]: I0318 08:44:38.048215 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc685\" (UniqueName: \"kubernetes.io/projected/517c4a9b-15b8-42fd-8c75-2e4487db81ab-kube-api-access-kc685\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:38 crc kubenswrapper[4917]: I0318 08:44:38.053033 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:38 crc kubenswrapper[4917]: I0318 08:44:38.065466 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-inventory\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:38 crc kubenswrapper[4917]: I0318 08:44:38.084139 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc685\" (UniqueName: \"kubernetes.io/projected/517c4a9b-15b8-42fd-8c75-2e4487db81ab-kube-api-access-kc685\") pod \"configure-network-openstack-openstack-cell1-hngpl\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:38 crc kubenswrapper[4917]: I0318 08:44:38.209564 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:44:38 crc kubenswrapper[4917]: I0318 08:44:38.824986 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-hngpl"] Mar 18 08:44:39 crc kubenswrapper[4917]: I0318 08:44:39.805776 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" event={"ID":"517c4a9b-15b8-42fd-8c75-2e4487db81ab","Type":"ContainerStarted","Data":"f64b10eda44df0c9834ce69ef4415849f5d22da1f591c3965c77fd26be0c68ae"} Mar 18 08:44:39 crc kubenswrapper[4917]: I0318 08:44:39.806322 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" event={"ID":"517c4a9b-15b8-42fd-8c75-2e4487db81ab","Type":"ContainerStarted","Data":"ec0a3bd10bd6664c1b47e4b84c8482353658714f6347661949a6093eaede0e25"} Mar 18 08:44:39 crc kubenswrapper[4917]: I0318 08:44:39.827699 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" podStartSLOduration=2.390631998 podStartE2EDuration="2.82767535s" podCreationTimestamp="2026-03-18 08:44:37 +0000 UTC" firstStartedPulling="2026-03-18 08:44:38.836095805 +0000 UTC m=+7063.777250529" lastFinishedPulling="2026-03-18 08:44:39.273139127 +0000 UTC m=+7064.214293881" observedRunningTime="2026-03-18 08:44:39.825051146 +0000 UTC m=+7064.766205880" watchObservedRunningTime="2026-03-18 08:44:39.82767535 +0000 UTC m=+7064.768830074" Mar 18 08:44:51 crc kubenswrapper[4917]: I0318 08:44:51.773647 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:44:51 crc kubenswrapper[4917]: E0318 08:44:51.774694 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.163178 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb"] Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.165866 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.169782 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.169783 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.193957 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb"] Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.272728 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdxwt\" (UniqueName: \"kubernetes.io/projected/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-kube-api-access-mdxwt\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.273046 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-config-volume\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.273239 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-secret-volume\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.375839 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdxwt\" (UniqueName: \"kubernetes.io/projected/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-kube-api-access-mdxwt\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.375938 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-config-volume\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.376087 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-secret-volume\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: 
I0318 08:45:00.378415 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-config-volume\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.386928 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-secret-volume\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.403350 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdxwt\" (UniqueName: \"kubernetes.io/projected/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-kube-api-access-mdxwt\") pod \"collect-profiles-29563725-dprsb\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:00 crc kubenswrapper[4917]: I0318 08:45:00.511232 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:01 crc kubenswrapper[4917]: I0318 08:45:01.033882 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb"] Mar 18 08:45:01 crc kubenswrapper[4917]: I0318 08:45:01.074456 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" event={"ID":"dcd3de79-3341-4256-bc5f-ca8adb6fce8e","Type":"ContainerStarted","Data":"6fa40dbd6b40d1c612105db7fbafa591bf0f48f5065196d6d642fcbd18ef5dee"} Mar 18 08:45:02 crc kubenswrapper[4917]: I0318 08:45:02.092972 4917 generic.go:334] "Generic (PLEG): container finished" podID="dcd3de79-3341-4256-bc5f-ca8adb6fce8e" containerID="5737c8f5cb3c5cff4cfc88731716a806c078fb16466e2cb37711448304edf5b7" exitCode=0 Mar 18 08:45:02 crc kubenswrapper[4917]: I0318 08:45:02.093104 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" event={"ID":"dcd3de79-3341-4256-bc5f-ca8adb6fce8e","Type":"ContainerDied","Data":"5737c8f5cb3c5cff4cfc88731716a806c078fb16466e2cb37711448304edf5b7"} Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.465109 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.551132 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdxwt\" (UniqueName: \"kubernetes.io/projected/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-kube-api-access-mdxwt\") pod \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.551645 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-config-volume\") pod \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.551904 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-secret-volume\") pod \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\" (UID: \"dcd3de79-3341-4256-bc5f-ca8adb6fce8e\") " Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.552836 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-config-volume" (OuterVolumeSpecName: "config-volume") pod "dcd3de79-3341-4256-bc5f-ca8adb6fce8e" (UID: "dcd3de79-3341-4256-bc5f-ca8adb6fce8e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.558713 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dcd3de79-3341-4256-bc5f-ca8adb6fce8e" (UID: "dcd3de79-3341-4256-bc5f-ca8adb6fce8e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.560892 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-kube-api-access-mdxwt" (OuterVolumeSpecName: "kube-api-access-mdxwt") pod "dcd3de79-3341-4256-bc5f-ca8adb6fce8e" (UID: "dcd3de79-3341-4256-bc5f-ca8adb6fce8e"). InnerVolumeSpecName "kube-api-access-mdxwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.655712 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.655785 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 08:45:03 crc kubenswrapper[4917]: I0318 08:45:03.655816 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdxwt\" (UniqueName: \"kubernetes.io/projected/dcd3de79-3341-4256-bc5f-ca8adb6fce8e-kube-api-access-mdxwt\") on node \"crc\" DevicePath \"\"" Mar 18 08:45:04 crc kubenswrapper[4917]: I0318 08:45:04.122278 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" event={"ID":"dcd3de79-3341-4256-bc5f-ca8adb6fce8e","Type":"ContainerDied","Data":"6fa40dbd6b40d1c612105db7fbafa591bf0f48f5065196d6d642fcbd18ef5dee"} Mar 18 08:45:04 crc kubenswrapper[4917]: I0318 08:45:04.122673 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fa40dbd6b40d1c612105db7fbafa591bf0f48f5065196d6d642fcbd18ef5dee" Mar 18 08:45:04 crc kubenswrapper[4917]: I0318 08:45:04.122370 4917 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb" Mar 18 08:45:04 crc kubenswrapper[4917]: I0318 08:45:04.586275 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx"] Mar 18 08:45:04 crc kubenswrapper[4917]: I0318 08:45:04.596559 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563680-jk8mx"] Mar 18 08:45:05 crc kubenswrapper[4917]: I0318 08:45:05.792100 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713196bd-eb4e-42a5-b17e-265566a93719" path="/var/lib/kubelet/pods/713196bd-eb4e-42a5-b17e-265566a93719/volumes" Mar 18 08:45:06 crc kubenswrapper[4917]: I0318 08:45:06.773353 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:45:06 crc kubenswrapper[4917]: E0318 08:45:06.773780 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:45:14 crc kubenswrapper[4917]: I0318 08:45:14.465791 4917 scope.go:117] "RemoveContainer" containerID="a57c4f67a3b1fce162b3fe2bcd969d8a103a309c852e882cfab522ce663ced76" Mar 18 08:45:20 crc kubenswrapper[4917]: I0318 08:45:20.784389 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:45:20 crc kubenswrapper[4917]: E0318 08:45:20.785486 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:45:33 crc kubenswrapper[4917]: I0318 08:45:33.773072 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:45:34 crc kubenswrapper[4917]: I0318 08:45:34.464108 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"004af87b0589146355065aa18f9d5186b2677b48811e1c447f8ebdd857aa020a"} Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.159632 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563726-28h2f"] Mar 18 08:46:00 crc kubenswrapper[4917]: E0318 08:46:00.160551 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd3de79-3341-4256-bc5f-ca8adb6fce8e" containerName="collect-profiles" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.160564 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd3de79-3341-4256-bc5f-ca8adb6fce8e" containerName="collect-profiles" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.160784 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd3de79-3341-4256-bc5f-ca8adb6fce8e" containerName="collect-profiles" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.161445 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563726-28h2f" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.165688 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.166737 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.167815 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563726-28h2f"] Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.169751 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.230127 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65s5p\" (UniqueName: \"kubernetes.io/projected/b71d0fd7-116c-484e-9045-8f69d6907a6c-kube-api-access-65s5p\") pod \"auto-csr-approver-29563726-28h2f\" (UID: \"b71d0fd7-116c-484e-9045-8f69d6907a6c\") " pod="openshift-infra/auto-csr-approver-29563726-28h2f" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.332135 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65s5p\" (UniqueName: \"kubernetes.io/projected/b71d0fd7-116c-484e-9045-8f69d6907a6c-kube-api-access-65s5p\") pod \"auto-csr-approver-29563726-28h2f\" (UID: \"b71d0fd7-116c-484e-9045-8f69d6907a6c\") " pod="openshift-infra/auto-csr-approver-29563726-28h2f" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.354143 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65s5p\" (UniqueName: \"kubernetes.io/projected/b71d0fd7-116c-484e-9045-8f69d6907a6c-kube-api-access-65s5p\") pod \"auto-csr-approver-29563726-28h2f\" (UID: \"b71d0fd7-116c-484e-9045-8f69d6907a6c\") " 
pod="openshift-infra/auto-csr-approver-29563726-28h2f" Mar 18 08:46:00 crc kubenswrapper[4917]: I0318 08:46:00.485701 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563726-28h2f" Mar 18 08:46:01 crc kubenswrapper[4917]: I0318 08:46:01.014972 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563726-28h2f"] Mar 18 08:46:01 crc kubenswrapper[4917]: W0318 08:46:01.023223 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb71d0fd7_116c_484e_9045_8f69d6907a6c.slice/crio-73388d74fb02213c92a3eb4840b20b241428c90bb8e06cfe18c5d5e498e47f97 WatchSource:0}: Error finding container 73388d74fb02213c92a3eb4840b20b241428c90bb8e06cfe18c5d5e498e47f97: Status 404 returned error can't find the container with id 73388d74fb02213c92a3eb4840b20b241428c90bb8e06cfe18c5d5e498e47f97 Mar 18 08:46:01 crc kubenswrapper[4917]: I0318 08:46:01.776286 4917 generic.go:334] "Generic (PLEG): container finished" podID="517c4a9b-15b8-42fd-8c75-2e4487db81ab" containerID="f64b10eda44df0c9834ce69ef4415849f5d22da1f591c3965c77fd26be0c68ae" exitCode=0 Mar 18 08:46:01 crc kubenswrapper[4917]: I0318 08:46:01.794785 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563726-28h2f" event={"ID":"b71d0fd7-116c-484e-9045-8f69d6907a6c","Type":"ContainerStarted","Data":"73388d74fb02213c92a3eb4840b20b241428c90bb8e06cfe18c5d5e498e47f97"} Mar 18 08:46:01 crc kubenswrapper[4917]: I0318 08:46:01.794847 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" event={"ID":"517c4a9b-15b8-42fd-8c75-2e4487db81ab","Type":"ContainerDied","Data":"f64b10eda44df0c9834ce69ef4415849f5d22da1f591c3965c77fd26be0c68ae"} Mar 18 08:46:02 crc kubenswrapper[4917]: I0318 08:46:02.792204 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="b71d0fd7-116c-484e-9045-8f69d6907a6c" containerID="56e281aae2ad6cc106e8b204dc7eaac0a818a88dede8de091f60f8d64b3d320c" exitCode=0 Mar 18 08:46:02 crc kubenswrapper[4917]: I0318 08:46:02.792801 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563726-28h2f" event={"ID":"b71d0fd7-116c-484e-9045-8f69d6907a6c","Type":"ContainerDied","Data":"56e281aae2ad6cc106e8b204dc7eaac0a818a88dede8de091f60f8d64b3d320c"} Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.269931 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.407806 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-inventory\") pod \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.408128 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc685\" (UniqueName: \"kubernetes.io/projected/517c4a9b-15b8-42fd-8c75-2e4487db81ab-kube-api-access-kc685\") pod \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.408162 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-ssh-key-openstack-cell1\") pod \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\" (UID: \"517c4a9b-15b8-42fd-8c75-2e4487db81ab\") " Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.416221 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517c4a9b-15b8-42fd-8c75-2e4487db81ab-kube-api-access-kc685" (OuterVolumeSpecName: 
"kube-api-access-kc685") pod "517c4a9b-15b8-42fd-8c75-2e4487db81ab" (UID: "517c4a9b-15b8-42fd-8c75-2e4487db81ab"). InnerVolumeSpecName "kube-api-access-kc685". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.448570 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-inventory" (OuterVolumeSpecName: "inventory") pod "517c4a9b-15b8-42fd-8c75-2e4487db81ab" (UID: "517c4a9b-15b8-42fd-8c75-2e4487db81ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.466891 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "517c4a9b-15b8-42fd-8c75-2e4487db81ab" (UID: "517c4a9b-15b8-42fd-8c75-2e4487db81ab"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.511806 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc685\" (UniqueName: \"kubernetes.io/projected/517c4a9b-15b8-42fd-8c75-2e4487db81ab-kube-api-access-kc685\") on node \"crc\" DevicePath \"\"" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.511877 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.511906 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/517c4a9b-15b8-42fd-8c75-2e4487db81ab-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.811484 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" event={"ID":"517c4a9b-15b8-42fd-8c75-2e4487db81ab","Type":"ContainerDied","Data":"ec0a3bd10bd6664c1b47e4b84c8482353658714f6347661949a6093eaede0e25"} Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.811557 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec0a3bd10bd6664c1b47e4b84c8482353658714f6347661949a6093eaede0e25" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.811510 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-hngpl" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.944189 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-xhxnc"] Mar 18 08:46:03 crc kubenswrapper[4917]: E0318 08:46:03.945772 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517c4a9b-15b8-42fd-8c75-2e4487db81ab" containerName="configure-network-openstack-openstack-cell1" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.945795 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="517c4a9b-15b8-42fd-8c75-2e4487db81ab" containerName="configure-network-openstack-openstack-cell1" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.946738 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="517c4a9b-15b8-42fd-8c75-2e4487db81ab" containerName="configure-network-openstack-openstack-cell1" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.949084 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.952246 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.952577 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.954821 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.954989 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:46:03 crc kubenswrapper[4917]: I0318 08:46:03.970869 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-xhxnc"] Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.028462 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.030079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-inventory\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.030448 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm7rb\" (UniqueName: \"kubernetes.io/projected/634f496b-b740-4ed3-9dac-45ec0afdbb16-kube-api-access-fm7rb\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.132896 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-inventory\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.133025 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm7rb\" (UniqueName: \"kubernetes.io/projected/634f496b-b740-4ed3-9dac-45ec0afdbb16-kube-api-access-fm7rb\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.133102 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.137321 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-inventory\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: 
\"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.154998 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.158841 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm7rb\" (UniqueName: \"kubernetes.io/projected/634f496b-b740-4ed3-9dac-45ec0afdbb16-kube-api-access-fm7rb\") pod \"validate-network-openstack-openstack-cell1-xhxnc\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.240372 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563726-28h2f" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.291294 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.336818 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65s5p\" (UniqueName: \"kubernetes.io/projected/b71d0fd7-116c-484e-9045-8f69d6907a6c-kube-api-access-65s5p\") pod \"b71d0fd7-116c-484e-9045-8f69d6907a6c\" (UID: \"b71d0fd7-116c-484e-9045-8f69d6907a6c\") " Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.341864 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b71d0fd7-116c-484e-9045-8f69d6907a6c-kube-api-access-65s5p" (OuterVolumeSpecName: "kube-api-access-65s5p") pod "b71d0fd7-116c-484e-9045-8f69d6907a6c" (UID: "b71d0fd7-116c-484e-9045-8f69d6907a6c"). InnerVolumeSpecName "kube-api-access-65s5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.440930 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65s5p\" (UniqueName: \"kubernetes.io/projected/b71d0fd7-116c-484e-9045-8f69d6907a6c-kube-api-access-65s5p\") on node \"crc\" DevicePath \"\"" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.873107 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563726-28h2f" event={"ID":"b71d0fd7-116c-484e-9045-8f69d6907a6c","Type":"ContainerDied","Data":"73388d74fb02213c92a3eb4840b20b241428c90bb8e06cfe18c5d5e498e47f97"} Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.873437 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73388d74fb02213c92a3eb4840b20b241428c90bb8e06cfe18c5d5e498e47f97" Mar 18 08:46:04 crc kubenswrapper[4917]: I0318 08:46:04.873519 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563726-28h2f" Mar 18 08:46:05 crc kubenswrapper[4917]: I0318 08:46:05.012245 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-xhxnc"] Mar 18 08:46:05 crc kubenswrapper[4917]: I0318 08:46:05.317436 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563720-l587q"] Mar 18 08:46:05 crc kubenswrapper[4917]: I0318 08:46:05.330079 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563720-l587q"] Mar 18 08:46:05 crc kubenswrapper[4917]: I0318 08:46:05.788439 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="050aec6a-c49d-41d0-b42e-d8f5e4564bde" path="/var/lib/kubelet/pods/050aec6a-c49d-41d0-b42e-d8f5e4564bde/volumes" Mar 18 08:46:05 crc kubenswrapper[4917]: I0318 08:46:05.885730 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" event={"ID":"634f496b-b740-4ed3-9dac-45ec0afdbb16","Type":"ContainerStarted","Data":"72c77b374f2c5e8d80df1509e85c47f0899e89e3a0312e198b580e5d61eca259"} Mar 18 08:46:05 crc kubenswrapper[4917]: I0318 08:46:05.886738 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" event={"ID":"634f496b-b740-4ed3-9dac-45ec0afdbb16","Type":"ContainerStarted","Data":"1a376e3926b0b55797568f1a32823bafed5a231ed9cabc2b9c655c1d76a98b2c"} Mar 18 08:46:05 crc kubenswrapper[4917]: I0318 08:46:05.913792 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" podStartSLOduration=2.2974653959999998 podStartE2EDuration="2.913769198s" podCreationTimestamp="2026-03-18 08:46:03 +0000 UTC" firstStartedPulling="2026-03-18 08:46:05.024027066 +0000 UTC m=+7149.965181780" lastFinishedPulling="2026-03-18 08:46:05.640330868 +0000 
UTC m=+7150.581485582" observedRunningTime="2026-03-18 08:46:05.907472324 +0000 UTC m=+7150.848627058" watchObservedRunningTime="2026-03-18 08:46:05.913769198 +0000 UTC m=+7150.854923912" Mar 18 08:46:10 crc kubenswrapper[4917]: I0318 08:46:10.946766 4917 generic.go:334] "Generic (PLEG): container finished" podID="634f496b-b740-4ed3-9dac-45ec0afdbb16" containerID="72c77b374f2c5e8d80df1509e85c47f0899e89e3a0312e198b580e5d61eca259" exitCode=0 Mar 18 08:46:10 crc kubenswrapper[4917]: I0318 08:46:10.946868 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" event={"ID":"634f496b-b740-4ed3-9dac-45ec0afdbb16","Type":"ContainerDied","Data":"72c77b374f2c5e8d80df1509e85c47f0899e89e3a0312e198b580e5d61eca259"} Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.350749 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.427875 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-inventory\") pod \"634f496b-b740-4ed3-9dac-45ec0afdbb16\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.428232 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm7rb\" (UniqueName: \"kubernetes.io/projected/634f496b-b740-4ed3-9dac-45ec0afdbb16-kube-api-access-fm7rb\") pod \"634f496b-b740-4ed3-9dac-45ec0afdbb16\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.428430 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-ssh-key-openstack-cell1\") pod 
\"634f496b-b740-4ed3-9dac-45ec0afdbb16\" (UID: \"634f496b-b740-4ed3-9dac-45ec0afdbb16\") " Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.434573 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634f496b-b740-4ed3-9dac-45ec0afdbb16-kube-api-access-fm7rb" (OuterVolumeSpecName: "kube-api-access-fm7rb") pod "634f496b-b740-4ed3-9dac-45ec0afdbb16" (UID: "634f496b-b740-4ed3-9dac-45ec0afdbb16"). InnerVolumeSpecName "kube-api-access-fm7rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.457933 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "634f496b-b740-4ed3-9dac-45ec0afdbb16" (UID: "634f496b-b740-4ed3-9dac-45ec0afdbb16"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.463151 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-inventory" (OuterVolumeSpecName: "inventory") pod "634f496b-b740-4ed3-9dac-45ec0afdbb16" (UID: "634f496b-b740-4ed3-9dac-45ec0afdbb16"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.531462 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.531502 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fm7rb\" (UniqueName: \"kubernetes.io/projected/634f496b-b740-4ed3-9dac-45ec0afdbb16-kube-api-access-fm7rb\") on node \"crc\" DevicePath \"\"" Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.531516 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/634f496b-b740-4ed3-9dac-45ec0afdbb16-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.972555 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" event={"ID":"634f496b-b740-4ed3-9dac-45ec0afdbb16","Type":"ContainerDied","Data":"1a376e3926b0b55797568f1a32823bafed5a231ed9cabc2b9c655c1d76a98b2c"} Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.972898 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a376e3926b0b55797568f1a32823bafed5a231ed9cabc2b9c655c1d76a98b2c" Mar 18 08:46:12 crc kubenswrapper[4917]: I0318 08:46:12.973087 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-xhxnc" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.101130 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tvtpj"] Mar 18 08:46:13 crc kubenswrapper[4917]: E0318 08:46:13.101718 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b71d0fd7-116c-484e-9045-8f69d6907a6c" containerName="oc" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.101750 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="b71d0fd7-116c-484e-9045-8f69d6907a6c" containerName="oc" Mar 18 08:46:13 crc kubenswrapper[4917]: E0318 08:46:13.101839 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634f496b-b740-4ed3-9dac-45ec0afdbb16" containerName="validate-network-openstack-openstack-cell1" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.101856 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="634f496b-b740-4ed3-9dac-45ec0afdbb16" containerName="validate-network-openstack-openstack-cell1" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.102135 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b71d0fd7-116c-484e-9045-8f69d6907a6c" containerName="oc" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.102169 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="634f496b-b740-4ed3-9dac-45ec0afdbb16" containerName="validate-network-openstack-openstack-cell1" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.103230 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.107258 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.108088 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.108368 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.108642 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.124248 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tvtpj"] Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.145039 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kcpt\" (UniqueName: \"kubernetes.io/projected/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-kube-api-access-2kcpt\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.145145 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.145616 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-inventory\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.247502 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-inventory\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.247933 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kcpt\" (UniqueName: \"kubernetes.io/projected/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-kube-api-access-2kcpt\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.247976 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.253473 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " 
pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.254836 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-inventory\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.269222 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kcpt\" (UniqueName: \"kubernetes.io/projected/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-kube-api-access-2kcpt\") pod \"install-os-openstack-openstack-cell1-tvtpj\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:13 crc kubenswrapper[4917]: I0318 08:46:13.426803 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:46:14 crc kubenswrapper[4917]: I0318 08:46:14.046690 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-tvtpj"] Mar 18 08:46:14 crc kubenswrapper[4917]: I0318 08:46:14.562182 4917 scope.go:117] "RemoveContainer" containerID="93cb51db0190ff6873f64edad6e487bcd587d3e42e019c7e2e6081b6d02e99c8" Mar 18 08:46:15 crc kubenswrapper[4917]: I0318 08:46:15.005088 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" event={"ID":"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5","Type":"ContainerStarted","Data":"3a7a187b85ebffd6279cfb0ce3931e7c5a0e6ac7dfaba81c9be4d05c201b64f9"} Mar 18 08:46:15 crc kubenswrapper[4917]: I0318 08:46:15.005476 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" 
event={"ID":"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5","Type":"ContainerStarted","Data":"92ff711896d5e432477921ed53ff0c0bb774056605d492aa5b61a6a08e09b6ff"} Mar 18 08:46:15 crc kubenswrapper[4917]: I0318 08:46:15.044285 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" podStartSLOduration=1.5838120999999998 podStartE2EDuration="2.04425779s" podCreationTimestamp="2026-03-18 08:46:13 +0000 UTC" firstStartedPulling="2026-03-18 08:46:14.05412642 +0000 UTC m=+7158.995281124" lastFinishedPulling="2026-03-18 08:46:14.51457206 +0000 UTC m=+7159.455726814" observedRunningTime="2026-03-18 08:46:15.034762979 +0000 UTC m=+7159.975917713" watchObservedRunningTime="2026-03-18 08:46:15.04425779 +0000 UTC m=+7159.985412534" Mar 18 08:46:46 crc kubenswrapper[4917]: I0318 08:46:46.959015 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d5z67"] Mar 18 08:46:46 crc kubenswrapper[4917]: I0318 08:46:46.962678 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:46 crc kubenswrapper[4917]: I0318 08:46:46.974535 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-catalog-content\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:46 crc kubenswrapper[4917]: I0318 08:46:46.974866 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-utilities\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:46 crc kubenswrapper[4917]: I0318 08:46:46.975119 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbnmz\" (UniqueName: \"kubernetes.io/projected/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-kube-api-access-sbnmz\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:46 crc kubenswrapper[4917]: I0318 08:46:46.975346 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5z67"] Mar 18 08:46:47 crc kubenswrapper[4917]: I0318 08:46:47.077118 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-utilities\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:47 crc kubenswrapper[4917]: I0318 08:46:47.077237 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sbnmz\" (UniqueName: \"kubernetes.io/projected/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-kube-api-access-sbnmz\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:47 crc kubenswrapper[4917]: I0318 08:46:47.077482 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-catalog-content\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:47 crc kubenswrapper[4917]: I0318 08:46:47.078001 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-catalog-content\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:47 crc kubenswrapper[4917]: I0318 08:46:47.078032 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-utilities\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:47 crc kubenswrapper[4917]: I0318 08:46:47.096773 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbnmz\" (UniqueName: \"kubernetes.io/projected/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-kube-api-access-sbnmz\") pod \"community-operators-d5z67\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:47 crc kubenswrapper[4917]: I0318 08:46:47.287247 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:47 crc kubenswrapper[4917]: I0318 08:46:47.904904 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d5z67"] Mar 18 08:46:47 crc kubenswrapper[4917]: W0318 08:46:47.905631 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab9bed88_d3a9_4cbc_ab18_6afcf2d7a86d.slice/crio-eb7942500e5d443a108185ca889765aacce8f39557fd305da1027c793f472f65 WatchSource:0}: Error finding container eb7942500e5d443a108185ca889765aacce8f39557fd305da1027c793f472f65: Status 404 returned error can't find the container with id eb7942500e5d443a108185ca889765aacce8f39557fd305da1027c793f472f65 Mar 18 08:46:48 crc kubenswrapper[4917]: I0318 08:46:48.385221 4917 generic.go:334] "Generic (PLEG): container finished" podID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerID="a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe" exitCode=0 Mar 18 08:46:48 crc kubenswrapper[4917]: I0318 08:46:48.385320 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5z67" event={"ID":"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d","Type":"ContainerDied","Data":"a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe"} Mar 18 08:46:48 crc kubenswrapper[4917]: I0318 08:46:48.385632 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5z67" event={"ID":"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d","Type":"ContainerStarted","Data":"eb7942500e5d443a108185ca889765aacce8f39557fd305da1027c793f472f65"} Mar 18 08:46:51 crc kubenswrapper[4917]: I0318 08:46:51.416209 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5z67" 
event={"ID":"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d","Type":"ContainerStarted","Data":"87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7"} Mar 18 08:46:53 crc kubenswrapper[4917]: I0318 08:46:53.443274 4917 generic.go:334] "Generic (PLEG): container finished" podID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerID="87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7" exitCode=0 Mar 18 08:46:53 crc kubenswrapper[4917]: I0318 08:46:53.443833 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5z67" event={"ID":"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d","Type":"ContainerDied","Data":"87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7"} Mar 18 08:46:54 crc kubenswrapper[4917]: I0318 08:46:54.454415 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5z67" event={"ID":"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d","Type":"ContainerStarted","Data":"c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b"} Mar 18 08:46:54 crc kubenswrapper[4917]: I0318 08:46:54.470775 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d5z67" podStartSLOduration=2.990804558 podStartE2EDuration="8.470755602s" podCreationTimestamp="2026-03-18 08:46:46 +0000 UTC" firstStartedPulling="2026-03-18 08:46:48.389375516 +0000 UTC m=+7193.330530270" lastFinishedPulling="2026-03-18 08:46:53.8693266 +0000 UTC m=+7198.810481314" observedRunningTime="2026-03-18 08:46:54.46944965 +0000 UTC m=+7199.410604384" watchObservedRunningTime="2026-03-18 08:46:54.470755602 +0000 UTC m=+7199.411910336" Mar 18 08:46:57 crc kubenswrapper[4917]: I0318 08:46:57.287577 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:57 crc kubenswrapper[4917]: I0318 08:46:57.288146 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:46:57 crc kubenswrapper[4917]: I0318 08:46:57.346266 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:47:00 crc kubenswrapper[4917]: I0318 08:47:00.519155 4917 generic.go:334] "Generic (PLEG): container finished" podID="d4d1cb7a-8c3a-4932-a94a-03038c30b8e5" containerID="3a7a187b85ebffd6279cfb0ce3931e7c5a0e6ac7dfaba81c9be4d05c201b64f9" exitCode=0 Mar 18 08:47:00 crc kubenswrapper[4917]: I0318 08:47:00.519269 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" event={"ID":"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5","Type":"ContainerDied","Data":"3a7a187b85ebffd6279cfb0ce3931e7c5a0e6ac7dfaba81c9be4d05c201b64f9"} Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.000423 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.121276 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-inventory\") pod \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.121413 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kcpt\" (UniqueName: \"kubernetes.io/projected/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-kube-api-access-2kcpt\") pod \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.121642 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-ssh-key-openstack-cell1\") pod \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\" (UID: \"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5\") " Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.127445 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-kube-api-access-2kcpt" (OuterVolumeSpecName: "kube-api-access-2kcpt") pod "d4d1cb7a-8c3a-4932-a94a-03038c30b8e5" (UID: "d4d1cb7a-8c3a-4932-a94a-03038c30b8e5"). InnerVolumeSpecName "kube-api-access-2kcpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.161340 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-inventory" (OuterVolumeSpecName: "inventory") pod "d4d1cb7a-8c3a-4932-a94a-03038c30b8e5" (UID: "d4d1cb7a-8c3a-4932-a94a-03038c30b8e5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.161809 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d4d1cb7a-8c3a-4932-a94a-03038c30b8e5" (UID: "d4d1cb7a-8c3a-4932-a94a-03038c30b8e5"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.225002 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.225056 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kcpt\" (UniqueName: \"kubernetes.io/projected/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-kube-api-access-2kcpt\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.225074 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d4d1cb7a-8c3a-4932-a94a-03038c30b8e5-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.549372 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" event={"ID":"d4d1cb7a-8c3a-4932-a94a-03038c30b8e5","Type":"ContainerDied","Data":"92ff711896d5e432477921ed53ff0c0bb774056605d492aa5b61a6a08e09b6ff"} Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.549692 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92ff711896d5e432477921ed53ff0c0bb774056605d492aa5b61a6a08e09b6ff" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.549528 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-tvtpj" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.665663 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-hgp6d"] Mar 18 08:47:02 crc kubenswrapper[4917]: E0318 08:47:02.666125 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d1cb7a-8c3a-4932-a94a-03038c30b8e5" containerName="install-os-openstack-openstack-cell1" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.666149 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d1cb7a-8c3a-4932-a94a-03038c30b8e5" containerName="install-os-openstack-openstack-cell1" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.666412 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d1cb7a-8c3a-4932-a94a-03038c30b8e5" containerName="install-os-openstack-openstack-cell1" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.667272 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.671132 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.671378 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.671534 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.671413 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.682320 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-hgp6d"] Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.765607 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d7pz5"] Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.769571 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.779812 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7pz5"] Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.843856 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-inventory\") pod \"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.843924 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.843984 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j84cd\" (UniqueName: \"kubernetes.io/projected/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-kube-api-access-j84cd\") pod \"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.945870 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-utilities\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 
08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.946057 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-inventory\") pod \"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.946185 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-catalog-content\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.946930 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl9vm\" (UniqueName: \"kubernetes.io/projected/c5cad219-229f-4974-803f-8e4541430cad-kube-api-access-wl9vm\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.947522 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.947656 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j84cd\" (UniqueName: \"kubernetes.io/projected/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-kube-api-access-j84cd\") pod 
\"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.949876 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-inventory\") pod \"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.950291 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:02 crc kubenswrapper[4917]: I0318 08:47:02.966387 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j84cd\" (UniqueName: \"kubernetes.io/projected/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-kube-api-access-j84cd\") pod \"configure-os-openstack-openstack-cell1-hgp6d\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.028961 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.050109 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-catalog-content\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.050182 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl9vm\" (UniqueName: \"kubernetes.io/projected/c5cad219-229f-4974-803f-8e4541430cad-kube-api-access-wl9vm\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.050367 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-utilities\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.051049 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-utilities\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.051132 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-catalog-content\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " 
pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.072772 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl9vm\" (UniqueName: \"kubernetes.io/projected/c5cad219-229f-4974-803f-8e4541430cad-kube-api-access-wl9vm\") pod \"redhat-marketplace-d7pz5\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.100668 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.584423 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-hgp6d"] Mar 18 08:47:03 crc kubenswrapper[4917]: W0318 08:47:03.589885 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7a34bdf_e873_4224_aff7_f81e8d05c6d4.slice/crio-5eb5bc0828635c6300cb3687833287af5a14c4d11be1094426a18357018bd7ab WatchSource:0}: Error finding container 5eb5bc0828635c6300cb3687833287af5a14c4d11be1094426a18357018bd7ab: Status 404 returned error can't find the container with id 5eb5bc0828635c6300cb3687833287af5a14c4d11be1094426a18357018bd7ab Mar 18 08:47:03 crc kubenswrapper[4917]: I0318 08:47:03.611246 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7pz5"] Mar 18 08:47:03 crc kubenswrapper[4917]: W0318 08:47:03.614150 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5cad219_229f_4974_803f_8e4541430cad.slice/crio-1ddcca2139ba01cd87e8fe6c4f1cff8b1ef77c8f9fec7768012970dfb4e11bbd WatchSource:0}: Error finding container 1ddcca2139ba01cd87e8fe6c4f1cff8b1ef77c8f9fec7768012970dfb4e11bbd: Status 404 returned error can't find the 
container with id 1ddcca2139ba01cd87e8fe6c4f1cff8b1ef77c8f9fec7768012970dfb4e11bbd Mar 18 08:47:04 crc kubenswrapper[4917]: I0318 08:47:04.575132 4917 generic.go:334] "Generic (PLEG): container finished" podID="c5cad219-229f-4974-803f-8e4541430cad" containerID="d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8" exitCode=0 Mar 18 08:47:04 crc kubenswrapper[4917]: I0318 08:47:04.575251 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7pz5" event={"ID":"c5cad219-229f-4974-803f-8e4541430cad","Type":"ContainerDied","Data":"d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8"} Mar 18 08:47:04 crc kubenswrapper[4917]: I0318 08:47:04.575813 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7pz5" event={"ID":"c5cad219-229f-4974-803f-8e4541430cad","Type":"ContainerStarted","Data":"1ddcca2139ba01cd87e8fe6c4f1cff8b1ef77c8f9fec7768012970dfb4e11bbd"} Mar 18 08:47:04 crc kubenswrapper[4917]: I0318 08:47:04.580736 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" event={"ID":"a7a34bdf-e873-4224-aff7-f81e8d05c6d4","Type":"ContainerStarted","Data":"5290f07537c9b79a0b5a02e8d10c502b7be8ecbf446f8c0df5f9a9abaf64b347"} Mar 18 08:47:04 crc kubenswrapper[4917]: I0318 08:47:04.580817 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" event={"ID":"a7a34bdf-e873-4224-aff7-f81e8d05c6d4","Type":"ContainerStarted","Data":"5eb5bc0828635c6300cb3687833287af5a14c4d11be1094426a18357018bd7ab"} Mar 18 08:47:04 crc kubenswrapper[4917]: I0318 08:47:04.659983 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" podStartSLOduration=2.261142255 podStartE2EDuration="2.659958628s" podCreationTimestamp="2026-03-18 08:47:02 +0000 UTC" firstStartedPulling="2026-03-18 
08:47:03.594425049 +0000 UTC m=+7208.535579763" lastFinishedPulling="2026-03-18 08:47:03.993241372 +0000 UTC m=+7208.934396136" observedRunningTime="2026-03-18 08:47:04.638870686 +0000 UTC m=+7209.580025410" watchObservedRunningTime="2026-03-18 08:47:04.659958628 +0000 UTC m=+7209.601113362" Mar 18 08:47:05 crc kubenswrapper[4917]: I0318 08:47:05.595758 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7pz5" event={"ID":"c5cad219-229f-4974-803f-8e4541430cad","Type":"ContainerStarted","Data":"725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd"} Mar 18 08:47:06 crc kubenswrapper[4917]: I0318 08:47:06.613904 4917 generic.go:334] "Generic (PLEG): container finished" podID="c5cad219-229f-4974-803f-8e4541430cad" containerID="725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd" exitCode=0 Mar 18 08:47:06 crc kubenswrapper[4917]: I0318 08:47:06.613998 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7pz5" event={"ID":"c5cad219-229f-4974-803f-8e4541430cad","Type":"ContainerDied","Data":"725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd"} Mar 18 08:47:07 crc kubenswrapper[4917]: I0318 08:47:07.380396 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:47:07 crc kubenswrapper[4917]: I0318 08:47:07.632686 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7pz5" event={"ID":"c5cad219-229f-4974-803f-8e4541430cad","Type":"ContainerStarted","Data":"6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632"} Mar 18 08:47:07 crc kubenswrapper[4917]: I0318 08:47:07.667119 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d7pz5" podStartSLOduration=3.255771284 podStartE2EDuration="5.667090726s" podCreationTimestamp="2026-03-18 
08:47:02 +0000 UTC" firstStartedPulling="2026-03-18 08:47:04.577877275 +0000 UTC m=+7209.519032019" lastFinishedPulling="2026-03-18 08:47:06.989196737 +0000 UTC m=+7211.930351461" observedRunningTime="2026-03-18 08:47:07.650040291 +0000 UTC m=+7212.591195015" watchObservedRunningTime="2026-03-18 08:47:07.667090726 +0000 UTC m=+7212.608245470" Mar 18 08:47:09 crc kubenswrapper[4917]: I0318 08:47:09.726407 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d5z67"] Mar 18 08:47:09 crc kubenswrapper[4917]: I0318 08:47:09.727152 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-d5z67" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerName="registry-server" containerID="cri-o://c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b" gracePeriod=2 Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.231712 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.322497 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-utilities\") pod \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.323007 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbnmz\" (UniqueName: \"kubernetes.io/projected/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-kube-api-access-sbnmz\") pod \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.323168 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-catalog-content\") pod \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\" (UID: \"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d\") " Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.323243 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-utilities" (OuterVolumeSpecName: "utilities") pod "ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" (UID: "ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.324094 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.334916 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-kube-api-access-sbnmz" (OuterVolumeSpecName: "kube-api-access-sbnmz") pod "ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" (UID: "ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d"). InnerVolumeSpecName "kube-api-access-sbnmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.392292 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" (UID: "ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.426011 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbnmz\" (UniqueName: \"kubernetes.io/projected/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-kube-api-access-sbnmz\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.426187 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.685844 4917 generic.go:334] "Generic (PLEG): container finished" podID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerID="c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b" exitCode=0 Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.685897 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5z67" event={"ID":"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d","Type":"ContainerDied","Data":"c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b"} Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.685945 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d5z67" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.685969 4917 scope.go:117] "RemoveContainer" containerID="c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.685955 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d5z67" event={"ID":"ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d","Type":"ContainerDied","Data":"eb7942500e5d443a108185ca889765aacce8f39557fd305da1027c793f472f65"} Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.721153 4917 scope.go:117] "RemoveContainer" containerID="87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.741615 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-d5z67"] Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.753027 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-d5z67"] Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.773362 4917 scope.go:117] "RemoveContainer" containerID="a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.832184 4917 scope.go:117] "RemoveContainer" containerID="c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b" Mar 18 08:47:10 crc kubenswrapper[4917]: E0318 08:47:10.832612 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b\": container with ID starting with c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b not found: ID does not exist" containerID="c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.832650 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b"} err="failed to get container status \"c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b\": rpc error: code = NotFound desc = could not find container \"c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b\": container with ID starting with c495b1df00f3a09a676fb2102037fb92b12434433f464cf351811df607f3867b not found: ID does not exist" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.832671 4917 scope.go:117] "RemoveContainer" containerID="87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7" Mar 18 08:47:10 crc kubenswrapper[4917]: E0318 08:47:10.833137 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7\": container with ID starting with 87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7 not found: ID does not exist" containerID="87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.833358 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7"} err="failed to get container status \"87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7\": rpc error: code = NotFound desc = could not find container \"87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7\": container with ID starting with 87645a7e75261c5f3e17918eac8efd08eec5c91c958b83a6d80378c74810f3e7 not found: ID does not exist" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.833637 4917 scope.go:117] "RemoveContainer" containerID="a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe" Mar 18 08:47:10 crc kubenswrapper[4917]: E0318 
08:47:10.834350 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe\": container with ID starting with a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe not found: ID does not exist" containerID="a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe" Mar 18 08:47:10 crc kubenswrapper[4917]: I0318 08:47:10.834380 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe"} err="failed to get container status \"a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe\": rpc error: code = NotFound desc = could not find container \"a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe\": container with ID starting with a6f980f84a42f9e160ffb0a7e97124ece6a5819db06c9af92777588bae7855fe not found: ID does not exist" Mar 18 08:47:11 crc kubenswrapper[4917]: I0318 08:47:11.793898 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" path="/var/lib/kubelet/pods/ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d/volumes" Mar 18 08:47:13 crc kubenswrapper[4917]: I0318 08:47:13.101533 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:13 crc kubenswrapper[4917]: I0318 08:47:13.102343 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:13 crc kubenswrapper[4917]: I0318 08:47:13.180458 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:13 crc kubenswrapper[4917]: I0318 08:47:13.810298 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:14 crc kubenswrapper[4917]: I0318 08:47:14.334393 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7pz5"] Mar 18 08:47:15 crc kubenswrapper[4917]: I0318 08:47:15.751058 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d7pz5" podUID="c5cad219-229f-4974-803f-8e4541430cad" containerName="registry-server" containerID="cri-o://6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632" gracePeriod=2 Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.209653 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.373278 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl9vm\" (UniqueName: \"kubernetes.io/projected/c5cad219-229f-4974-803f-8e4541430cad-kube-api-access-wl9vm\") pod \"c5cad219-229f-4974-803f-8e4541430cad\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.373650 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-utilities\") pod \"c5cad219-229f-4974-803f-8e4541430cad\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.373839 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-catalog-content\") pod \"c5cad219-229f-4974-803f-8e4541430cad\" (UID: \"c5cad219-229f-4974-803f-8e4541430cad\") " Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.375189 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-utilities" (OuterVolumeSpecName: "utilities") pod "c5cad219-229f-4974-803f-8e4541430cad" (UID: "c5cad219-229f-4974-803f-8e4541430cad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.388819 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cad219-229f-4974-803f-8e4541430cad-kube-api-access-wl9vm" (OuterVolumeSpecName: "kube-api-access-wl9vm") pod "c5cad219-229f-4974-803f-8e4541430cad" (UID: "c5cad219-229f-4974-803f-8e4541430cad"). InnerVolumeSpecName "kube-api-access-wl9vm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.409082 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5cad219-229f-4974-803f-8e4541430cad" (UID: "c5cad219-229f-4974-803f-8e4541430cad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.475709 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl9vm\" (UniqueName: \"kubernetes.io/projected/c5cad219-229f-4974-803f-8e4541430cad-kube-api-access-wl9vm\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.475746 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.475760 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5cad219-229f-4974-803f-8e4541430cad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.762669 4917 generic.go:334] "Generic (PLEG): container finished" podID="c5cad219-229f-4974-803f-8e4541430cad" containerID="6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632" exitCode=0 Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.762727 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7pz5" event={"ID":"c5cad219-229f-4974-803f-8e4541430cad","Type":"ContainerDied","Data":"6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632"} Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.762762 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d7pz5" event={"ID":"c5cad219-229f-4974-803f-8e4541430cad","Type":"ContainerDied","Data":"1ddcca2139ba01cd87e8fe6c4f1cff8b1ef77c8f9fec7768012970dfb4e11bbd"} Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.762783 4917 scope.go:117] "RemoveContainer" containerID="6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 
08:47:16.762914 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d7pz5" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.788699 4917 scope.go:117] "RemoveContainer" containerID="725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.812235 4917 scope.go:117] "RemoveContainer" containerID="d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.900003 4917 scope.go:117] "RemoveContainer" containerID="6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632" Mar 18 08:47:16 crc kubenswrapper[4917]: E0318 08:47:16.900944 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632\": container with ID starting with 6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632 not found: ID does not exist" containerID="6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.900988 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632"} err="failed to get container status \"6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632\": rpc error: code = NotFound desc = could not find container \"6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632\": container with ID starting with 6c23c63849b43649f26fa58f0ef3f4b756a697550e34a2b5fd7c5beb123a9632 not found: ID does not exist" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.901014 4917 scope.go:117] "RemoveContainer" containerID="725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd" Mar 18 08:47:16 crc kubenswrapper[4917]: E0318 08:47:16.901321 4917 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd\": container with ID starting with 725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd not found: ID does not exist" containerID="725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.901355 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd"} err="failed to get container status \"725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd\": rpc error: code = NotFound desc = could not find container \"725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd\": container with ID starting with 725d781219ab09e32541e8a15b07e5a77c5bb63eb85c349d6d444596250281dd not found: ID does not exist" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.901375 4917 scope.go:117] "RemoveContainer" containerID="d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8" Mar 18 08:47:16 crc kubenswrapper[4917]: E0318 08:47:16.902393 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8\": container with ID starting with d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8 not found: ID does not exist" containerID="d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.902432 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8"} err="failed to get container status \"d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8\": rpc error: code = NotFound 
desc = could not find container \"d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8\": container with ID starting with d1c117402cee40fdde524efe0f93dbc5ec5d1370093622ac0bf1260407a310a8 not found: ID does not exist" Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.980770 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7pz5"] Mar 18 08:47:16 crc kubenswrapper[4917]: I0318 08:47:16.998477 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d7pz5"] Mar 18 08:47:17 crc kubenswrapper[4917]: I0318 08:47:17.792069 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cad219-229f-4974-803f-8e4541430cad" path="/var/lib/kubelet/pods/c5cad219-229f-4974-803f-8e4541430cad/volumes" Mar 18 08:47:49 crc kubenswrapper[4917]: I0318 08:47:49.092374 4917 generic.go:334] "Generic (PLEG): container finished" podID="a7a34bdf-e873-4224-aff7-f81e8d05c6d4" containerID="5290f07537c9b79a0b5a02e8d10c502b7be8ecbf446f8c0df5f9a9abaf64b347" exitCode=0 Mar 18 08:47:49 crc kubenswrapper[4917]: I0318 08:47:49.092486 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" event={"ID":"a7a34bdf-e873-4224-aff7-f81e8d05c6d4","Type":"ContainerDied","Data":"5290f07537c9b79a0b5a02e8d10c502b7be8ecbf446f8c0df5f9a9abaf64b347"} Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.569549 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.758410 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-inventory\") pod \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.758793 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-ssh-key-openstack-cell1\") pod \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.758997 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j84cd\" (UniqueName: \"kubernetes.io/projected/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-kube-api-access-j84cd\") pod \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\" (UID: \"a7a34bdf-e873-4224-aff7-f81e8d05c6d4\") " Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.775726 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-kube-api-access-j84cd" (OuterVolumeSpecName: "kube-api-access-j84cd") pod "a7a34bdf-e873-4224-aff7-f81e8d05c6d4" (UID: "a7a34bdf-e873-4224-aff7-f81e8d05c6d4"). InnerVolumeSpecName "kube-api-access-j84cd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.796468 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-inventory" (OuterVolumeSpecName: "inventory") pod "a7a34bdf-e873-4224-aff7-f81e8d05c6d4" (UID: "a7a34bdf-e873-4224-aff7-f81e8d05c6d4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.806873 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a7a34bdf-e873-4224-aff7-f81e8d05c6d4" (UID: "a7a34bdf-e873-4224-aff7-f81e8d05c6d4"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.860846 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.860906 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:50 crc kubenswrapper[4917]: I0318 08:47:50.860924 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j84cd\" (UniqueName: \"kubernetes.io/projected/a7a34bdf-e873-4224-aff7-f81e8d05c6d4-kube-api-access-j84cd\") on node \"crc\" DevicePath \"\"" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.117049 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" event={"ID":"a7a34bdf-e873-4224-aff7-f81e8d05c6d4","Type":"ContainerDied","Data":"5eb5bc0828635c6300cb3687833287af5a14c4d11be1094426a18357018bd7ab"} Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.117094 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eb5bc0828635c6300cb3687833287af5a14c4d11be1094426a18357018bd7ab" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.117144 4917 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-hgp6d" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.218849 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-5bnqr"] Mar 18 08:47:51 crc kubenswrapper[4917]: E0318 08:47:51.219265 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cad219-229f-4974-803f-8e4541430cad" containerName="registry-server" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219282 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cad219-229f-4974-803f-8e4541430cad" containerName="registry-server" Mar 18 08:47:51 crc kubenswrapper[4917]: E0318 08:47:51.219293 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a34bdf-e873-4224-aff7-f81e8d05c6d4" containerName="configure-os-openstack-openstack-cell1" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219300 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a34bdf-e873-4224-aff7-f81e8d05c6d4" containerName="configure-os-openstack-openstack-cell1" Mar 18 08:47:51 crc kubenswrapper[4917]: E0318 08:47:51.219330 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerName="extract-utilities" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219338 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerName="extract-utilities" Mar 18 08:47:51 crc kubenswrapper[4917]: E0318 08:47:51.219352 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerName="extract-content" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219357 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerName="extract-content" Mar 18 08:47:51 crc kubenswrapper[4917]: E0318 08:47:51.219377 4917 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c5cad219-229f-4974-803f-8e4541430cad" containerName="extract-content" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219384 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cad219-229f-4974-803f-8e4541430cad" containerName="extract-content" Mar 18 08:47:51 crc kubenswrapper[4917]: E0318 08:47:51.219396 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerName="registry-server" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219405 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerName="registry-server" Mar 18 08:47:51 crc kubenswrapper[4917]: E0318 08:47:51.219418 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cad219-229f-4974-803f-8e4541430cad" containerName="extract-utilities" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219426 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cad219-229f-4974-803f-8e4541430cad" containerName="extract-utilities" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219673 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a34bdf-e873-4224-aff7-f81e8d05c6d4" containerName="configure-os-openstack-openstack-cell1" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219695 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab9bed88-d3a9-4cbc-ab18-6afcf2d7a86d" containerName="registry-server" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.219707 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cad219-229f-4974-803f-8e4541430cad" containerName="registry-server" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.220372 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.222352 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.222551 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.224209 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.225037 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.229651 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-5bnqr"] Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.371317 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwh6w\" (UniqueName: \"kubernetes.io/projected/4c797acd-959e-4642-a4d5-037291d1f2ce-kube-api-access-rwh6w\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.373092 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.373272 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: 
\"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-inventory-0\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.475171 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-inventory-0\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.475401 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwh6w\" (UniqueName: \"kubernetes.io/projected/4c797acd-959e-4642-a4d5-037291d1f2ce-kube-api-access-rwh6w\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.475475 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.482022 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-inventory-0\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.484199 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.491859 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwh6w\" (UniqueName: \"kubernetes.io/projected/4c797acd-959e-4642-a4d5-037291d1f2ce-kube-api-access-rwh6w\") pod \"ssh-known-hosts-openstack-5bnqr\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:51 crc kubenswrapper[4917]: I0318 08:47:51.551661 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:47:52 crc kubenswrapper[4917]: I0318 08:47:52.118580 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-5bnqr"] Mar 18 08:47:52 crc kubenswrapper[4917]: I0318 08:47:52.130886 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-5bnqr" event={"ID":"4c797acd-959e-4642-a4d5-037291d1f2ce","Type":"ContainerStarted","Data":"f1cb64de74f70990ea237b296cabf53fbe7a648fd0cb8749f5432fb81e7b2465"} Mar 18 08:47:53 crc kubenswrapper[4917]: I0318 08:47:53.143722 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-5bnqr" event={"ID":"4c797acd-959e-4642-a4d5-037291d1f2ce","Type":"ContainerStarted","Data":"a1086756aefe839409d744812b39a937e54abc027b0142c5d22739cdbb49c78b"} Mar 18 08:47:53 crc kubenswrapper[4917]: I0318 08:47:53.176311 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-5bnqr" podStartSLOduration=1.667986216 podStartE2EDuration="2.176286097s" podCreationTimestamp="2026-03-18 08:47:51 +0000 UTC" firstStartedPulling="2026-03-18 08:47:52.120058414 +0000 UTC m=+7257.061213128" 
lastFinishedPulling="2026-03-18 08:47:52.628358295 +0000 UTC m=+7257.569513009" observedRunningTime="2026-03-18 08:47:53.164255055 +0000 UTC m=+7258.105409849" watchObservedRunningTime="2026-03-18 08:47:53.176286097 +0000 UTC m=+7258.117440851" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.148175 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563728-jfwvp"] Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.150544 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563728-jfwvp" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.157262 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.157862 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.158028 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.158674 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563728-jfwvp"] Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.171908 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz5br\" (UniqueName: \"kubernetes.io/projected/1942ea5e-436f-4396-a0e9-a864c699b2b7-kube-api-access-lz5br\") pod \"auto-csr-approver-29563728-jfwvp\" (UID: \"1942ea5e-436f-4396-a0e9-a864c699b2b7\") " pod="openshift-infra/auto-csr-approver-29563728-jfwvp" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.274797 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz5br\" (UniqueName: 
\"kubernetes.io/projected/1942ea5e-436f-4396-a0e9-a864c699b2b7-kube-api-access-lz5br\") pod \"auto-csr-approver-29563728-jfwvp\" (UID: \"1942ea5e-436f-4396-a0e9-a864c699b2b7\") " pod="openshift-infra/auto-csr-approver-29563728-jfwvp" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.300008 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz5br\" (UniqueName: \"kubernetes.io/projected/1942ea5e-436f-4396-a0e9-a864c699b2b7-kube-api-access-lz5br\") pod \"auto-csr-approver-29563728-jfwvp\" (UID: \"1942ea5e-436f-4396-a0e9-a864c699b2b7\") " pod="openshift-infra/auto-csr-approver-29563728-jfwvp" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.485384 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563728-jfwvp" Mar 18 08:48:00 crc kubenswrapper[4917]: I0318 08:48:00.983059 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563728-jfwvp"] Mar 18 08:48:01 crc kubenswrapper[4917]: I0318 08:48:01.237510 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563728-jfwvp" event={"ID":"1942ea5e-436f-4396-a0e9-a864c699b2b7","Type":"ContainerStarted","Data":"c830b4440271ae74a5f2ec1ac36165052fd94e308cdd0778e8f00d4368b11104"} Mar 18 08:48:02 crc kubenswrapper[4917]: I0318 08:48:02.250412 4917 generic.go:334] "Generic (PLEG): container finished" podID="4c797acd-959e-4642-a4d5-037291d1f2ce" containerID="a1086756aefe839409d744812b39a937e54abc027b0142c5d22739cdbb49c78b" exitCode=0 Mar 18 08:48:02 crc kubenswrapper[4917]: I0318 08:48:02.250627 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-5bnqr" event={"ID":"4c797acd-959e-4642-a4d5-037291d1f2ce","Type":"ContainerDied","Data":"a1086756aefe839409d744812b39a937e54abc027b0142c5d22739cdbb49c78b"} Mar 18 08:48:02 crc kubenswrapper[4917]: I0318 08:48:02.929710 4917 patch_prober.go:28] 
interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:48:02 crc kubenswrapper[4917]: I0318 08:48:02.930077 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.265018 4917 generic.go:334] "Generic (PLEG): container finished" podID="1942ea5e-436f-4396-a0e9-a864c699b2b7" containerID="8c5dab954c1253d6a6388b22ad220d9cb4d7cd39e04d77c84f4d70ad4280c0c4" exitCode=0 Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.265127 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563728-jfwvp" event={"ID":"1942ea5e-436f-4396-a0e9-a864c699b2b7","Type":"ContainerDied","Data":"8c5dab954c1253d6a6388b22ad220d9cb4d7cd39e04d77c84f4d70ad4280c0c4"} Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.799660 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.855799 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-ssh-key-openstack-cell1\") pod \"4c797acd-959e-4642-a4d5-037291d1f2ce\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.856095 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-inventory-0\") pod \"4c797acd-959e-4642-a4d5-037291d1f2ce\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.856148 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwh6w\" (UniqueName: \"kubernetes.io/projected/4c797acd-959e-4642-a4d5-037291d1f2ce-kube-api-access-rwh6w\") pod \"4c797acd-959e-4642-a4d5-037291d1f2ce\" (UID: \"4c797acd-959e-4642-a4d5-037291d1f2ce\") " Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.862831 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c797acd-959e-4642-a4d5-037291d1f2ce-kube-api-access-rwh6w" (OuterVolumeSpecName: "kube-api-access-rwh6w") pod "4c797acd-959e-4642-a4d5-037291d1f2ce" (UID: "4c797acd-959e-4642-a4d5-037291d1f2ce"). InnerVolumeSpecName "kube-api-access-rwh6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.893610 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4c797acd-959e-4642-a4d5-037291d1f2ce" (UID: "4c797acd-959e-4642-a4d5-037291d1f2ce"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.895674 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "4c797acd-959e-4642-a4d5-037291d1f2ce" (UID: "4c797acd-959e-4642-a4d5-037291d1f2ce"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.959778 4917 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.959815 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwh6w\" (UniqueName: \"kubernetes.io/projected/4c797acd-959e-4642-a4d5-037291d1f2ce-kube-api-access-rwh6w\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:03 crc kubenswrapper[4917]: I0318 08:48:03.959829 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4c797acd-959e-4642-a4d5-037291d1f2ce-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.279926 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-5bnqr" event={"ID":"4c797acd-959e-4642-a4d5-037291d1f2ce","Type":"ContainerDied","Data":"f1cb64de74f70990ea237b296cabf53fbe7a648fd0cb8749f5432fb81e7b2465"} Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.280355 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1cb64de74f70990ea237b296cabf53fbe7a648fd0cb8749f5432fb81e7b2465" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.280462 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-5bnqr" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.366221 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-lmvxk"] Mar 18 08:48:04 crc kubenswrapper[4917]: E0318 08:48:04.366782 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c797acd-959e-4642-a4d5-037291d1f2ce" containerName="ssh-known-hosts-openstack" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.366804 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c797acd-959e-4642-a4d5-037291d1f2ce" containerName="ssh-known-hosts-openstack" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.367074 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c797acd-959e-4642-a4d5-037291d1f2ce" containerName="ssh-known-hosts-openstack" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.368043 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.371981 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.372950 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.372987 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.373177 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.385117 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-lmvxk"] Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 
08:48:04.470245 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.470294 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-inventory\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.470328 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggqnz\" (UniqueName: \"kubernetes.io/projected/57890f30-e41c-4670-8b1a-baf27e024f02-kube-api-access-ggqnz\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.573665 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.573724 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-inventory\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: 
\"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.573753 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggqnz\" (UniqueName: \"kubernetes.io/projected/57890f30-e41c-4670-8b1a-baf27e024f02-kube-api-access-ggqnz\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.586352 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-inventory\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.586353 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.597569 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggqnz\" (UniqueName: \"kubernetes.io/projected/57890f30-e41c-4670-8b1a-baf27e024f02-kube-api-access-ggqnz\") pod \"run-os-openstack-openstack-cell1-lmvxk\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.663606 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563728-jfwvp" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.707614 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.777495 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz5br\" (UniqueName: \"kubernetes.io/projected/1942ea5e-436f-4396-a0e9-a864c699b2b7-kube-api-access-lz5br\") pod \"1942ea5e-436f-4396-a0e9-a864c699b2b7\" (UID: \"1942ea5e-436f-4396-a0e9-a864c699b2b7\") " Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.790372 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1942ea5e-436f-4396-a0e9-a864c699b2b7-kube-api-access-lz5br" (OuterVolumeSpecName: "kube-api-access-lz5br") pod "1942ea5e-436f-4396-a0e9-a864c699b2b7" (UID: "1942ea5e-436f-4396-a0e9-a864c699b2b7"). InnerVolumeSpecName "kube-api-access-lz5br". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:48:04 crc kubenswrapper[4917]: I0318 08:48:04.882078 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz5br\" (UniqueName: \"kubernetes.io/projected/1942ea5e-436f-4396-a0e9-a864c699b2b7-kube-api-access-lz5br\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:05 crc kubenswrapper[4917]: W0318 08:48:05.276370 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57890f30_e41c_4670_8b1a_baf27e024f02.slice/crio-7754639f06ef662e07a8c8016f6a3e88978f4605009104eeb44caf4d127ff4eb WatchSource:0}: Error finding container 7754639f06ef662e07a8c8016f6a3e88978f4605009104eeb44caf4d127ff4eb: Status 404 returned error can't find the container with id 7754639f06ef662e07a8c8016f6a3e88978f4605009104eeb44caf4d127ff4eb Mar 18 08:48:05 crc kubenswrapper[4917]: I0318 08:48:05.276808 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-lmvxk"] Mar 18 08:48:05 crc kubenswrapper[4917]: I0318 08:48:05.294698 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" event={"ID":"57890f30-e41c-4670-8b1a-baf27e024f02","Type":"ContainerStarted","Data":"7754639f06ef662e07a8c8016f6a3e88978f4605009104eeb44caf4d127ff4eb"} Mar 18 08:48:05 crc kubenswrapper[4917]: I0318 08:48:05.297471 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563728-jfwvp" event={"ID":"1942ea5e-436f-4396-a0e9-a864c699b2b7","Type":"ContainerDied","Data":"c830b4440271ae74a5f2ec1ac36165052fd94e308cdd0778e8f00d4368b11104"} Mar 18 08:48:05 crc kubenswrapper[4917]: I0318 08:48:05.297517 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c830b4440271ae74a5f2ec1ac36165052fd94e308cdd0778e8f00d4368b11104" Mar 18 08:48:05 crc kubenswrapper[4917]: I0318 08:48:05.297559 4917 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563728-jfwvp" Mar 18 08:48:05 crc kubenswrapper[4917]: I0318 08:48:05.776711 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563722-qm2lc"] Mar 18 08:48:05 crc kubenswrapper[4917]: I0318 08:48:05.831442 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563722-qm2lc"] Mar 18 08:48:06 crc kubenswrapper[4917]: I0318 08:48:06.308238 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" event={"ID":"57890f30-e41c-4670-8b1a-baf27e024f02","Type":"ContainerStarted","Data":"82528c2c75ec11ffef013c62e7f69a41a9023923b9e723fbf26487e2d145e28a"} Mar 18 08:48:06 crc kubenswrapper[4917]: I0318 08:48:06.339917 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" podStartSLOduration=1.8055350529999998 podStartE2EDuration="2.339889557s" podCreationTimestamp="2026-03-18 08:48:04 +0000 UTC" firstStartedPulling="2026-03-18 08:48:05.279084492 +0000 UTC m=+7270.220239206" lastFinishedPulling="2026-03-18 08:48:05.813438996 +0000 UTC m=+7270.754593710" observedRunningTime="2026-03-18 08:48:06.327054065 +0000 UTC m=+7271.268208809" watchObservedRunningTime="2026-03-18 08:48:06.339889557 +0000 UTC m=+7271.281044271" Mar 18 08:48:07 crc kubenswrapper[4917]: I0318 08:48:07.782895 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="009d965f-f5bc-4113-883e-a45f2a4c4f04" path="/var/lib/kubelet/pods/009d965f-f5bc-4113-883e-a45f2a4c4f04/volumes" Mar 18 08:48:14 crc kubenswrapper[4917]: I0318 08:48:14.404052 4917 generic.go:334] "Generic (PLEG): container finished" podID="57890f30-e41c-4670-8b1a-baf27e024f02" containerID="82528c2c75ec11ffef013c62e7f69a41a9023923b9e723fbf26487e2d145e28a" exitCode=0 Mar 18 08:48:14 crc kubenswrapper[4917]: I0318 
08:48:14.404206 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" event={"ID":"57890f30-e41c-4670-8b1a-baf27e024f02","Type":"ContainerDied","Data":"82528c2c75ec11ffef013c62e7f69a41a9023923b9e723fbf26487e2d145e28a"} Mar 18 08:48:14 crc kubenswrapper[4917]: I0318 08:48:14.812516 4917 scope.go:117] "RemoveContainer" containerID="9aabd2b95bdd2cd16ac6a50dc666d41df78a5f427b03ff833fa0c664190069de" Mar 18 08:48:15 crc kubenswrapper[4917]: I0318 08:48:15.880055 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:15 crc kubenswrapper[4917]: I0318 08:48:15.925192 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-ssh-key-openstack-cell1\") pod \"57890f30-e41c-4670-8b1a-baf27e024f02\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " Mar 18 08:48:15 crc kubenswrapper[4917]: I0318 08:48:15.925318 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-inventory\") pod \"57890f30-e41c-4670-8b1a-baf27e024f02\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " Mar 18 08:48:15 crc kubenswrapper[4917]: I0318 08:48:15.925409 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggqnz\" (UniqueName: \"kubernetes.io/projected/57890f30-e41c-4670-8b1a-baf27e024f02-kube-api-access-ggqnz\") pod \"57890f30-e41c-4670-8b1a-baf27e024f02\" (UID: \"57890f30-e41c-4670-8b1a-baf27e024f02\") " Mar 18 08:48:15 crc kubenswrapper[4917]: I0318 08:48:15.932877 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57890f30-e41c-4670-8b1a-baf27e024f02-kube-api-access-ggqnz" (OuterVolumeSpecName: 
"kube-api-access-ggqnz") pod "57890f30-e41c-4670-8b1a-baf27e024f02" (UID: "57890f30-e41c-4670-8b1a-baf27e024f02"). InnerVolumeSpecName "kube-api-access-ggqnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:48:15 crc kubenswrapper[4917]: I0318 08:48:15.954084 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-inventory" (OuterVolumeSpecName: "inventory") pod "57890f30-e41c-4670-8b1a-baf27e024f02" (UID: "57890f30-e41c-4670-8b1a-baf27e024f02"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:48:15 crc kubenswrapper[4917]: I0318 08:48:15.956237 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "57890f30-e41c-4670-8b1a-baf27e024f02" (UID: "57890f30-e41c-4670-8b1a-baf27e024f02"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.029101 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.029164 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/57890f30-e41c-4670-8b1a-baf27e024f02-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.029184 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggqnz\" (UniqueName: \"kubernetes.io/projected/57890f30-e41c-4670-8b1a-baf27e024f02-kube-api-access-ggqnz\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.434799 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" event={"ID":"57890f30-e41c-4670-8b1a-baf27e024f02","Type":"ContainerDied","Data":"7754639f06ef662e07a8c8016f6a3e88978f4605009104eeb44caf4d127ff4eb"} Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.435171 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7754639f06ef662e07a8c8016f6a3e88978f4605009104eeb44caf4d127ff4eb" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.435255 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-lmvxk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.551471 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-hj7wk"] Mar 18 08:48:16 crc kubenswrapper[4917]: E0318 08:48:16.551908 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1942ea5e-436f-4396-a0e9-a864c699b2b7" containerName="oc" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.551925 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1942ea5e-436f-4396-a0e9-a864c699b2b7" containerName="oc" Mar 18 08:48:16 crc kubenswrapper[4917]: E0318 08:48:16.551942 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57890f30-e41c-4670-8b1a-baf27e024f02" containerName="run-os-openstack-openstack-cell1" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.551951 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="57890f30-e41c-4670-8b1a-baf27e024f02" containerName="run-os-openstack-openstack-cell1" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.552167 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1942ea5e-436f-4396-a0e9-a864c699b2b7" containerName="oc" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.552198 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="57890f30-e41c-4670-8b1a-baf27e024f02" containerName="run-os-openstack-openstack-cell1" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.552871 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.555665 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.555975 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.556970 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.568006 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.575149 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-hj7wk"] Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.638655 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqc5c\" (UniqueName: \"kubernetes.io/projected/21e0212b-6666-4baf-bcb7-ef705369af7f-kube-api-access-gqc5c\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.638889 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.638967 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-inventory\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.741633 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.741731 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-inventory\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.741846 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqc5c\" (UniqueName: \"kubernetes.io/projected/21e0212b-6666-4baf-bcb7-ef705369af7f-kube-api-access-gqc5c\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.747272 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc 
kubenswrapper[4917]: I0318 08:48:16.749083 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-inventory\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.760842 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqc5c\" (UniqueName: \"kubernetes.io/projected/21e0212b-6666-4baf-bcb7-ef705369af7f-kube-api-access-gqc5c\") pod \"reboot-os-openstack-openstack-cell1-hj7wk\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:16 crc kubenswrapper[4917]: I0318 08:48:16.873619 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:17 crc kubenswrapper[4917]: I0318 08:48:17.374073 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-hj7wk"] Mar 18 08:48:17 crc kubenswrapper[4917]: I0318 08:48:17.444512 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" event={"ID":"21e0212b-6666-4baf-bcb7-ef705369af7f","Type":"ContainerStarted","Data":"b4106ccd697177703b26825cab6f796a41df8ecec8e15561f00fad49909590a4"} Mar 18 08:48:18 crc kubenswrapper[4917]: I0318 08:48:18.458908 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" event={"ID":"21e0212b-6666-4baf-bcb7-ef705369af7f","Type":"ContainerStarted","Data":"73a4e0f2f31947dc90f7582edb620a1fcec64f34bb3ef88c5c9cc7e490e88ebc"} Mar 18 08:48:18 crc kubenswrapper[4917]: I0318 08:48:18.488777 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" podStartSLOduration=2.065555455 podStartE2EDuration="2.48875778s" podCreationTimestamp="2026-03-18 08:48:16 +0000 UTC" firstStartedPulling="2026-03-18 08:48:17.387279048 +0000 UTC m=+7282.328433772" lastFinishedPulling="2026-03-18 08:48:17.810481383 +0000 UTC m=+7282.751636097" observedRunningTime="2026-03-18 08:48:18.480874169 +0000 UTC m=+7283.422028893" watchObservedRunningTime="2026-03-18 08:48:18.48875778 +0000 UTC m=+7283.429912484" Mar 18 08:48:32 crc kubenswrapper[4917]: I0318 08:48:32.929651 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:48:32 crc kubenswrapper[4917]: I0318 08:48:32.932425 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:48:34 crc kubenswrapper[4917]: I0318 08:48:34.648227 4917 generic.go:334] "Generic (PLEG): container finished" podID="21e0212b-6666-4baf-bcb7-ef705369af7f" containerID="73a4e0f2f31947dc90f7582edb620a1fcec64f34bb3ef88c5c9cc7e490e88ebc" exitCode=0 Mar 18 08:48:34 crc kubenswrapper[4917]: I0318 08:48:34.648298 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" event={"ID":"21e0212b-6666-4baf-bcb7-ef705369af7f","Type":"ContainerDied","Data":"73a4e0f2f31947dc90f7582edb620a1fcec64f34bb3ef88c5c9cc7e490e88ebc"} Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.165169 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.294295 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-inventory\") pod \"21e0212b-6666-4baf-bcb7-ef705369af7f\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.294540 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqc5c\" (UniqueName: \"kubernetes.io/projected/21e0212b-6666-4baf-bcb7-ef705369af7f-kube-api-access-gqc5c\") pod \"21e0212b-6666-4baf-bcb7-ef705369af7f\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.294606 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-ssh-key-openstack-cell1\") pod \"21e0212b-6666-4baf-bcb7-ef705369af7f\" (UID: \"21e0212b-6666-4baf-bcb7-ef705369af7f\") " Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.302526 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21e0212b-6666-4baf-bcb7-ef705369af7f-kube-api-access-gqc5c" (OuterVolumeSpecName: "kube-api-access-gqc5c") pod "21e0212b-6666-4baf-bcb7-ef705369af7f" (UID: "21e0212b-6666-4baf-bcb7-ef705369af7f"). InnerVolumeSpecName "kube-api-access-gqc5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.329202 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-inventory" (OuterVolumeSpecName: "inventory") pod "21e0212b-6666-4baf-bcb7-ef705369af7f" (UID: "21e0212b-6666-4baf-bcb7-ef705369af7f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.330075 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "21e0212b-6666-4baf-bcb7-ef705369af7f" (UID: "21e0212b-6666-4baf-bcb7-ef705369af7f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.397783 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqc5c\" (UniqueName: \"kubernetes.io/projected/21e0212b-6666-4baf-bcb7-ef705369af7f-kube-api-access-gqc5c\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.398252 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.398296 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21e0212b-6666-4baf-bcb7-ef705369af7f-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.671537 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" event={"ID":"21e0212b-6666-4baf-bcb7-ef705369af7f","Type":"ContainerDied","Data":"b4106ccd697177703b26825cab6f796a41df8ecec8e15561f00fad49909590a4"} Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.671606 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4106ccd697177703b26825cab6f796a41df8ecec8e15561f00fad49909590a4" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.671637 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-hj7wk" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.795695 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-4ppg5"] Mar 18 08:48:36 crc kubenswrapper[4917]: E0318 08:48:36.796347 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21e0212b-6666-4baf-bcb7-ef705369af7f" containerName="reboot-os-openstack-openstack-cell1" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.796380 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="21e0212b-6666-4baf-bcb7-ef705369af7f" containerName="reboot-os-openstack-openstack-cell1" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.796672 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="21e0212b-6666-4baf-bcb7-ef705369af7f" containerName="reboot-os-openstack-openstack-cell1" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.797546 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.799707 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.800053 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-libvirt-default-certs-0" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.800155 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-ovn-default-certs-0" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.800329 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.800362 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-telemetry-default-certs-0" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.802253 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.803551 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-neutron-metadata-default-certs-0" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.805025 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.809288 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-4ppg5"] Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.907664 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.907999 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvdzh\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-kube-api-access-zvdzh\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.908079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.908202 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.908265 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-dhcp-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.908495 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.908761 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.908805 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-inventory\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.908868 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 
08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.908935 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.909023 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.909050 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.909095 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.909139 4917 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:36 crc kubenswrapper[4917]: I0318 08:48:36.909175 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011172 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011345 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011396 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-inventory\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011479 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011575 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011689 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " 
pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011804 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011880 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.011932 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.012041 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.012176 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvdzh\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-kube-api-access-zvdzh\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.012271 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.012348 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.012416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.018518 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-ovn-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.018805 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.019003 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-inventory\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.019207 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.020236 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc 
kubenswrapper[4917]: I0318 08:48:37.020475 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.021349 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-telemetry-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.022445 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-libvirt-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.022487 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.024942 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.027162 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.027405 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.027911 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-neutron-metadata-default-certs-0\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.028949 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " 
pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.030108 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvdzh\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-kube-api-access-zvdzh\") pod \"install-certs-openstack-openstack-cell1-4ppg5\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.115435 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.505817 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-4ppg5"] Mar 18 08:48:37 crc kubenswrapper[4917]: W0318 08:48:37.511895 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf93f1f68_8a97_4e89_9d2b_b1583f046a2d.slice/crio-2d447271ab551a736d766e94659a4807567c2aeb975a8569fdf092f4738c37e0 WatchSource:0}: Error finding container 2d447271ab551a736d766e94659a4807567c2aeb975a8569fdf092f4738c37e0: Status 404 returned error can't find the container with id 2d447271ab551a736d766e94659a4807567c2aeb975a8569fdf092f4738c37e0 Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.514920 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:48:37 crc kubenswrapper[4917]: I0318 08:48:37.680441 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" event={"ID":"f93f1f68-8a97-4e89-9d2b-b1583f046a2d","Type":"ContainerStarted","Data":"2d447271ab551a736d766e94659a4807567c2aeb975a8569fdf092f4738c37e0"} Mar 18 08:48:38 crc kubenswrapper[4917]: I0318 08:48:38.691987 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" event={"ID":"f93f1f68-8a97-4e89-9d2b-b1583f046a2d","Type":"ContainerStarted","Data":"8d0315dd08af82a6ec08f61edf9b58eda45b6066068fb3dcadff0722fb596655"} Mar 18 08:48:38 crc kubenswrapper[4917]: I0318 08:48:38.731329 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" podStartSLOduration=2.217235513 podStartE2EDuration="2.731311194s" podCreationTimestamp="2026-03-18 08:48:36 +0000 UTC" firstStartedPulling="2026-03-18 08:48:37.514674126 +0000 UTC m=+7302.455828850" lastFinishedPulling="2026-03-18 08:48:38.028749817 +0000 UTC m=+7302.969904531" observedRunningTime="2026-03-18 08:48:38.724215322 +0000 UTC m=+7303.665370036" watchObservedRunningTime="2026-03-18 08:48:38.731311194 +0000 UTC m=+7303.672465908" Mar 18 08:49:02 crc kubenswrapper[4917]: I0318 08:49:02.929388 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:49:02 crc kubenswrapper[4917]: I0318 08:49:02.930019 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:49:02 crc kubenswrapper[4917]: I0318 08:49:02.930073 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:49:02 crc kubenswrapper[4917]: I0318 08:49:02.930938 4917 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"004af87b0589146355065aa18f9d5186b2677b48811e1c447f8ebdd857aa020a"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:49:02 crc kubenswrapper[4917]: I0318 08:49:02.930989 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://004af87b0589146355065aa18f9d5186b2677b48811e1c447f8ebdd857aa020a" gracePeriod=600 Mar 18 08:49:03 crc kubenswrapper[4917]: I0318 08:49:03.952980 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="004af87b0589146355065aa18f9d5186b2677b48811e1c447f8ebdd857aa020a" exitCode=0 Mar 18 08:49:03 crc kubenswrapper[4917]: I0318 08:49:03.953050 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"004af87b0589146355065aa18f9d5186b2677b48811e1c447f8ebdd857aa020a"} Mar 18 08:49:03 crc kubenswrapper[4917]: I0318 08:49:03.953573 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369"} Mar 18 08:49:03 crc kubenswrapper[4917]: I0318 08:49:03.953616 4917 scope.go:117] "RemoveContainer" containerID="6b1fc63b65e8e12deffc691df990f02a9ecfd11817c92a0dc35444a7c1f69860" Mar 18 08:49:17 crc kubenswrapper[4917]: I0318 08:49:17.100842 4917 generic.go:334] "Generic (PLEG): container finished" podID="f93f1f68-8a97-4e89-9d2b-b1583f046a2d" 
containerID="8d0315dd08af82a6ec08f61edf9b58eda45b6066068fb3dcadff0722fb596655" exitCode=0 Mar 18 08:49:17 crc kubenswrapper[4917]: I0318 08:49:17.100933 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" event={"ID":"f93f1f68-8a97-4e89-9d2b-b1583f046a2d","Type":"ContainerDied","Data":"8d0315dd08af82a6ec08f61edf9b58eda45b6066068fb3dcadff0722fb596655"} Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.580273 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.671705 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-ovn-default-certs-0\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.671794 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-bootstrap-combined-ca-bundle\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.671871 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvdzh\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-kube-api-access-zvdzh\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.671942 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-libvirt-default-certs-0\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.671978 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-dhcp-combined-ca-bundle\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672064 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-metadata-combined-ca-bundle\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672098 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-telemetry-combined-ca-bundle\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672123 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ovn-combined-ca-bundle\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672160 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-sriov-combined-ca-bundle\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672184 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ssh-key-openstack-cell1\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672258 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-telemetry-default-certs-0\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672293 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-inventory\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672325 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-libvirt-combined-ca-bundle\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672369 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-neutron-metadata-default-certs-0\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.672393 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-nova-combined-ca-bundle\") pod \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\" (UID: \"f93f1f68-8a97-4e89-9d2b-b1583f046a2d\") " Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.678932 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.679175 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.680196 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.680275 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-neutron-metadata-default-certs-0") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "openstack-cell1-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.680308 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.680414 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-ovn-default-certs-0") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "openstack-cell1-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.681772 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-telemetry-default-certs-0") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). 
InnerVolumeSpecName "openstack-cell1-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.682980 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.684143 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.684580 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.684734 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-cell1-libvirt-default-certs-0") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "openstack-cell1-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.685065 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-kube-api-access-zvdzh" (OuterVolumeSpecName: "kube-api-access-zvdzh") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "kube-api-access-zvdzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.699694 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.706958 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.707053 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-inventory" (OuterVolumeSpecName: "inventory") pod "f93f1f68-8a97-4e89-9d2b-b1583f046a2d" (UID: "f93f1f68-8a97-4e89-9d2b-b1583f046a2d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777179 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777237 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777259 4917 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777284 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777303 4917 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777323 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777344 4917 reconciler_common.go:293] "Volume detached for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777364 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvdzh\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-kube-api-access-zvdzh\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777382 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-openstack-cell1-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777402 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777430 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777789 4917 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777813 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 
08:49:18.777832 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:18 crc kubenswrapper[4917]: I0318 08:49:18.777855 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f93f1f68-8a97-4e89-9d2b-b1583f046a2d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.128185 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.128186 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-4ppg5" event={"ID":"f93f1f68-8a97-4e89-9d2b-b1583f046a2d","Type":"ContainerDied","Data":"2d447271ab551a736d766e94659a4807567c2aeb975a8569fdf092f4738c37e0"} Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.128318 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d447271ab551a736d766e94659a4807567c2aeb975a8569fdf092f4738c37e0" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.258472 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7ff47"] Mar 18 08:49:19 crc kubenswrapper[4917]: E0318 08:49:19.259049 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93f1f68-8a97-4e89-9d2b-b1583f046a2d" containerName="install-certs-openstack-openstack-cell1" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.259072 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93f1f68-8a97-4e89-9d2b-b1583f046a2d" containerName="install-certs-openstack-openstack-cell1" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.259364 4917 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f93f1f68-8a97-4e89-9d2b-b1583f046a2d" containerName="install-certs-openstack-openstack-cell1" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.260281 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.266708 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.266989 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.267131 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.267309 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.267964 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.270853 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7ff47"] Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.290943 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.291063 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: 
\"kubernetes.io/configmap/717aeb26-970e-480b-95d1-d514151ca1ac-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.291099 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvsz2\" (UniqueName: \"kubernetes.io/projected/717aeb26-970e-480b-95d1-d514151ca1ac-kube-api-access-pvsz2\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.291202 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-inventory\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.291470 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.400995 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.401135 
4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.401204 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/717aeb26-970e-480b-95d1-d514151ca1ac-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.401233 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvsz2\" (UniqueName: \"kubernetes.io/projected/717aeb26-970e-480b-95d1-d514151ca1ac-kube-api-access-pvsz2\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.401318 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-inventory\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.406390 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/717aeb26-970e-480b-95d1-d514151ca1ac-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 
08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.408937 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.426938 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-inventory\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.426968 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.430304 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvsz2\" (UniqueName: \"kubernetes.io/projected/717aeb26-970e-480b-95d1-d514151ca1ac-kube-api-access-pvsz2\") pod \"ovn-openstack-openstack-cell1-7ff47\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:19 crc kubenswrapper[4917]: I0318 08:49:19.589027 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:49:20 crc kubenswrapper[4917]: I0318 08:49:20.183137 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-7ff47"] Mar 18 08:49:21 crc kubenswrapper[4917]: I0318 08:49:21.149852 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7ff47" event={"ID":"717aeb26-970e-480b-95d1-d514151ca1ac","Type":"ContainerStarted","Data":"5d9e5dbd3261ecb1b41cc9d22482636b88ac384cebd78e7e7d1354d643a20354"} Mar 18 08:49:21 crc kubenswrapper[4917]: I0318 08:49:21.151768 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7ff47" event={"ID":"717aeb26-970e-480b-95d1-d514151ca1ac","Type":"ContainerStarted","Data":"62035c9d06cc8d9107f259ea44af7a54ef30fd190b4f4a2d1856417a71f0cbd2"} Mar 18 08:49:21 crc kubenswrapper[4917]: I0318 08:49:21.179929 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-7ff47" podStartSLOduration=1.60574376 podStartE2EDuration="2.17952349s" podCreationTimestamp="2026-03-18 08:49:19 +0000 UTC" firstStartedPulling="2026-03-18 08:49:20.188621672 +0000 UTC m=+7345.129776386" lastFinishedPulling="2026-03-18 08:49:20.762401372 +0000 UTC m=+7345.703556116" observedRunningTime="2026-03-18 08:49:21.169102217 +0000 UTC m=+7346.110256961" watchObservedRunningTime="2026-03-18 08:49:21.17952349 +0000 UTC m=+7346.120678214" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.297762 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6hn2q"] Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.315112 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.375070 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6hn2q"] Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.394002 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-catalog-content\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.394069 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-utilities\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.394119 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9vx\" (UniqueName: \"kubernetes.io/projected/52364ec0-1480-4069-addf-ec1d6e6e02fb-kube-api-access-qn9vx\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.495721 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn9vx\" (UniqueName: \"kubernetes.io/projected/52364ec0-1480-4069-addf-ec1d6e6e02fb-kube-api-access-qn9vx\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.495850 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-catalog-content\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.495893 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-utilities\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.496360 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-utilities\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.496422 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-catalog-content\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.518006 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn9vx\" (UniqueName: \"kubernetes.io/projected/52364ec0-1480-4069-addf-ec1d6e6e02fb-kube-api-access-qn9vx\") pod \"redhat-operators-6hn2q\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:23 crc kubenswrapper[4917]: I0318 08:49:23.664056 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:24 crc kubenswrapper[4917]: I0318 08:49:24.162923 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6hn2q"] Mar 18 08:49:24 crc kubenswrapper[4917]: W0318 08:49:24.170405 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52364ec0_1480_4069_addf_ec1d6e6e02fb.slice/crio-bfa973e523acba7dc46216ea0d22309d1d1193175ab0348635f98fb14b70c744 WatchSource:0}: Error finding container bfa973e523acba7dc46216ea0d22309d1d1193175ab0348635f98fb14b70c744: Status 404 returned error can't find the container with id bfa973e523acba7dc46216ea0d22309d1d1193175ab0348635f98fb14b70c744 Mar 18 08:49:24 crc kubenswrapper[4917]: I0318 08:49:24.184811 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hn2q" event={"ID":"52364ec0-1480-4069-addf-ec1d6e6e02fb","Type":"ContainerStarted","Data":"bfa973e523acba7dc46216ea0d22309d1d1193175ab0348635f98fb14b70c744"} Mar 18 08:49:25 crc kubenswrapper[4917]: I0318 08:49:25.196162 4917 generic.go:334] "Generic (PLEG): container finished" podID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerID="4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00" exitCode=0 Mar 18 08:49:25 crc kubenswrapper[4917]: I0318 08:49:25.196214 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hn2q" event={"ID":"52364ec0-1480-4069-addf-ec1d6e6e02fb","Type":"ContainerDied","Data":"4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00"} Mar 18 08:49:26 crc kubenswrapper[4917]: I0318 08:49:26.207663 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hn2q" 
event={"ID":"52364ec0-1480-4069-addf-ec1d6e6e02fb","Type":"ContainerStarted","Data":"091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3"} Mar 18 08:49:31 crc kubenswrapper[4917]: I0318 08:49:31.269040 4917 generic.go:334] "Generic (PLEG): container finished" podID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerID="091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3" exitCode=0 Mar 18 08:49:31 crc kubenswrapper[4917]: I0318 08:49:31.269126 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hn2q" event={"ID":"52364ec0-1480-4069-addf-ec1d6e6e02fb","Type":"ContainerDied","Data":"091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3"} Mar 18 08:49:32 crc kubenswrapper[4917]: I0318 08:49:32.283421 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hn2q" event={"ID":"52364ec0-1480-4069-addf-ec1d6e6e02fb","Type":"ContainerStarted","Data":"d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6"} Mar 18 08:49:32 crc kubenswrapper[4917]: I0318 08:49:32.314878 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6hn2q" podStartSLOduration=2.829304038 podStartE2EDuration="9.314844246s" podCreationTimestamp="2026-03-18 08:49:23 +0000 UTC" firstStartedPulling="2026-03-18 08:49:25.199571019 +0000 UTC m=+7350.140725733" lastFinishedPulling="2026-03-18 08:49:31.685111227 +0000 UTC m=+7356.626265941" observedRunningTime="2026-03-18 08:49:32.313022192 +0000 UTC m=+7357.254176986" watchObservedRunningTime="2026-03-18 08:49:32.314844246 +0000 UTC m=+7357.255999000" Mar 18 08:49:33 crc kubenswrapper[4917]: I0318 08:49:33.664573 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:33 crc kubenswrapper[4917]: I0318 08:49:33.665006 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:49:34 crc kubenswrapper[4917]: I0318 08:49:34.724128 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6hn2q" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="registry-server" probeResult="failure" output=< Mar 18 08:49:34 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:49:34 crc kubenswrapper[4917]: > Mar 18 08:49:44 crc kubenswrapper[4917]: I0318 08:49:44.761090 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6hn2q" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="registry-server" probeResult="failure" output=< Mar 18 08:49:44 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:49:44 crc kubenswrapper[4917]: > Mar 18 08:49:54 crc kubenswrapper[4917]: I0318 08:49:54.711488 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6hn2q" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="registry-server" probeResult="failure" output=< Mar 18 08:49:54 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:49:54 crc kubenswrapper[4917]: > Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.156858 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563730-lrq66"] Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.158712 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563730-lrq66" Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.161811 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.163052 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.171158 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563730-lrq66"] Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.173384 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.248633 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89fm2\" (UniqueName: \"kubernetes.io/projected/8cfa3930-b29b-4512-8ef0-8cd47dad802c-kube-api-access-89fm2\") pod \"auto-csr-approver-29563730-lrq66\" (UID: \"8cfa3930-b29b-4512-8ef0-8cd47dad802c\") " pod="openshift-infra/auto-csr-approver-29563730-lrq66" Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.350684 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89fm2\" (UniqueName: \"kubernetes.io/projected/8cfa3930-b29b-4512-8ef0-8cd47dad802c-kube-api-access-89fm2\") pod \"auto-csr-approver-29563730-lrq66\" (UID: \"8cfa3930-b29b-4512-8ef0-8cd47dad802c\") " pod="openshift-infra/auto-csr-approver-29563730-lrq66" Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.371256 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89fm2\" (UniqueName: \"kubernetes.io/projected/8cfa3930-b29b-4512-8ef0-8cd47dad802c-kube-api-access-89fm2\") pod \"auto-csr-approver-29563730-lrq66\" (UID: \"8cfa3930-b29b-4512-8ef0-8cd47dad802c\") " 
pod="openshift-infra/auto-csr-approver-29563730-lrq66" Mar 18 08:50:00 crc kubenswrapper[4917]: I0318 08:50:00.491392 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563730-lrq66" Mar 18 08:50:01 crc kubenswrapper[4917]: I0318 08:50:01.016779 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563730-lrq66"] Mar 18 08:50:01 crc kubenswrapper[4917]: W0318 08:50:01.027798 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cfa3930_b29b_4512_8ef0_8cd47dad802c.slice/crio-4dc06bc148aa04b915a41192600a6c5a877d4b6e598efcd43366c58a34232518 WatchSource:0}: Error finding container 4dc06bc148aa04b915a41192600a6c5a877d4b6e598efcd43366c58a34232518: Status 404 returned error can't find the container with id 4dc06bc148aa04b915a41192600a6c5a877d4b6e598efcd43366c58a34232518 Mar 18 08:50:01 crc kubenswrapper[4917]: I0318 08:50:01.666095 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563730-lrq66" event={"ID":"8cfa3930-b29b-4512-8ef0-8cd47dad802c","Type":"ContainerStarted","Data":"4dc06bc148aa04b915a41192600a6c5a877d4b6e598efcd43366c58a34232518"} Mar 18 08:50:02 crc kubenswrapper[4917]: I0318 08:50:02.675052 4917 generic.go:334] "Generic (PLEG): container finished" podID="8cfa3930-b29b-4512-8ef0-8cd47dad802c" containerID="b3c64dee29a1077b817560911e670c63303d298e2f0f9abf9ff70e07f3a922e2" exitCode=0 Mar 18 08:50:02 crc kubenswrapper[4917]: I0318 08:50:02.675321 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563730-lrq66" event={"ID":"8cfa3930-b29b-4512-8ef0-8cd47dad802c","Type":"ContainerDied","Data":"b3c64dee29a1077b817560911e670c63303d298e2f0f9abf9ff70e07f3a922e2"} Mar 18 08:50:03 crc kubenswrapper[4917]: I0318 08:50:03.749196 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:50:03 crc kubenswrapper[4917]: I0318 08:50:03.823816 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:50:04 crc kubenswrapper[4917]: I0318 08:50:04.004996 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6hn2q"] Mar 18 08:50:04 crc kubenswrapper[4917]: I0318 08:50:04.075324 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563730-lrq66" Mar 18 08:50:04 crc kubenswrapper[4917]: I0318 08:50:04.233438 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89fm2\" (UniqueName: \"kubernetes.io/projected/8cfa3930-b29b-4512-8ef0-8cd47dad802c-kube-api-access-89fm2\") pod \"8cfa3930-b29b-4512-8ef0-8cd47dad802c\" (UID: \"8cfa3930-b29b-4512-8ef0-8cd47dad802c\") " Mar 18 08:50:04 crc kubenswrapper[4917]: I0318 08:50:04.238876 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cfa3930-b29b-4512-8ef0-8cd47dad802c-kube-api-access-89fm2" (OuterVolumeSpecName: "kube-api-access-89fm2") pod "8cfa3930-b29b-4512-8ef0-8cd47dad802c" (UID: "8cfa3930-b29b-4512-8ef0-8cd47dad802c"). InnerVolumeSpecName "kube-api-access-89fm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:50:04 crc kubenswrapper[4917]: I0318 08:50:04.337319 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89fm2\" (UniqueName: \"kubernetes.io/projected/8cfa3930-b29b-4512-8ef0-8cd47dad802c-kube-api-access-89fm2\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:04 crc kubenswrapper[4917]: I0318 08:50:04.701553 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563730-lrq66" event={"ID":"8cfa3930-b29b-4512-8ef0-8cd47dad802c","Type":"ContainerDied","Data":"4dc06bc148aa04b915a41192600a6c5a877d4b6e598efcd43366c58a34232518"} Mar 18 08:50:04 crc kubenswrapper[4917]: I0318 08:50:04.701643 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dc06bc148aa04b915a41192600a6c5a877d4b6e598efcd43366c58a34232518" Mar 18 08:50:04 crc kubenswrapper[4917]: I0318 08:50:04.701599 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563730-lrq66" Mar 18 08:50:05 crc kubenswrapper[4917]: I0318 08:50:05.177488 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563724-vw87b"] Mar 18 08:50:05 crc kubenswrapper[4917]: I0318 08:50:05.188314 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563724-vw87b"] Mar 18 08:50:05 crc kubenswrapper[4917]: I0318 08:50:05.710229 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6hn2q" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="registry-server" containerID="cri-o://d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6" gracePeriod=2 Mar 18 08:50:05 crc kubenswrapper[4917]: I0318 08:50:05.795739 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a213707-ba5a-4473-8685-0933a627c94e" 
path="/var/lib/kubelet/pods/5a213707-ba5a-4473-8685-0933a627c94e/volumes" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.186690 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.383297 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn9vx\" (UniqueName: \"kubernetes.io/projected/52364ec0-1480-4069-addf-ec1d6e6e02fb-kube-api-access-qn9vx\") pod \"52364ec0-1480-4069-addf-ec1d6e6e02fb\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.383510 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-catalog-content\") pod \"52364ec0-1480-4069-addf-ec1d6e6e02fb\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.383730 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-utilities\") pod \"52364ec0-1480-4069-addf-ec1d6e6e02fb\" (UID: \"52364ec0-1480-4069-addf-ec1d6e6e02fb\") " Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.384841 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-utilities" (OuterVolumeSpecName: "utilities") pod "52364ec0-1480-4069-addf-ec1d6e6e02fb" (UID: "52364ec0-1480-4069-addf-ec1d6e6e02fb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.391188 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52364ec0-1480-4069-addf-ec1d6e6e02fb-kube-api-access-qn9vx" (OuterVolumeSpecName: "kube-api-access-qn9vx") pod "52364ec0-1480-4069-addf-ec1d6e6e02fb" (UID: "52364ec0-1480-4069-addf-ec1d6e6e02fb"). InnerVolumeSpecName "kube-api-access-qn9vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.486179 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.486224 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn9vx\" (UniqueName: \"kubernetes.io/projected/52364ec0-1480-4069-addf-ec1d6e6e02fb-kube-api-access-qn9vx\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.515009 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52364ec0-1480-4069-addf-ec1d6e6e02fb" (UID: "52364ec0-1480-4069-addf-ec1d6e6e02fb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.588309 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52364ec0-1480-4069-addf-ec1d6e6e02fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.729909 4917 generic.go:334] "Generic (PLEG): container finished" podID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerID="d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6" exitCode=0 Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.729987 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hn2q" event={"ID":"52364ec0-1480-4069-addf-ec1d6e6e02fb","Type":"ContainerDied","Data":"d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6"} Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.730066 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6hn2q" event={"ID":"52364ec0-1480-4069-addf-ec1d6e6e02fb","Type":"ContainerDied","Data":"bfa973e523acba7dc46216ea0d22309d1d1193175ab0348635f98fb14b70c744"} Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.730100 4917 scope.go:117] "RemoveContainer" containerID="d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.730363 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6hn2q" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.795953 4917 scope.go:117] "RemoveContainer" containerID="091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.798633 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6hn2q"] Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.808986 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6hn2q"] Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.835018 4917 scope.go:117] "RemoveContainer" containerID="4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.915933 4917 scope.go:117] "RemoveContainer" containerID="d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6" Mar 18 08:50:06 crc kubenswrapper[4917]: E0318 08:50:06.916706 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6\": container with ID starting with d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6 not found: ID does not exist" containerID="d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.916914 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6"} err="failed to get container status \"d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6\": rpc error: code = NotFound desc = could not find container \"d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6\": container with ID starting with d5a62a58e9d8250b9dec3a05269c48cdf7bc49c4b564c1d3f133501b082f5bd6 not found: ID does 
not exist" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.917081 4917 scope.go:117] "RemoveContainer" containerID="091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3" Mar 18 08:50:06 crc kubenswrapper[4917]: E0318 08:50:06.917677 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3\": container with ID starting with 091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3 not found: ID does not exist" containerID="091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.917711 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3"} err="failed to get container status \"091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3\": rpc error: code = NotFound desc = could not find container \"091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3\": container with ID starting with 091e3d616c11664757c7d18dbfc9b265b58c6fe3a6d5cd77b999372c8e234eb3 not found: ID does not exist" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.917737 4917 scope.go:117] "RemoveContainer" containerID="4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00" Mar 18 08:50:06 crc kubenswrapper[4917]: E0318 08:50:06.918138 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00\": container with ID starting with 4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00 not found: ID does not exist" containerID="4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00" Mar 18 08:50:06 crc kubenswrapper[4917]: I0318 08:50:06.918177 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00"} err="failed to get container status \"4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00\": rpc error: code = NotFound desc = could not find container \"4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00\": container with ID starting with 4b6799fcdfd4d48654636eef512a541c0c0be1c78bcebd32a991a706cb883a00 not found: ID does not exist" Mar 18 08:50:07 crc kubenswrapper[4917]: I0318 08:50:07.785730 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" path="/var/lib/kubelet/pods/52364ec0-1480-4069-addf-ec1d6e6e02fb/volumes" Mar 18 08:50:14 crc kubenswrapper[4917]: I0318 08:50:14.934234 4917 scope.go:117] "RemoveContainer" containerID="1bdfe4c035b7d1e669065459ffe0b5a9ff738645c2ebc4cf3b66e447e17911ee" Mar 18 08:50:14 crc kubenswrapper[4917]: I0318 08:50:14.977259 4917 scope.go:117] "RemoveContainer" containerID="b27ece708d445945e694e19352002c1f6b5f17ea9868dc91ff69fa2ed45e5d41" Mar 18 08:50:15 crc kubenswrapper[4917]: I0318 08:50:15.028047 4917 scope.go:117] "RemoveContainer" containerID="7fb4c4867e57d1b9965e0b6e278fa80b2365895eab8173182cdde631e4152208" Mar 18 08:50:15 crc kubenswrapper[4917]: I0318 08:50:15.063489 4917 scope.go:117] "RemoveContainer" containerID="951bb3483c1b723c7b3a395cbd51d6abb45ec3bee28c6cb3277a077d23079d9b" Mar 18 08:50:24 crc kubenswrapper[4917]: I0318 08:50:24.941233 4917 generic.go:334] "Generic (PLEG): container finished" podID="717aeb26-970e-480b-95d1-d514151ca1ac" containerID="5d9e5dbd3261ecb1b41cc9d22482636b88ac384cebd78e7e7d1354d643a20354" exitCode=0 Mar 18 08:50:24 crc kubenswrapper[4917]: I0318 08:50:24.941825 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7ff47" 
event={"ID":"717aeb26-970e-480b-95d1-d514151ca1ac","Type":"ContainerDied","Data":"5d9e5dbd3261ecb1b41cc9d22482636b88ac384cebd78e7e7d1354d643a20354"} Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.498117 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.676757 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/717aeb26-970e-480b-95d1-d514151ca1ac-ovncontroller-config-0\") pod \"717aeb26-970e-480b-95d1-d514151ca1ac\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.676842 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvsz2\" (UniqueName: \"kubernetes.io/projected/717aeb26-970e-480b-95d1-d514151ca1ac-kube-api-access-pvsz2\") pod \"717aeb26-970e-480b-95d1-d514151ca1ac\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.676960 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ssh-key-openstack-cell1\") pod \"717aeb26-970e-480b-95d1-d514151ca1ac\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.677021 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-inventory\") pod \"717aeb26-970e-480b-95d1-d514151ca1ac\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.677206 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ovn-combined-ca-bundle\") pod \"717aeb26-970e-480b-95d1-d514151ca1ac\" (UID: \"717aeb26-970e-480b-95d1-d514151ca1ac\") " Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.683333 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "717aeb26-970e-480b-95d1-d514151ca1ac" (UID: "717aeb26-970e-480b-95d1-d514151ca1ac"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.684120 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717aeb26-970e-480b-95d1-d514151ca1ac-kube-api-access-pvsz2" (OuterVolumeSpecName: "kube-api-access-pvsz2") pod "717aeb26-970e-480b-95d1-d514151ca1ac" (UID: "717aeb26-970e-480b-95d1-d514151ca1ac"). InnerVolumeSpecName "kube-api-access-pvsz2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.724131 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-inventory" (OuterVolumeSpecName: "inventory") pod "717aeb26-970e-480b-95d1-d514151ca1ac" (UID: "717aeb26-970e-480b-95d1-d514151ca1ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.737746 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/717aeb26-970e-480b-95d1-d514151ca1ac-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "717aeb26-970e-480b-95d1-d514151ca1ac" (UID: "717aeb26-970e-480b-95d1-d514151ca1ac"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.755544 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "717aeb26-970e-480b-95d1-d514151ca1ac" (UID: "717aeb26-970e-480b-95d1-d514151ca1ac"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.793928 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.793982 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.793996 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/717aeb26-970e-480b-95d1-d514151ca1ac-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.794009 4917 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/717aeb26-970e-480b-95d1-d514151ca1ac-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.794021 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvsz2\" (UniqueName: \"kubernetes.io/projected/717aeb26-970e-480b-95d1-d514151ca1ac-kube-api-access-pvsz2\") on node \"crc\" DevicePath \"\"" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.968803 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-7ff47" event={"ID":"717aeb26-970e-480b-95d1-d514151ca1ac","Type":"ContainerDied","Data":"62035c9d06cc8d9107f259ea44af7a54ef30fd190b4f4a2d1856417a71f0cbd2"} Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.968869 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62035c9d06cc8d9107f259ea44af7a54ef30fd190b4f4a2d1856417a71f0cbd2" Mar 18 08:50:26 crc kubenswrapper[4917]: I0318 08:50:26.968875 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-7ff47" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.068601 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-6c4fv"] Mar 18 08:50:27 crc kubenswrapper[4917]: E0318 08:50:27.069056 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="registry-server" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.069073 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="registry-server" Mar 18 08:50:27 crc kubenswrapper[4917]: E0318 08:50:27.069101 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="extract-content" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.069109 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="extract-content" Mar 18 08:50:27 crc kubenswrapper[4917]: E0318 08:50:27.069123 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="extract-utilities" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.069130 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" 
containerName="extract-utilities" Mar 18 08:50:27 crc kubenswrapper[4917]: E0318 08:50:27.069143 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cfa3930-b29b-4512-8ef0-8cd47dad802c" containerName="oc" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.069148 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfa3930-b29b-4512-8ef0-8cd47dad802c" containerName="oc" Mar 18 08:50:27 crc kubenswrapper[4917]: E0318 08:50:27.069168 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717aeb26-970e-480b-95d1-d514151ca1ac" containerName="ovn-openstack-openstack-cell1" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.069174 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="717aeb26-970e-480b-95d1-d514151ca1ac" containerName="ovn-openstack-openstack-cell1" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.069344 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="52364ec0-1480-4069-addf-ec1d6e6e02fb" containerName="registry-server" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.069357 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cfa3930-b29b-4512-8ef0-8cd47dad802c" containerName="oc" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.069378 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="717aeb26-970e-480b-95d1-d514151ca1ac" containerName="ovn-openstack-openstack-cell1" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.070113 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.072218 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.072726 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.072795 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.073021 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.073224 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.074041 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.098935 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.099142 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.099277 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.099431 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.099530 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78fv\" (UniqueName: \"kubernetes.io/projected/c3deb9d3-e372-47e2-9096-2e30c4aa547f-kube-api-access-g78fv\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.099554 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.107937 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-6c4fv"] Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.201087 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.201179 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.201240 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.201279 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g78fv\" (UniqueName: \"kubernetes.io/projected/c3deb9d3-e372-47e2-9096-2e30c4aa547f-kube-api-access-g78fv\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.201300 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.201378 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.205437 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.205702 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 
08:50:27.207154 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.207424 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.210194 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.220034 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78fv\" (UniqueName: \"kubernetes.io/projected/c3deb9d3-e372-47e2-9096-2e30c4aa547f-kube-api-access-g78fv\") pod \"neutron-metadata-openstack-openstack-cell1-6c4fv\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.395191 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.790028 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-6c4fv"] Mar 18 08:50:27 crc kubenswrapper[4917]: I0318 08:50:27.980680 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" event={"ID":"c3deb9d3-e372-47e2-9096-2e30c4aa547f","Type":"ContainerStarted","Data":"235bbe04c3fbdce58696a24890c6a186ed2c38ab90141ed137d94de171665723"} Mar 18 08:50:28 crc kubenswrapper[4917]: I0318 08:50:28.991901 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" event={"ID":"c3deb9d3-e372-47e2-9096-2e30c4aa547f","Type":"ContainerStarted","Data":"38b15ae3227c5b8e46e258c754c5ccd3c0458ca76602dfcc4cca15a2e140563b"} Mar 18 08:50:29 crc kubenswrapper[4917]: I0318 08:50:29.020053 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" podStartSLOduration=1.533581296 podStartE2EDuration="2.020033217s" podCreationTimestamp="2026-03-18 08:50:27 +0000 UTC" firstStartedPulling="2026-03-18 08:50:27.797253829 +0000 UTC m=+7412.738408543" lastFinishedPulling="2026-03-18 08:50:28.28370574 +0000 UTC m=+7413.224860464" observedRunningTime="2026-03-18 08:50:29.012442322 +0000 UTC m=+7413.953597026" watchObservedRunningTime="2026-03-18 08:50:29.020033217 +0000 UTC m=+7413.961187931" Mar 18 08:51:20 crc kubenswrapper[4917]: I0318 08:51:20.608382 4917 generic.go:334] "Generic (PLEG): container finished" podID="c3deb9d3-e372-47e2-9096-2e30c4aa547f" containerID="38b15ae3227c5b8e46e258c754c5ccd3c0458ca76602dfcc4cca15a2e140563b" exitCode=0 Mar 18 08:51:20 crc kubenswrapper[4917]: I0318 08:51:20.608546 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" event={"ID":"c3deb9d3-e372-47e2-9096-2e30c4aa547f","Type":"ContainerDied","Data":"38b15ae3227c5b8e46e258c754c5ccd3c0458ca76602dfcc4cca15a2e140563b"} Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.110782 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.190665 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-nova-metadata-neutron-config-0\") pod \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.190750 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.190773 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g78fv\" (UniqueName: \"kubernetes.io/projected/c3deb9d3-e372-47e2-9096-2e30c4aa547f-kube-api-access-g78fv\") pod \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.190813 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-inventory\") pod \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.191024 4917 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-ssh-key-openstack-cell1\") pod \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.191079 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-metadata-combined-ca-bundle\") pod \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\" (UID: \"c3deb9d3-e372-47e2-9096-2e30c4aa547f\") " Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.197030 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3deb9d3-e372-47e2-9096-2e30c4aa547f-kube-api-access-g78fv" (OuterVolumeSpecName: "kube-api-access-g78fv") pod "c3deb9d3-e372-47e2-9096-2e30c4aa547f" (UID: "c3deb9d3-e372-47e2-9096-2e30c4aa547f"). InnerVolumeSpecName "kube-api-access-g78fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.198724 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c3deb9d3-e372-47e2-9096-2e30c4aa547f" (UID: "c3deb9d3-e372-47e2-9096-2e30c4aa547f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.218882 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "c3deb9d3-e372-47e2-9096-2e30c4aa547f" (UID: "c3deb9d3-e372-47e2-9096-2e30c4aa547f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.230171 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c3deb9d3-e372-47e2-9096-2e30c4aa547f" (UID: "c3deb9d3-e372-47e2-9096-2e30c4aa547f"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.230825 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-inventory" (OuterVolumeSpecName: "inventory") pod "c3deb9d3-e372-47e2-9096-2e30c4aa547f" (UID: "c3deb9d3-e372-47e2-9096-2e30c4aa547f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.239020 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c3deb9d3-e372-47e2-9096-2e30c4aa547f" (UID: "c3deb9d3-e372-47e2-9096-2e30c4aa547f"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.293477 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g78fv\" (UniqueName: \"kubernetes.io/projected/c3deb9d3-e372-47e2-9096-2e30c4aa547f-kube-api-access-g78fv\") on node \"crc\" DevicePath \"\"" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.293515 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.293527 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.293537 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.293547 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.293557 4917 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c3deb9d3-e372-47e2-9096-2e30c4aa547f-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.633456 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" 
event={"ID":"c3deb9d3-e372-47e2-9096-2e30c4aa547f","Type":"ContainerDied","Data":"235bbe04c3fbdce58696a24890c6a186ed2c38ab90141ed137d94de171665723"} Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.633898 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="235bbe04c3fbdce58696a24890c6a186ed2c38ab90141ed137d94de171665723" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.633546 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-6c4fv" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.758833 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f7mjl"] Mar 18 08:51:22 crc kubenswrapper[4917]: E0318 08:51:22.759400 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3deb9d3-e372-47e2-9096-2e30c4aa547f" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.759425 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3deb9d3-e372-47e2-9096-2e30c4aa547f" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.759716 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3deb9d3-e372-47e2-9096-2e30c4aa547f" containerName="neutron-metadata-openstack-openstack-cell1" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.760638 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.763845 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.764055 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.764176 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.764450 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.766045 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.771618 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f7mjl"] Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.905436 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.905580 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 
08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.905673 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-inventory\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.905758 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v675q\" (UniqueName: \"kubernetes.io/projected/b993b322-7111-4544-9def-988363263914-kube-api-access-v675q\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:22 crc kubenswrapper[4917]: I0318 08:51:22.906076 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.009333 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.009505 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-ssh-key-openstack-cell1\") pod 
\"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.009552 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-inventory\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.009651 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v675q\" (UniqueName: \"kubernetes.io/projected/b993b322-7111-4544-9def-988363263914-kube-api-access-v675q\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.009711 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.014873 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.015063 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.016233 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.017470 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-inventory\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.046704 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v675q\" (UniqueName: \"kubernetes.io/projected/b993b322-7111-4544-9def-988363263914-kube-api-access-v675q\") pod \"libvirt-openstack-openstack-cell1-f7mjl\" (UID: \"b993b322-7111-4544-9def-988363263914\") " pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.089734 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:51:23 crc kubenswrapper[4917]: I0318 08:51:23.801852 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f7mjl"] Mar 18 08:51:23 crc kubenswrapper[4917]: W0318 08:51:23.812525 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb993b322_7111_4544_9def_988363263914.slice/crio-072d7a14a5f02de16f0df34f96352d10a036a546b1a6b6bdf8a5b7967b615f00 WatchSource:0}: Error finding container 072d7a14a5f02de16f0df34f96352d10a036a546b1a6b6bdf8a5b7967b615f00: Status 404 returned error can't find the container with id 072d7a14a5f02de16f0df34f96352d10a036a546b1a6b6bdf8a5b7967b615f00 Mar 18 08:51:24 crc kubenswrapper[4917]: I0318 08:51:24.654449 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" event={"ID":"b993b322-7111-4544-9def-988363263914","Type":"ContainerStarted","Data":"5e526f156aefe260f590f8583370e9f000b638b54f5858cd89bdc35b751fad5f"} Mar 18 08:51:24 crc kubenswrapper[4917]: I0318 08:51:24.654731 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" event={"ID":"b993b322-7111-4544-9def-988363263914","Type":"ContainerStarted","Data":"072d7a14a5f02de16f0df34f96352d10a036a546b1a6b6bdf8a5b7967b615f00"} Mar 18 08:51:24 crc kubenswrapper[4917]: I0318 08:51:24.682665 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" podStartSLOduration=2.280507103 podStartE2EDuration="2.68260972s" podCreationTimestamp="2026-03-18 08:51:22 +0000 UTC" firstStartedPulling="2026-03-18 08:51:23.831434357 +0000 UTC m=+7468.772589071" lastFinishedPulling="2026-03-18 08:51:24.233536974 +0000 UTC m=+7469.174691688" observedRunningTime="2026-03-18 08:51:24.674306298 +0000 UTC m=+7469.615461012" 
watchObservedRunningTime="2026-03-18 08:51:24.68260972 +0000 UTC m=+7469.623764444" Mar 18 08:51:32 crc kubenswrapper[4917]: I0318 08:51:32.929158 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:51:32 crc kubenswrapper[4917]: I0318 08:51:32.929847 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.194126 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563732-xrclx"] Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.197394 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563732-xrclx" Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.199923 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.200133 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.200374 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.217053 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563732-xrclx"] Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.261150 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2hr\" (UniqueName: \"kubernetes.io/projected/94147ec3-b2d8-4a9c-8f4f-3951816f88b1-kube-api-access-rp2hr\") pod \"auto-csr-approver-29563732-xrclx\" (UID: \"94147ec3-b2d8-4a9c-8f4f-3951816f88b1\") " pod="openshift-infra/auto-csr-approver-29563732-xrclx" Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.363890 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2hr\" (UniqueName: \"kubernetes.io/projected/94147ec3-b2d8-4a9c-8f4f-3951816f88b1-kube-api-access-rp2hr\") pod \"auto-csr-approver-29563732-xrclx\" (UID: \"94147ec3-b2d8-4a9c-8f4f-3951816f88b1\") " pod="openshift-infra/auto-csr-approver-29563732-xrclx" Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.389298 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2hr\" (UniqueName: \"kubernetes.io/projected/94147ec3-b2d8-4a9c-8f4f-3951816f88b1-kube-api-access-rp2hr\") pod \"auto-csr-approver-29563732-xrclx\" (UID: \"94147ec3-b2d8-4a9c-8f4f-3951816f88b1\") " 
pod="openshift-infra/auto-csr-approver-29563732-xrclx" Mar 18 08:52:00 crc kubenswrapper[4917]: I0318 08:52:00.523668 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563732-xrclx" Mar 18 08:52:01 crc kubenswrapper[4917]: I0318 08:52:01.009947 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563732-xrclx"] Mar 18 08:52:01 crc kubenswrapper[4917]: I0318 08:52:01.068939 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563732-xrclx" event={"ID":"94147ec3-b2d8-4a9c-8f4f-3951816f88b1","Type":"ContainerStarted","Data":"4b4deae1b65401e4c3ff5a8c5887e1fc3df129052e28fbb102a82ac7e8dac6b3"} Mar 18 08:52:02 crc kubenswrapper[4917]: I0318 08:52:02.928596 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:52:02 crc kubenswrapper[4917]: I0318 08:52:02.928915 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:52:03 crc kubenswrapper[4917]: I0318 08:52:03.095894 4917 generic.go:334] "Generic (PLEG): container finished" podID="94147ec3-b2d8-4a9c-8f4f-3951816f88b1" containerID="e0e17de92ae998bf6224aeb6ec107acb0c61d7849c7395cf7b839059baf5bfec" exitCode=0 Mar 18 08:52:03 crc kubenswrapper[4917]: I0318 08:52:03.095949 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563732-xrclx" 
event={"ID":"94147ec3-b2d8-4a9c-8f4f-3951816f88b1","Type":"ContainerDied","Data":"e0e17de92ae998bf6224aeb6ec107acb0c61d7849c7395cf7b839059baf5bfec"} Mar 18 08:52:04 crc kubenswrapper[4917]: I0318 08:52:04.512110 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563732-xrclx" Mar 18 08:52:04 crc kubenswrapper[4917]: I0318 08:52:04.673994 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp2hr\" (UniqueName: \"kubernetes.io/projected/94147ec3-b2d8-4a9c-8f4f-3951816f88b1-kube-api-access-rp2hr\") pod \"94147ec3-b2d8-4a9c-8f4f-3951816f88b1\" (UID: \"94147ec3-b2d8-4a9c-8f4f-3951816f88b1\") " Mar 18 08:52:04 crc kubenswrapper[4917]: I0318 08:52:04.687624 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94147ec3-b2d8-4a9c-8f4f-3951816f88b1-kube-api-access-rp2hr" (OuterVolumeSpecName: "kube-api-access-rp2hr") pod "94147ec3-b2d8-4a9c-8f4f-3951816f88b1" (UID: "94147ec3-b2d8-4a9c-8f4f-3951816f88b1"). InnerVolumeSpecName "kube-api-access-rp2hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:52:04 crc kubenswrapper[4917]: I0318 08:52:04.776553 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp2hr\" (UniqueName: \"kubernetes.io/projected/94147ec3-b2d8-4a9c-8f4f-3951816f88b1-kube-api-access-rp2hr\") on node \"crc\" DevicePath \"\"" Mar 18 08:52:05 crc kubenswrapper[4917]: I0318 08:52:05.119387 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563732-xrclx" event={"ID":"94147ec3-b2d8-4a9c-8f4f-3951816f88b1","Type":"ContainerDied","Data":"4b4deae1b65401e4c3ff5a8c5887e1fc3df129052e28fbb102a82ac7e8dac6b3"} Mar 18 08:52:05 crc kubenswrapper[4917]: I0318 08:52:05.119842 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b4deae1b65401e4c3ff5a8c5887e1fc3df129052e28fbb102a82ac7e8dac6b3" Mar 18 08:52:05 crc kubenswrapper[4917]: I0318 08:52:05.119458 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563732-xrclx" Mar 18 08:52:05 crc kubenswrapper[4917]: I0318 08:52:05.627673 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563726-28h2f"] Mar 18 08:52:05 crc kubenswrapper[4917]: I0318 08:52:05.641159 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563726-28h2f"] Mar 18 08:52:05 crc kubenswrapper[4917]: I0318 08:52:05.786364 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b71d0fd7-116c-484e-9045-8f69d6907a6c" path="/var/lib/kubelet/pods/b71d0fd7-116c-484e-9045-8f69d6907a6c/volumes" Mar 18 08:52:15 crc kubenswrapper[4917]: I0318 08:52:15.217716 4917 scope.go:117] "RemoveContainer" containerID="56e281aae2ad6cc106e8b204dc7eaac0a818a88dede8de091f60f8d64b3d320c" Mar 18 08:52:32 crc kubenswrapper[4917]: I0318 08:52:32.928816 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 08:52:32 crc kubenswrapper[4917]: I0318 08:52:32.929378 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 08:52:32 crc kubenswrapper[4917]: I0318 08:52:32.929426 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 08:52:32 crc kubenswrapper[4917]: I0318 08:52:32.930158 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 08:52:32 crc kubenswrapper[4917]: I0318 08:52:32.930223 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" gracePeriod=600 Mar 18 08:52:33 crc kubenswrapper[4917]: E0318 08:52:33.079126 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:52:33 crc kubenswrapper[4917]: I0318 08:52:33.423349 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" exitCode=0 Mar 18 08:52:33 crc kubenswrapper[4917]: I0318 08:52:33.423406 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369"} Mar 18 08:52:33 crc kubenswrapper[4917]: I0318 08:52:33.423454 4917 scope.go:117] "RemoveContainer" containerID="004af87b0589146355065aa18f9d5186b2677b48811e1c447f8ebdd857aa020a" Mar 18 08:52:33 crc kubenswrapper[4917]: I0318 08:52:33.424152 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:52:33 crc kubenswrapper[4917]: E0318 08:52:33.424461 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:52:46 crc kubenswrapper[4917]: I0318 08:52:46.773509 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:52:46 crc kubenswrapper[4917]: E0318 08:52:46.774450 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:52:58 crc kubenswrapper[4917]: I0318 08:52:58.772668 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:52:58 crc kubenswrapper[4917]: E0318 08:52:58.773430 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:53:12 crc kubenswrapper[4917]: I0318 08:53:12.773110 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:53:12 crc kubenswrapper[4917]: E0318 08:53:12.774370 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:53:23 crc kubenswrapper[4917]: I0318 08:53:23.772903 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:53:23 crc kubenswrapper[4917]: E0318 08:53:23.774040 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:53:34 crc kubenswrapper[4917]: I0318 08:53:34.773176 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:53:34 crc kubenswrapper[4917]: E0318 08:53:34.774286 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:53:48 crc kubenswrapper[4917]: I0318 08:53:48.772432 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:53:48 crc kubenswrapper[4917]: E0318 08:53:48.773251 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:53:59 crc kubenswrapper[4917]: I0318 08:53:59.772829 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:53:59 crc kubenswrapper[4917]: E0318 08:53:59.774493 4917 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.153029 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563734-vnh84"] Mar 18 08:54:00 crc kubenswrapper[4917]: E0318 08:54:00.153613 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94147ec3-b2d8-4a9c-8f4f-3951816f88b1" containerName="oc" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.153636 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="94147ec3-b2d8-4a9c-8f4f-3951816f88b1" containerName="oc" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.153897 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="94147ec3-b2d8-4a9c-8f4f-3951816f88b1" containerName="oc" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.154773 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563734-vnh84" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.157933 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.158054 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.158444 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.167923 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563734-vnh84"] Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.315565 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp48x\" (UniqueName: \"kubernetes.io/projected/2614c412-c2de-4410-bf1e-a6b01eecaaff-kube-api-access-kp48x\") pod \"auto-csr-approver-29563734-vnh84\" (UID: \"2614c412-c2de-4410-bf1e-a6b01eecaaff\") " pod="openshift-infra/auto-csr-approver-29563734-vnh84" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.417416 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp48x\" (UniqueName: \"kubernetes.io/projected/2614c412-c2de-4410-bf1e-a6b01eecaaff-kube-api-access-kp48x\") pod \"auto-csr-approver-29563734-vnh84\" (UID: \"2614c412-c2de-4410-bf1e-a6b01eecaaff\") " pod="openshift-infra/auto-csr-approver-29563734-vnh84" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.442741 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp48x\" (UniqueName: \"kubernetes.io/projected/2614c412-c2de-4410-bf1e-a6b01eecaaff-kube-api-access-kp48x\") pod \"auto-csr-approver-29563734-vnh84\" (UID: \"2614c412-c2de-4410-bf1e-a6b01eecaaff\") " 
pod="openshift-infra/auto-csr-approver-29563734-vnh84" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.489069 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563734-vnh84" Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.980836 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563734-vnh84"] Mar 18 08:54:00 crc kubenswrapper[4917]: I0318 08:54:00.996331 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:54:01 crc kubenswrapper[4917]: I0318 08:54:01.447072 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563734-vnh84" event={"ID":"2614c412-c2de-4410-bf1e-a6b01eecaaff","Type":"ContainerStarted","Data":"1b02287248e8b9abfdb87be80e21372f561df6f4c7ceeed37ec5c59b50442001"} Mar 18 08:54:02 crc kubenswrapper[4917]: I0318 08:54:02.458298 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563734-vnh84" event={"ID":"2614c412-c2de-4410-bf1e-a6b01eecaaff","Type":"ContainerStarted","Data":"0debe370d574331b34a0547cd908f07e50c5fcd819a01116bcfdf837737af576"} Mar 18 08:54:02 crc kubenswrapper[4917]: I0318 08:54:02.485024 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563734-vnh84" podStartSLOduration=1.477078644 podStartE2EDuration="2.48500281s" podCreationTimestamp="2026-03-18 08:54:00 +0000 UTC" firstStartedPulling="2026-03-18 08:54:00.996030183 +0000 UTC m=+7625.937184897" lastFinishedPulling="2026-03-18 08:54:02.003954339 +0000 UTC m=+7626.945109063" observedRunningTime="2026-03-18 08:54:02.476536204 +0000 UTC m=+7627.417690918" watchObservedRunningTime="2026-03-18 08:54:02.48500281 +0000 UTC m=+7627.426157534" Mar 18 08:54:03 crc kubenswrapper[4917]: I0318 08:54:03.469141 4917 generic.go:334] "Generic (PLEG): container finished" 
podID="2614c412-c2de-4410-bf1e-a6b01eecaaff" containerID="0debe370d574331b34a0547cd908f07e50c5fcd819a01116bcfdf837737af576" exitCode=0 Mar 18 08:54:03 crc kubenswrapper[4917]: I0318 08:54:03.469194 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563734-vnh84" event={"ID":"2614c412-c2de-4410-bf1e-a6b01eecaaff","Type":"ContainerDied","Data":"0debe370d574331b34a0547cd908f07e50c5fcd819a01116bcfdf837737af576"} Mar 18 08:54:04 crc kubenswrapper[4917]: I0318 08:54:04.954597 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563734-vnh84" Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.131536 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp48x\" (UniqueName: \"kubernetes.io/projected/2614c412-c2de-4410-bf1e-a6b01eecaaff-kube-api-access-kp48x\") pod \"2614c412-c2de-4410-bf1e-a6b01eecaaff\" (UID: \"2614c412-c2de-4410-bf1e-a6b01eecaaff\") " Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.167304 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2614c412-c2de-4410-bf1e-a6b01eecaaff-kube-api-access-kp48x" (OuterVolumeSpecName: "kube-api-access-kp48x") pod "2614c412-c2de-4410-bf1e-a6b01eecaaff" (UID: "2614c412-c2de-4410-bf1e-a6b01eecaaff"). InnerVolumeSpecName "kube-api-access-kp48x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.235299 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp48x\" (UniqueName: \"kubernetes.io/projected/2614c412-c2de-4410-bf1e-a6b01eecaaff-kube-api-access-kp48x\") on node \"crc\" DevicePath \"\"" Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.492818 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563734-vnh84" event={"ID":"2614c412-c2de-4410-bf1e-a6b01eecaaff","Type":"ContainerDied","Data":"1b02287248e8b9abfdb87be80e21372f561df6f4c7ceeed37ec5c59b50442001"} Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.492876 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563734-vnh84" Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.492885 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b02287248e8b9abfdb87be80e21372f561df6f4c7ceeed37ec5c59b50442001" Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.570733 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563728-jfwvp"] Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.586422 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563728-jfwvp"] Mar 18 08:54:05 crc kubenswrapper[4917]: I0318 08:54:05.790543 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1942ea5e-436f-4396-a0e9-a864c699b2b7" path="/var/lib/kubelet/pods/1942ea5e-436f-4396-a0e9-a864c699b2b7/volumes" Mar 18 08:54:13 crc kubenswrapper[4917]: I0318 08:54:13.773535 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:54:13 crc kubenswrapper[4917]: E0318 08:54:13.774713 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:54:15 crc kubenswrapper[4917]: I0318 08:54:15.349118 4917 scope.go:117] "RemoveContainer" containerID="8c5dab954c1253d6a6388b22ad220d9cb4d7cd39e04d77c84f4d70ad4280c0c4" Mar 18 08:54:24 crc kubenswrapper[4917]: I0318 08:54:24.773299 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:54:24 crc kubenswrapper[4917]: E0318 08:54:24.774425 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:54:27 crc kubenswrapper[4917]: I0318 08:54:27.930666 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w2jrt"] Mar 18 08:54:27 crc kubenswrapper[4917]: E0318 08:54:27.931524 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2614c412-c2de-4410-bf1e-a6b01eecaaff" containerName="oc" Mar 18 08:54:27 crc kubenswrapper[4917]: I0318 08:54:27.931541 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2614c412-c2de-4410-bf1e-a6b01eecaaff" containerName="oc" Mar 18 08:54:27 crc kubenswrapper[4917]: I0318 08:54:27.931783 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2614c412-c2de-4410-bf1e-a6b01eecaaff" containerName="oc" Mar 18 08:54:27 crc kubenswrapper[4917]: I0318 
08:54:27.933778 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:27 crc kubenswrapper[4917]: I0318 08:54:27.954544 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2jrt"] Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.040952 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4dxn\" (UniqueName: \"kubernetes.io/projected/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-kube-api-access-w4dxn\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.041004 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-catalog-content\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.041040 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-utilities\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.142616 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4dxn\" (UniqueName: \"kubernetes.io/projected/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-kube-api-access-w4dxn\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc 
kubenswrapper[4917]: I0318 08:54:28.142668 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-catalog-content\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.142697 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-utilities\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.143375 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-catalog-content\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.143400 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-utilities\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.162476 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4dxn\" (UniqueName: \"kubernetes.io/projected/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-kube-api-access-w4dxn\") pod \"certified-operators-w2jrt\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.261124 4917 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:28 crc kubenswrapper[4917]: I0318 08:54:28.763829 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2jrt"] Mar 18 08:54:29 crc kubenswrapper[4917]: I0318 08:54:29.780825 4917 generic.go:334] "Generic (PLEG): container finished" podID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerID="3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5" exitCode=0 Mar 18 08:54:29 crc kubenswrapper[4917]: I0318 08:54:29.784884 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jrt" event={"ID":"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc","Type":"ContainerDied","Data":"3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5"} Mar 18 08:54:29 crc kubenswrapper[4917]: I0318 08:54:29.784935 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jrt" event={"ID":"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc","Type":"ContainerStarted","Data":"a994ab5c3c2c540b12fcf129c0cf998a5bc48524f05b20487a50bfdfd6c81588"} Mar 18 08:54:30 crc kubenswrapper[4917]: I0318 08:54:30.797031 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jrt" event={"ID":"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc","Type":"ContainerStarted","Data":"853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5"} Mar 18 08:54:32 crc kubenswrapper[4917]: I0318 08:54:32.815927 4917 generic.go:334] "Generic (PLEG): container finished" podID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerID="853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5" exitCode=0 Mar 18 08:54:32 crc kubenswrapper[4917]: I0318 08:54:32.816043 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jrt" 
event={"ID":"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc","Type":"ContainerDied","Data":"853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5"} Mar 18 08:54:33 crc kubenswrapper[4917]: I0318 08:54:33.827074 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jrt" event={"ID":"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc","Type":"ContainerStarted","Data":"af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf"} Mar 18 08:54:33 crc kubenswrapper[4917]: I0318 08:54:33.851672 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w2jrt" podStartSLOduration=3.4474248 podStartE2EDuration="6.851650918s" podCreationTimestamp="2026-03-18 08:54:27 +0000 UTC" firstStartedPulling="2026-03-18 08:54:29.783133312 +0000 UTC m=+7654.724288026" lastFinishedPulling="2026-03-18 08:54:33.18735943 +0000 UTC m=+7658.128514144" observedRunningTime="2026-03-18 08:54:33.847436806 +0000 UTC m=+7658.788591540" watchObservedRunningTime="2026-03-18 08:54:33.851650918 +0000 UTC m=+7658.792805692" Mar 18 08:54:37 crc kubenswrapper[4917]: I0318 08:54:37.773212 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:54:37 crc kubenswrapper[4917]: E0318 08:54:37.774150 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:54:38 crc kubenswrapper[4917]: I0318 08:54:38.261415 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:38 crc 
kubenswrapper[4917]: I0318 08:54:38.261832 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:39 crc kubenswrapper[4917]: I0318 08:54:39.312653 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-w2jrt" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="registry-server" probeResult="failure" output=< Mar 18 08:54:39 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:54:39 crc kubenswrapper[4917]: > Mar 18 08:54:48 crc kubenswrapper[4917]: I0318 08:54:48.308735 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:48 crc kubenswrapper[4917]: I0318 08:54:48.379075 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:48 crc kubenswrapper[4917]: I0318 08:54:48.565352 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w2jrt"] Mar 18 08:54:49 crc kubenswrapper[4917]: I0318 08:54:49.982924 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w2jrt" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="registry-server" containerID="cri-o://af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf" gracePeriod=2 Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.448646 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.506517 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-catalog-content\") pod \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.506655 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-utilities\") pod \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.506711 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4dxn\" (UniqueName: \"kubernetes.io/projected/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-kube-api-access-w4dxn\") pod \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\" (UID: \"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc\") " Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.508146 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-utilities" (OuterVolumeSpecName: "utilities") pod "993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" (UID: "993f8c8e-e1a1-4f38-b6e0-743eb3c646bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.517833 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-kube-api-access-w4dxn" (OuterVolumeSpecName: "kube-api-access-w4dxn") pod "993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" (UID: "993f8c8e-e1a1-4f38-b6e0-743eb3c646bc"). InnerVolumeSpecName "kube-api-access-w4dxn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.559356 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" (UID: "993f8c8e-e1a1-4f38-b6e0-743eb3c646bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.609507 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.609560 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.609576 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4dxn\" (UniqueName: \"kubernetes.io/projected/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc-kube-api-access-w4dxn\") on node \"crc\" DevicePath \"\"" Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.996638 4917 generic.go:334] "Generic (PLEG): container finished" podID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerID="af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf" exitCode=0 Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.996681 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jrt" event={"ID":"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc","Type":"ContainerDied","Data":"af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf"} Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.996724 4917 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2jrt" Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.996756 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2jrt" event={"ID":"993f8c8e-e1a1-4f38-b6e0-743eb3c646bc","Type":"ContainerDied","Data":"a994ab5c3c2c540b12fcf129c0cf998a5bc48524f05b20487a50bfdfd6c81588"} Mar 18 08:54:50 crc kubenswrapper[4917]: I0318 08:54:50.996787 4917 scope.go:117] "RemoveContainer" containerID="af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.016286 4917 scope.go:117] "RemoveContainer" containerID="853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.055274 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w2jrt"] Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.058325 4917 scope.go:117] "RemoveContainer" containerID="3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.064598 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w2jrt"] Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.089812 4917 scope.go:117] "RemoveContainer" containerID="af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf" Mar 18 08:54:51 crc kubenswrapper[4917]: E0318 08:54:51.090260 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf\": container with ID starting with af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf not found: ID does not exist" containerID="af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.090316 
4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf"} err="failed to get container status \"af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf\": rpc error: code = NotFound desc = could not find container \"af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf\": container with ID starting with af663b7a1208f5fd6a61a692b5119a3130f2daac6eaf54d5e0cd842091e11ddf not found: ID does not exist" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.090352 4917 scope.go:117] "RemoveContainer" containerID="853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5" Mar 18 08:54:51 crc kubenswrapper[4917]: E0318 08:54:51.090984 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5\": container with ID starting with 853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5 not found: ID does not exist" containerID="853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.091024 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5"} err="failed to get container status \"853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5\": rpc error: code = NotFound desc = could not find container \"853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5\": container with ID starting with 853d6e84cf7ebfbe917e96dcfac21d3d55933d378bc31747df15837371ca93f5 not found: ID does not exist" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.091067 4917 scope.go:117] "RemoveContainer" containerID="3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5" Mar 18 08:54:51 crc kubenswrapper[4917]: E0318 
08:54:51.091329 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5\": container with ID starting with 3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5 not found: ID does not exist" containerID="3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.091367 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5"} err="failed to get container status \"3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5\": rpc error: code = NotFound desc = could not find container \"3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5\": container with ID starting with 3431ebd62186d293f23ba08f0cfe45e3aae59cc464320cf482ed1c18450edcb5 not found: ID does not exist" Mar 18 08:54:51 crc kubenswrapper[4917]: I0318 08:54:51.788426 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" path="/var/lib/kubelet/pods/993f8c8e-e1a1-4f38-b6e0-743eb3c646bc/volumes" Mar 18 08:54:52 crc kubenswrapper[4917]: I0318 08:54:52.774477 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:54:52 crc kubenswrapper[4917]: E0318 08:54:52.775084 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:55:03 crc kubenswrapper[4917]: I0318 08:55:03.772728 
4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:55:03 crc kubenswrapper[4917]: E0318 08:55:03.773794 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:55:14 crc kubenswrapper[4917]: I0318 08:55:14.775027 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:55:14 crc kubenswrapper[4917]: E0318 08:55:14.778775 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:55:27 crc kubenswrapper[4917]: I0318 08:55:27.775097 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:55:27 crc kubenswrapper[4917]: E0318 08:55:27.775754 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:55:41 crc kubenswrapper[4917]: I0318 
08:55:41.773499 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:55:41 crc kubenswrapper[4917]: E0318 08:55:41.774346 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:55:53 crc kubenswrapper[4917]: I0318 08:55:53.773185 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:55:53 crc kubenswrapper[4917]: E0318 08:55:53.774778 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:55:59 crc kubenswrapper[4917]: I0318 08:55:59.763800 4917 generic.go:334] "Generic (PLEG): container finished" podID="b993b322-7111-4544-9def-988363263914" containerID="5e526f156aefe260f590f8583370e9f000b638b54f5858cd89bdc35b751fad5f" exitCode=0 Mar 18 08:55:59 crc kubenswrapper[4917]: I0318 08:55:59.763918 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" event={"ID":"b993b322-7111-4544-9def-988363263914","Type":"ContainerDied","Data":"5e526f156aefe260f590f8583370e9f000b638b54f5858cd89bdc35b751fad5f"} Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.151454 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29563736-wqf6d"] Mar 18 08:56:00 crc kubenswrapper[4917]: E0318 08:56:00.152229 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="registry-server" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.152345 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="registry-server" Mar 18 08:56:00 crc kubenswrapper[4917]: E0318 08:56:00.152457 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="extract-content" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.152542 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="extract-content" Mar 18 08:56:00 crc kubenswrapper[4917]: E0318 08:56:00.152687 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="extract-utilities" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.152805 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="extract-utilities" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.153188 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="993f8c8e-e1a1-4f38-b6e0-743eb3c646bc" containerName="registry-server" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.154222 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563736-wqf6d" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.156562 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.157319 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.157464 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.174115 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563736-wqf6d"] Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.256512 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vlfr\" (UniqueName: \"kubernetes.io/projected/a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1-kube-api-access-6vlfr\") pod \"auto-csr-approver-29563736-wqf6d\" (UID: \"a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1\") " pod="openshift-infra/auto-csr-approver-29563736-wqf6d" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.358331 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vlfr\" (UniqueName: \"kubernetes.io/projected/a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1-kube-api-access-6vlfr\") pod \"auto-csr-approver-29563736-wqf6d\" (UID: \"a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1\") " pod="openshift-infra/auto-csr-approver-29563736-wqf6d" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.383513 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vlfr\" (UniqueName: \"kubernetes.io/projected/a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1-kube-api-access-6vlfr\") pod \"auto-csr-approver-29563736-wqf6d\" (UID: \"a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1\") " 
pod="openshift-infra/auto-csr-approver-29563736-wqf6d" Mar 18 08:56:00 crc kubenswrapper[4917]: I0318 08:56:00.475558 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563736-wqf6d" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:00.987218 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563736-wqf6d"] Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.139042 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.276556 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-inventory\") pod \"b993b322-7111-4544-9def-988363263914\" (UID: \"b993b322-7111-4544-9def-988363263914\") " Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.276668 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-ssh-key-openstack-cell1\") pod \"b993b322-7111-4544-9def-988363263914\" (UID: \"b993b322-7111-4544-9def-988363263914\") " Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.276702 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v675q\" (UniqueName: \"kubernetes.io/projected/b993b322-7111-4544-9def-988363263914-kube-api-access-v675q\") pod \"b993b322-7111-4544-9def-988363263914\" (UID: \"b993b322-7111-4544-9def-988363263914\") " Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.276719 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-secret-0\") pod 
\"b993b322-7111-4544-9def-988363263914\" (UID: \"b993b322-7111-4544-9def-988363263914\") " Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.276977 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-combined-ca-bundle\") pod \"b993b322-7111-4544-9def-988363263914\" (UID: \"b993b322-7111-4544-9def-988363263914\") " Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.281950 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b993b322-7111-4544-9def-988363263914-kube-api-access-v675q" (OuterVolumeSpecName: "kube-api-access-v675q") pod "b993b322-7111-4544-9def-988363263914" (UID: "b993b322-7111-4544-9def-988363263914"). InnerVolumeSpecName "kube-api-access-v675q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.283760 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b993b322-7111-4544-9def-988363263914" (UID: "b993b322-7111-4544-9def-988363263914"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.309550 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-inventory" (OuterVolumeSpecName: "inventory") pod "b993b322-7111-4544-9def-988363263914" (UID: "b993b322-7111-4544-9def-988363263914"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.313748 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b993b322-7111-4544-9def-988363263914" (UID: "b993b322-7111-4544-9def-988363263914"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.316085 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b993b322-7111-4544-9def-988363263914" (UID: "b993b322-7111-4544-9def-988363263914"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.379773 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.379805 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.379820 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v675q\" (UniqueName: \"kubernetes.io/projected/b993b322-7111-4544-9def-988363263914-kube-api-access-v675q\") on node \"crc\" DevicePath \"\"" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.379830 4917 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: 
\"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.379843 4917 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b993b322-7111-4544-9def-988363263914-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.787413 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.787423 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f7mjl" event={"ID":"b993b322-7111-4544-9def-988363263914","Type":"ContainerDied","Data":"072d7a14a5f02de16f0df34f96352d10a036a546b1a6b6bdf8a5b7967b615f00"} Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.787473 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="072d7a14a5f02de16f0df34f96352d10a036a546b1a6b6bdf8a5b7967b615f00" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.797411 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563736-wqf6d" event={"ID":"a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1","Type":"ContainerStarted","Data":"b6dd0daf433a55197283ecd191284115bac6e93d8a4d2cfeaca528d0bd9d10a6"} Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.891850 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-v42ph"] Mar 18 08:56:01 crc kubenswrapper[4917]: E0318 08:56:01.892288 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b993b322-7111-4544-9def-988363263914" containerName="libvirt-openstack-openstack-cell1" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.892307 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b993b322-7111-4544-9def-988363263914" containerName="libvirt-openstack-openstack-cell1" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.892468 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="b993b322-7111-4544-9def-988363263914" containerName="libvirt-openstack-openstack-cell1" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.893316 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.895577 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.895929 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.896224 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.896240 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.896274 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.896228 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.897768 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 18 08:56:01 crc kubenswrapper[4917]: I0318 08:56:01.910890 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-v42ph"] Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005012 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005074 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-inventory\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005105 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005131 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5tdp\" (UniqueName: \"kubernetes.io/projected/65333a03-4e64-4807-bae2-cb0b1d517e08-kube-api-access-p5tdp\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005159 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-0\") pod 
\"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005261 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005360 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005401 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005459 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 
08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005494 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.005525 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108109 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108196 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108253 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108321 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108420 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108508 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108567 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-inventory\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc 
kubenswrapper[4917]: I0318 08:56:02.108627 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108659 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5tdp\" (UniqueName: \"kubernetes.io/projected/65333a03-4e64-4807-bae2-cb0b1d517e08-kube-api-access-p5tdp\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108712 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.108811 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.110098 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cells-global-config-0\") pod 
\"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.113175 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.113454 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.118482 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.118691 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.118891 4917 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.120690 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.121498 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.122063 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.122912 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-inventory\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.144329 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5tdp\" (UniqueName: \"kubernetes.io/projected/65333a03-4e64-4807-bae2-cb0b1d517e08-kube-api-access-p5tdp\") pod \"nova-cell1-openstack-openstack-cell1-v42ph\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.235042 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:56:02 crc kubenswrapper[4917]: W0318 08:56:02.565797 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65333a03_4e64_4807_bae2_cb0b1d517e08.slice/crio-79957832fea6d6d4c4c828a2a94c735d1cb3ac66f0c3e7196f6e838709cd1e70 WatchSource:0}: Error finding container 79957832fea6d6d4c4c828a2a94c735d1cb3ac66f0c3e7196f6e838709cd1e70: Status 404 returned error can't find the container with id 79957832fea6d6d4c4c828a2a94c735d1cb3ac66f0c3e7196f6e838709cd1e70 Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.569263 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-v42ph"] Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.810487 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" event={"ID":"65333a03-4e64-4807-bae2-cb0b1d517e08","Type":"ContainerStarted","Data":"79957832fea6d6d4c4c828a2a94c735d1cb3ac66f0c3e7196f6e838709cd1e70"} Mar 18 08:56:02 crc kubenswrapper[4917]: I0318 08:56:02.812958 4917 generic.go:334] "Generic (PLEG): container finished" podID="a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1" containerID="434c9e72d2908e8db3cdb3c80494091728bc164d6a0a963c23317c797d345177" exitCode=0 Mar 18 08:56:02 
crc kubenswrapper[4917]: I0318 08:56:02.813034 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563736-wqf6d" event={"ID":"a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1","Type":"ContainerDied","Data":"434c9e72d2908e8db3cdb3c80494091728bc164d6a0a963c23317c797d345177"} Mar 18 08:56:03 crc kubenswrapper[4917]: I0318 08:56:03.825497 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" event={"ID":"65333a03-4e64-4807-bae2-cb0b1d517e08","Type":"ContainerStarted","Data":"270f536638eaa8b7f98eeb988aff10b6756be813c47b340a031fb06692b16f74"} Mar 18 08:56:03 crc kubenswrapper[4917]: I0318 08:56:03.861558 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" podStartSLOduration=2.345756781 podStartE2EDuration="2.861518555s" podCreationTimestamp="2026-03-18 08:56:01 +0000 UTC" firstStartedPulling="2026-03-18 08:56:02.569415925 +0000 UTC m=+7747.510570639" lastFinishedPulling="2026-03-18 08:56:03.085177699 +0000 UTC m=+7748.026332413" observedRunningTime="2026-03-18 08:56:03.858432561 +0000 UTC m=+7748.799587295" watchObservedRunningTime="2026-03-18 08:56:03.861518555 +0000 UTC m=+7748.802673269" Mar 18 08:56:04 crc kubenswrapper[4917]: I0318 08:56:04.182946 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563736-wqf6d" Mar 18 08:56:04 crc kubenswrapper[4917]: I0318 08:56:04.274290 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vlfr\" (UniqueName: \"kubernetes.io/projected/a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1-kube-api-access-6vlfr\") pod \"a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1\" (UID: \"a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1\") " Mar 18 08:56:04 crc kubenswrapper[4917]: I0318 08:56:04.279356 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1-kube-api-access-6vlfr" (OuterVolumeSpecName: "kube-api-access-6vlfr") pod "a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1" (UID: "a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1"). InnerVolumeSpecName "kube-api-access-6vlfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:56:04 crc kubenswrapper[4917]: I0318 08:56:04.376418 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vlfr\" (UniqueName: \"kubernetes.io/projected/a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1-kube-api-access-6vlfr\") on node \"crc\" DevicePath \"\"" Mar 18 08:56:04 crc kubenswrapper[4917]: I0318 08:56:04.842798 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563736-wqf6d" Mar 18 08:56:04 crc kubenswrapper[4917]: I0318 08:56:04.843801 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563736-wqf6d" event={"ID":"a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1","Type":"ContainerDied","Data":"b6dd0daf433a55197283ecd191284115bac6e93d8a4d2cfeaca528d0bd9d10a6"} Mar 18 08:56:04 crc kubenswrapper[4917]: I0318 08:56:04.843848 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6dd0daf433a55197283ecd191284115bac6e93d8a4d2cfeaca528d0bd9d10a6" Mar 18 08:56:05 crc kubenswrapper[4917]: I0318 08:56:05.272888 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563730-lrq66"] Mar 18 08:56:05 crc kubenswrapper[4917]: I0318 08:56:05.284663 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563730-lrq66"] Mar 18 08:56:05 crc kubenswrapper[4917]: I0318 08:56:05.789177 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cfa3930-b29b-4512-8ef0-8cd47dad802c" path="/var/lib/kubelet/pods/8cfa3930-b29b-4512-8ef0-8cd47dad802c/volumes" Mar 18 08:56:06 crc kubenswrapper[4917]: I0318 08:56:06.772904 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:56:06 crc kubenswrapper[4917]: E0318 08:56:06.773357 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:56:15 crc kubenswrapper[4917]: I0318 08:56:15.483241 4917 scope.go:117] "RemoveContainer" 
containerID="b3c64dee29a1077b817560911e670c63303d298e2f0f9abf9ff70e07f3a922e2" Mar 18 08:56:19 crc kubenswrapper[4917]: I0318 08:56:19.772669 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:56:19 crc kubenswrapper[4917]: E0318 08:56:19.773508 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:56:30 crc kubenswrapper[4917]: I0318 08:56:30.773251 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:56:30 crc kubenswrapper[4917]: E0318 08:56:30.774338 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:56:43 crc kubenswrapper[4917]: I0318 08:56:43.773337 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:56:43 crc kubenswrapper[4917]: E0318 08:56:43.774876 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:56:54 crc kubenswrapper[4917]: I0318 08:56:54.773546 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:56:54 crc kubenswrapper[4917]: E0318 08:56:54.774488 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.164528 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-548qx"] Mar 18 08:57:07 crc kubenswrapper[4917]: E0318 08:57:07.165768 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1" containerName="oc" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.165788 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1" containerName="oc" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.166133 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1" containerName="oc" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.168530 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.195861 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-548qx"] Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.357653 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-catalog-content\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.357796 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr6x9\" (UniqueName: \"kubernetes.io/projected/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-kube-api-access-rr6x9\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.357849 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-utilities\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.460848 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-catalog-content\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.461315 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-catalog-content\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.461882 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr6x9\" (UniqueName: \"kubernetes.io/projected/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-kube-api-access-rr6x9\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.462179 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-utilities\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.462418 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-utilities\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.487782 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr6x9\" (UniqueName: \"kubernetes.io/projected/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-kube-api-access-rr6x9\") pod \"redhat-marketplace-548qx\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:07 crc kubenswrapper[4917]: I0318 08:57:07.498387 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:08 crc kubenswrapper[4917]: I0318 08:57:08.100481 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-548qx"] Mar 18 08:57:08 crc kubenswrapper[4917]: I0318 08:57:08.626457 4917 generic.go:334] "Generic (PLEG): container finished" podID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerID="a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179" exitCode=0 Mar 18 08:57:08 crc kubenswrapper[4917]: I0318 08:57:08.626579 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-548qx" event={"ID":"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e","Type":"ContainerDied","Data":"a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179"} Mar 18 08:57:08 crc kubenswrapper[4917]: I0318 08:57:08.626763 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-548qx" event={"ID":"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e","Type":"ContainerStarted","Data":"62e020652a9d4a0eebfbd6a89d76e1ef30bdf3f2fd472beebe110631860e5fc2"} Mar 18 08:57:08 crc kubenswrapper[4917]: I0318 08:57:08.773368 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:57:08 crc kubenswrapper[4917]: E0318 08:57:08.773788 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:57:09 crc kubenswrapper[4917]: I0318 08:57:09.639125 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-548qx" 
event={"ID":"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e","Type":"ContainerStarted","Data":"019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63"} Mar 18 08:57:10 crc kubenswrapper[4917]: I0318 08:57:10.654341 4917 generic.go:334] "Generic (PLEG): container finished" podID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerID="019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63" exitCode=0 Mar 18 08:57:10 crc kubenswrapper[4917]: I0318 08:57:10.654380 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-548qx" event={"ID":"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e","Type":"ContainerDied","Data":"019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63"} Mar 18 08:57:11 crc kubenswrapper[4917]: I0318 08:57:11.670651 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-548qx" event={"ID":"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e","Type":"ContainerStarted","Data":"b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d"} Mar 18 08:57:11 crc kubenswrapper[4917]: I0318 08:57:11.705792 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-548qx" podStartSLOduration=2.236531907 podStartE2EDuration="4.705773279s" podCreationTimestamp="2026-03-18 08:57:07 +0000 UTC" firstStartedPulling="2026-03-18 08:57:08.629280264 +0000 UTC m=+7813.570434978" lastFinishedPulling="2026-03-18 08:57:11.098521596 +0000 UTC m=+7816.039676350" observedRunningTime="2026-03-18 08:57:11.696926285 +0000 UTC m=+7816.638081039" watchObservedRunningTime="2026-03-18 08:57:11.705773279 +0000 UTC m=+7816.646927993" Mar 18 08:57:17 crc kubenswrapper[4917]: I0318 08:57:17.499129 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:17 crc kubenswrapper[4917]: I0318 08:57:17.499836 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:17 crc kubenswrapper[4917]: I0318 08:57:17.577255 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:17 crc kubenswrapper[4917]: I0318 08:57:17.801491 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.136904 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-548qx"] Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.137427 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-548qx" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerName="registry-server" containerID="cri-o://b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d" gracePeriod=2 Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.623992 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.783352 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr6x9\" (UniqueName: \"kubernetes.io/projected/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-kube-api-access-rr6x9\") pod \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.783404 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-utilities\") pod \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.783459 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-catalog-content\") pod \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\" (UID: \"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e\") " Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.784479 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-utilities" (OuterVolumeSpecName: "utilities") pod "c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" (UID: "c2cb0cd4-6995-4f5e-af97-0ad35f1f758e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.793656 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-kube-api-access-rr6x9" (OuterVolumeSpecName: "kube-api-access-rr6x9") pod "c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" (UID: "c2cb0cd4-6995-4f5e-af97-0ad35f1f758e"). InnerVolumeSpecName "kube-api-access-rr6x9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.812349 4917 generic.go:334] "Generic (PLEG): container finished" podID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerID="b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d" exitCode=0 Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.812398 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-548qx" event={"ID":"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e","Type":"ContainerDied","Data":"b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d"} Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.812456 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-548qx" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.812478 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-548qx" event={"ID":"c2cb0cd4-6995-4f5e-af97-0ad35f1f758e","Type":"ContainerDied","Data":"62e020652a9d4a0eebfbd6a89d76e1ef30bdf3f2fd472beebe110631860e5fc2"} Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.812519 4917 scope.go:117] "RemoveContainer" containerID="b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.834317 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" (UID: "c2cb0cd4-6995-4f5e-af97-0ad35f1f758e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.854108 4917 scope.go:117] "RemoveContainer" containerID="019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.878623 4917 scope.go:117] "RemoveContainer" containerID="a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.887785 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr6x9\" (UniqueName: \"kubernetes.io/projected/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-kube-api-access-rr6x9\") on node \"crc\" DevicePath \"\"" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.887935 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.889004 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.923175 4917 scope.go:117] "RemoveContainer" containerID="b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d" Mar 18 08:57:21 crc kubenswrapper[4917]: E0318 08:57:21.923933 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d\": container with ID starting with b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d not found: ID does not exist" containerID="b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.923991 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d"} err="failed to get container status \"b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d\": rpc error: code = NotFound desc = could not find container \"b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d\": container with ID starting with b7b9974cbbff476b5605b994b1d96031ede0adfa2141c4c5c8423ca2b955223d not found: ID does not exist" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.924028 4917 scope.go:117] "RemoveContainer" containerID="019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63" Mar 18 08:57:21 crc kubenswrapper[4917]: E0318 08:57:21.924706 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63\": container with ID starting with 019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63 not found: ID does not exist" containerID="019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.924751 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63"} err="failed to get container status \"019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63\": rpc error: code = NotFound desc = could not find container \"019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63\": container with ID starting with 019e02acf12c3d36155c0ba31248b28615c37fcadc763b7769779360df97cd63 not found: ID does not exist" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.924781 4917 scope.go:117] "RemoveContainer" containerID="a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179" Mar 18 08:57:21 crc kubenswrapper[4917]: E0318 08:57:21.925212 4917 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179\": container with ID starting with a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179 not found: ID does not exist" containerID="a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179" Mar 18 08:57:21 crc kubenswrapper[4917]: I0318 08:57:21.925244 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179"} err="failed to get container status \"a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179\": rpc error: code = NotFound desc = could not find container \"a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179\": container with ID starting with a4a2f1376548a8529c9a9e64499c6d2c76e64aabb06c8dde0e11a23882012179 not found: ID does not exist" Mar 18 08:57:22 crc kubenswrapper[4917]: I0318 08:57:22.163666 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-548qx"] Mar 18 08:57:22 crc kubenswrapper[4917]: I0318 08:57:22.178094 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-548qx"] Mar 18 08:57:23 crc kubenswrapper[4917]: I0318 08:57:23.775232 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:57:23 crc kubenswrapper[4917]: E0318 08:57:23.776366 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 08:57:23 crc kubenswrapper[4917]: 
I0318 08:57:23.795959 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" path="/var/lib/kubelet/pods/c2cb0cd4-6995-4f5e-af97-0ad35f1f758e/volumes" Mar 18 08:57:37 crc kubenswrapper[4917]: I0318 08:57:37.776030 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 08:57:39 crc kubenswrapper[4917]: I0318 08:57:39.006669 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"a573aaa8431dbcec8312c697a1f722048d6ff3b34cc7b982e4853b7a5ec2a1fe"} Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.872306 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k5s9b"] Mar 18 08:57:54 crc kubenswrapper[4917]: E0318 08:57:54.873398 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerName="registry-server" Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.873417 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerName="registry-server" Mar 18 08:57:54 crc kubenswrapper[4917]: E0318 08:57:54.873436 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerName="extract-content" Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.873444 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerName="extract-content" Mar 18 08:57:54 crc kubenswrapper[4917]: E0318 08:57:54.873465 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerName="extract-utilities" Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.873473 4917 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerName="extract-utilities" Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.873770 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2cb0cd4-6995-4f5e-af97-0ad35f1f758e" containerName="registry-server" Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.875969 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.886393 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5s9b"] Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.951401 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-utilities\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.951495 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-catalog-content\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:54 crc kubenswrapper[4917]: I0318 08:57:54.951879 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjw6l\" (UniqueName: \"kubernetes.io/projected/cb6da239-0296-401d-ad86-a57ec42e96be-kube-api-access-sjw6l\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:55 crc kubenswrapper[4917]: I0318 08:57:55.054310 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjw6l\" (UniqueName: \"kubernetes.io/projected/cb6da239-0296-401d-ad86-a57ec42e96be-kube-api-access-sjw6l\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:55 crc kubenswrapper[4917]: I0318 08:57:55.055741 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-utilities\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:55 crc kubenswrapper[4917]: I0318 08:57:55.055816 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-catalog-content\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:55 crc kubenswrapper[4917]: I0318 08:57:55.056135 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-utilities\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:55 crc kubenswrapper[4917]: I0318 08:57:55.056236 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-catalog-content\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:55 crc kubenswrapper[4917]: I0318 08:57:55.074215 4917 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-sjw6l\" (UniqueName: \"kubernetes.io/projected/cb6da239-0296-401d-ad86-a57ec42e96be-kube-api-access-sjw6l\") pod \"community-operators-k5s9b\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:55 crc kubenswrapper[4917]: I0318 08:57:55.199328 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:57:55 crc kubenswrapper[4917]: I0318 08:57:55.755136 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k5s9b"] Mar 18 08:57:55 crc kubenswrapper[4917]: W0318 08:57:55.759726 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb6da239_0296_401d_ad86_a57ec42e96be.slice/crio-1907639d28167e0f9714ed5946b501f7a9a835b7d679db6d87fc6c51f8c4b478 WatchSource:0}: Error finding container 1907639d28167e0f9714ed5946b501f7a9a835b7d679db6d87fc6c51f8c4b478: Status 404 returned error can't find the container with id 1907639d28167e0f9714ed5946b501f7a9a835b7d679db6d87fc6c51f8c4b478 Mar 18 08:57:56 crc kubenswrapper[4917]: I0318 08:57:56.207163 4917 generic.go:334] "Generic (PLEG): container finished" podID="cb6da239-0296-401d-ad86-a57ec42e96be" containerID="dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15" exitCode=0 Mar 18 08:57:56 crc kubenswrapper[4917]: I0318 08:57:56.207217 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5s9b" event={"ID":"cb6da239-0296-401d-ad86-a57ec42e96be","Type":"ContainerDied","Data":"dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15"} Mar 18 08:57:56 crc kubenswrapper[4917]: I0318 08:57:56.207462 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5s9b" 
event={"ID":"cb6da239-0296-401d-ad86-a57ec42e96be","Type":"ContainerStarted","Data":"1907639d28167e0f9714ed5946b501f7a9a835b7d679db6d87fc6c51f8c4b478"} Mar 18 08:57:57 crc kubenswrapper[4917]: I0318 08:57:57.222705 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5s9b" event={"ID":"cb6da239-0296-401d-ad86-a57ec42e96be","Type":"ContainerStarted","Data":"1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db"} Mar 18 08:57:59 crc kubenswrapper[4917]: I0318 08:57:59.246369 4917 generic.go:334] "Generic (PLEG): container finished" podID="cb6da239-0296-401d-ad86-a57ec42e96be" containerID="1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db" exitCode=0 Mar 18 08:57:59 crc kubenswrapper[4917]: I0318 08:57:59.246496 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5s9b" event={"ID":"cb6da239-0296-401d-ad86-a57ec42e96be","Type":"ContainerDied","Data":"1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db"} Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.146545 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563738-f8pln"] Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.148253 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563738-f8pln" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.150930 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.151200 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.151253 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.158491 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563738-f8pln"] Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.258247 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5s9b" event={"ID":"cb6da239-0296-401d-ad86-a57ec42e96be","Type":"ContainerStarted","Data":"f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e"} Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.271103 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsplf\" (UniqueName: \"kubernetes.io/projected/23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0-kube-api-access-wsplf\") pod \"auto-csr-approver-29563738-f8pln\" (UID: \"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0\") " pod="openshift-infra/auto-csr-approver-29563738-f8pln" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.286271 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k5s9b" podStartSLOduration=2.81090473 podStartE2EDuration="6.286246353s" podCreationTimestamp="2026-03-18 08:57:54 +0000 UTC" firstStartedPulling="2026-03-18 08:57:56.209950729 +0000 UTC m=+7861.151105433" lastFinishedPulling="2026-03-18 08:57:59.685292332 +0000 UTC 
m=+7864.626447056" observedRunningTime="2026-03-18 08:58:00.274808965 +0000 UTC m=+7865.215963679" watchObservedRunningTime="2026-03-18 08:58:00.286246353 +0000 UTC m=+7865.227401067" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.372907 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsplf\" (UniqueName: \"kubernetes.io/projected/23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0-kube-api-access-wsplf\") pod \"auto-csr-approver-29563738-f8pln\" (UID: \"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0\") " pod="openshift-infra/auto-csr-approver-29563738-f8pln" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.403858 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsplf\" (UniqueName: \"kubernetes.io/projected/23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0-kube-api-access-wsplf\") pod \"auto-csr-approver-29563738-f8pln\" (UID: \"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0\") " pod="openshift-infra/auto-csr-approver-29563738-f8pln" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.475516 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563738-f8pln" Mar 18 08:58:00 crc kubenswrapper[4917]: I0318 08:58:00.974596 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563738-f8pln"] Mar 18 08:58:01 crc kubenswrapper[4917]: I0318 08:58:01.267985 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563738-f8pln" event={"ID":"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0","Type":"ContainerStarted","Data":"c553af74294900f8f4e50289d988c778b45cf5467d520508159baf1bdb213f28"} Mar 18 08:58:03 crc kubenswrapper[4917]: I0318 08:58:03.289160 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563738-f8pln" event={"ID":"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0","Type":"ContainerStarted","Data":"bc032706519eed172391d6b4e007f9991f6db6e91c0db146c38cf0083773161f"} Mar 18 08:58:04 crc kubenswrapper[4917]: I0318 08:58:04.307006 4917 generic.go:334] "Generic (PLEG): container finished" podID="23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0" containerID="bc032706519eed172391d6b4e007f9991f6db6e91c0db146c38cf0083773161f" exitCode=0 Mar 18 08:58:04 crc kubenswrapper[4917]: I0318 08:58:04.307085 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563738-f8pln" event={"ID":"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0","Type":"ContainerDied","Data":"bc032706519eed172391d6b4e007f9991f6db6e91c0db146c38cf0083773161f"} Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.199840 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.200173 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.244956 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.371850 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.497826 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k5s9b"] Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.720471 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563738-f8pln" Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.799555 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsplf\" (UniqueName: \"kubernetes.io/projected/23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0-kube-api-access-wsplf\") pod \"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0\" (UID: \"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0\") " Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.805461 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0-kube-api-access-wsplf" (OuterVolumeSpecName: "kube-api-access-wsplf") pod "23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0" (UID: "23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0"). InnerVolumeSpecName "kube-api-access-wsplf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:58:05 crc kubenswrapper[4917]: I0318 08:58:05.902367 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsplf\" (UniqueName: \"kubernetes.io/projected/23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0-kube-api-access-wsplf\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:06 crc kubenswrapper[4917]: I0318 08:58:06.329465 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563738-f8pln" Mar 18 08:58:06 crc kubenswrapper[4917]: I0318 08:58:06.329487 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563738-f8pln" event={"ID":"23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0","Type":"ContainerDied","Data":"c553af74294900f8f4e50289d988c778b45cf5467d520508159baf1bdb213f28"} Mar 18 08:58:06 crc kubenswrapper[4917]: I0318 08:58:06.330037 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c553af74294900f8f4e50289d988c778b45cf5467d520508159baf1bdb213f28" Mar 18 08:58:06 crc kubenswrapper[4917]: I0318 08:58:06.393656 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563732-xrclx"] Mar 18 08:58:06 crc kubenswrapper[4917]: I0318 08:58:06.404510 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563732-xrclx"] Mar 18 08:58:07 crc kubenswrapper[4917]: I0318 08:58:07.340803 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k5s9b" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" containerName="registry-server" containerID="cri-o://f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e" gracePeriod=2 Mar 18 08:58:07 crc kubenswrapper[4917]: I0318 08:58:07.785793 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94147ec3-b2d8-4a9c-8f4f-3951816f88b1" path="/var/lib/kubelet/pods/94147ec3-b2d8-4a9c-8f4f-3951816f88b1/volumes" Mar 18 08:58:07 crc kubenswrapper[4917]: I0318 08:58:07.875423 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:58:07 crc kubenswrapper[4917]: I0318 08:58:07.946640 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjw6l\" (UniqueName: \"kubernetes.io/projected/cb6da239-0296-401d-ad86-a57ec42e96be-kube-api-access-sjw6l\") pod \"cb6da239-0296-401d-ad86-a57ec42e96be\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " Mar 18 08:58:07 crc kubenswrapper[4917]: I0318 08:58:07.946714 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-utilities\") pod \"cb6da239-0296-401d-ad86-a57ec42e96be\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " Mar 18 08:58:07 crc kubenswrapper[4917]: I0318 08:58:07.946763 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-catalog-content\") pod \"cb6da239-0296-401d-ad86-a57ec42e96be\" (UID: \"cb6da239-0296-401d-ad86-a57ec42e96be\") " Mar 18 08:58:07 crc kubenswrapper[4917]: I0318 08:58:07.948763 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-utilities" (OuterVolumeSpecName: "utilities") pod "cb6da239-0296-401d-ad86-a57ec42e96be" (UID: "cb6da239-0296-401d-ad86-a57ec42e96be"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:58:07 crc kubenswrapper[4917]: I0318 08:58:07.953684 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb6da239-0296-401d-ad86-a57ec42e96be-kube-api-access-sjw6l" (OuterVolumeSpecName: "kube-api-access-sjw6l") pod "cb6da239-0296-401d-ad86-a57ec42e96be" (UID: "cb6da239-0296-401d-ad86-a57ec42e96be"). InnerVolumeSpecName "kube-api-access-sjw6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.009649 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb6da239-0296-401d-ad86-a57ec42e96be" (UID: "cb6da239-0296-401d-ad86-a57ec42e96be"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.049267 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.049302 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb6da239-0296-401d-ad86-a57ec42e96be-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.049312 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjw6l\" (UniqueName: \"kubernetes.io/projected/cb6da239-0296-401d-ad86-a57ec42e96be-kube-api-access-sjw6l\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.352527 4917 generic.go:334] "Generic (PLEG): container finished" podID="cb6da239-0296-401d-ad86-a57ec42e96be" containerID="f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e" exitCode=0 Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.352619 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k5s9b" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.352614 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5s9b" event={"ID":"cb6da239-0296-401d-ad86-a57ec42e96be","Type":"ContainerDied","Data":"f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e"} Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.353834 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k5s9b" event={"ID":"cb6da239-0296-401d-ad86-a57ec42e96be","Type":"ContainerDied","Data":"1907639d28167e0f9714ed5946b501f7a9a835b7d679db6d87fc6c51f8c4b478"} Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.353869 4917 scope.go:117] "RemoveContainer" containerID="f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.383765 4917 scope.go:117] "RemoveContainer" containerID="1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.400839 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k5s9b"] Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.412043 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k5s9b"] Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.418806 4917 scope.go:117] "RemoveContainer" containerID="dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.484419 4917 scope.go:117] "RemoveContainer" containerID="f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e" Mar 18 08:58:08 crc kubenswrapper[4917]: E0318 08:58:08.485119 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e\": container with ID starting with f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e not found: ID does not exist" containerID="f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.485158 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e"} err="failed to get container status \"f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e\": rpc error: code = NotFound desc = could not find container \"f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e\": container with ID starting with f0747b942bab7ad7fe5ba0c8044f7f3d2e23937600144f543d1cf8aeed76756e not found: ID does not exist" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.485188 4917 scope.go:117] "RemoveContainer" containerID="1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db" Mar 18 08:58:08 crc kubenswrapper[4917]: E0318 08:58:08.485496 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db\": container with ID starting with 1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db not found: ID does not exist" containerID="1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.485521 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db"} err="failed to get container status \"1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db\": rpc error: code = NotFound desc = could not find container \"1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db\": container with ID 
starting with 1965b3f4d01af041aeac68fc6490170348219ae6c4231100db7b783639be86db not found: ID does not exist" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.485536 4917 scope.go:117] "RemoveContainer" containerID="dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15" Mar 18 08:58:08 crc kubenswrapper[4917]: E0318 08:58:08.485860 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15\": container with ID starting with dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15 not found: ID does not exist" containerID="dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15" Mar 18 08:58:08 crc kubenswrapper[4917]: I0318 08:58:08.485884 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15"} err="failed to get container status \"dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15\": rpc error: code = NotFound desc = could not find container \"dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15\": container with ID starting with dca3dd4596d28aedc242ec3d2d2bed7c5da4a9d7f53de7ba474803cb19691e15 not found: ID does not exist" Mar 18 08:58:09 crc kubenswrapper[4917]: I0318 08:58:09.786695 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" path="/var/lib/kubelet/pods/cb6da239-0296-401d-ad86-a57ec42e96be/volumes" Mar 18 08:58:15 crc kubenswrapper[4917]: I0318 08:58:15.636900 4917 scope.go:117] "RemoveContainer" containerID="e0e17de92ae998bf6224aeb6ec107acb0c61d7849c7395cf7b839059baf5bfec" Mar 18 08:58:39 crc kubenswrapper[4917]: I0318 08:58:39.717113 4917 generic.go:334] "Generic (PLEG): container finished" podID="65333a03-4e64-4807-bae2-cb0b1d517e08" 
containerID="270f536638eaa8b7f98eeb988aff10b6756be813c47b340a031fb06692b16f74" exitCode=0 Mar 18 08:58:39 crc kubenswrapper[4917]: I0318 08:58:39.717786 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" event={"ID":"65333a03-4e64-4807-bae2-cb0b1d517e08","Type":"ContainerDied","Data":"270f536638eaa8b7f98eeb988aff10b6756be813c47b340a031fb06692b16f74"} Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.191847 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.301251 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cells-global-config-0\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.301624 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-ssh-key-openstack-cell1\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.301685 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-0\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.301736 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-2\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.301808 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5tdp\" (UniqueName: \"kubernetes.io/projected/65333a03-4e64-4807-bae2-cb0b1d517e08-kube-api-access-p5tdp\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.301856 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-1\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.301900 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-3\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.301952 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-1\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.302024 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-combined-ca-bundle\") pod 
\"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.302122 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-0\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.302163 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-inventory\") pod \"65333a03-4e64-4807-bae2-cb0b1d517e08\" (UID: \"65333a03-4e64-4807-bae2-cb0b1d517e08\") " Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.307010 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65333a03-4e64-4807-bae2-cb0b1d517e08-kube-api-access-p5tdp" (OuterVolumeSpecName: "kube-api-access-p5tdp") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "kube-api-access-p5tdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.307941 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.337432 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.337554 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.338125 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.340251 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-inventory" (OuterVolumeSpecName: "inventory") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.340683 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.340779 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.341243 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.341762 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.360971 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "65333a03-4e64-4807-bae2-cb0b1d517e08" (UID: "65333a03-4e64-4807-bae2-cb0b1d517e08"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405242 4917 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405277 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405287 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5tdp\" (UniqueName: \"kubernetes.io/projected/65333a03-4e64-4807-bae2-cb0b1d517e08-kube-api-access-p5tdp\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405296 4917 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405304 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc 
kubenswrapper[4917]: I0318 08:58:41.405314 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405322 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405332 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405344 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405357 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/65333a03-4e64-4807-bae2-cb0b1d517e08-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.405366 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/65333a03-4e64-4807-bae2-cb0b1d517e08-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.754429 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" 
event={"ID":"65333a03-4e64-4807-bae2-cb0b1d517e08","Type":"ContainerDied","Data":"79957832fea6d6d4c4c828a2a94c735d1cb3ac66f0c3e7196f6e838709cd1e70"} Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.754483 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-v42ph" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.754504 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79957832fea6d6d4c4c828a2a94c735d1cb3ac66f0c3e7196f6e838709cd1e70" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.861736 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-48kxh"] Mar 18 08:58:41 crc kubenswrapper[4917]: E0318 08:58:41.862410 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" containerName="extract-content" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.862437 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" containerName="extract-content" Mar 18 08:58:41 crc kubenswrapper[4917]: E0318 08:58:41.862467 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" containerName="extract-utilities" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.862476 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" containerName="extract-utilities" Mar 18 08:58:41 crc kubenswrapper[4917]: E0318 08:58:41.862497 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0" containerName="oc" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.862507 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0" containerName="oc" Mar 18 08:58:41 crc kubenswrapper[4917]: E0318 08:58:41.862521 4917 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" containerName="registry-server" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.862529 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" containerName="registry-server" Mar 18 08:58:41 crc kubenswrapper[4917]: E0318 08:58:41.862553 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65333a03-4e64-4807-bae2-cb0b1d517e08" containerName="nova-cell1-openstack-openstack-cell1" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.862561 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="65333a03-4e64-4807-bae2-cb0b1d517e08" containerName="nova-cell1-openstack-openstack-cell1" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.862843 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="65333a03-4e64-4807-bae2-cb0b1d517e08" containerName="nova-cell1-openstack-openstack-cell1" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.862866 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb6da239-0296-401d-ad86-a57ec42e96be" containerName="registry-server" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.862895 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0" containerName="oc" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.864007 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.865651 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.866618 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.867071 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.867444 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.867852 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.872329 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-48kxh"] Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.918683 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.918731 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vrnx\" (UniqueName: \"kubernetes.io/projected/cc4c5b61-1b45-4d87-aa89-88daf0227751-kube-api-access-9vrnx\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " 
pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.918778 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.918807 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.918867 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.918912 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:41 crc kubenswrapper[4917]: I0318 08:58:41.918984 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-inventory\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.020463 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.020553 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.020696 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.020786 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: 
\"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.020954 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-inventory\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.021008 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.021052 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vrnx\" (UniqueName: \"kubernetes.io/projected/cc4c5b61-1b45-4d87-aa89-88daf0227751-kube-api-access-9vrnx\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.025942 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.026507 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.026693 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.029427 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-inventory\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.031513 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.034723 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: 
I0318 08:58:42.044499 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vrnx\" (UniqueName: \"kubernetes.io/projected/cc4c5b61-1b45-4d87-aa89-88daf0227751-kube-api-access-9vrnx\") pod \"telemetry-openstack-openstack-cell1-48kxh\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.200305 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 08:58:42 crc kubenswrapper[4917]: I0318 08:58:42.818761 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-48kxh"] Mar 18 08:58:43 crc kubenswrapper[4917]: I0318 08:58:43.785862 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" event={"ID":"cc4c5b61-1b45-4d87-aa89-88daf0227751","Type":"ContainerStarted","Data":"c0b62786110cb52392af6c91547c5562695f0e079b4f51945de1cd9761885e5f"} Mar 18 08:58:43 crc kubenswrapper[4917]: I0318 08:58:43.786229 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" event={"ID":"cc4c5b61-1b45-4d87-aa89-88daf0227751","Type":"ContainerStarted","Data":"e1c55305deec41c719dca7f3f673585689215e2cc7647b1301087d6ebce4f59b"} Mar 18 08:58:43 crc kubenswrapper[4917]: I0318 08:58:43.811278 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" podStartSLOduration=2.37707839 podStartE2EDuration="2.811256315s" podCreationTimestamp="2026-03-18 08:58:41 +0000 UTC" firstStartedPulling="2026-03-18 08:58:42.821394948 +0000 UTC m=+7907.762549662" lastFinishedPulling="2026-03-18 08:58:43.255572863 +0000 UTC m=+7908.196727587" observedRunningTime="2026-03-18 08:58:43.809573814 +0000 UTC m=+7908.750728538" watchObservedRunningTime="2026-03-18 
08:58:43.811256315 +0000 UTC m=+7908.752411049" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.281040 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-csvs9"] Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.283656 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.297103 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csvs9"] Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.432709 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-utilities\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.433269 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-catalog-content\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.433353 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhv8k\" (UniqueName: \"kubernetes.io/projected/ff528591-e783-400f-acde-26b35d3af80e-kube-api-access-nhv8k\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.535720 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-utilities\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.535807 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-catalog-content\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.535835 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhv8k\" (UniqueName: \"kubernetes.io/projected/ff528591-e783-400f-acde-26b35d3af80e-kube-api-access-nhv8k\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.536358 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-utilities\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.536426 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-catalog-content\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.562035 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhv8k\" (UniqueName: 
\"kubernetes.io/projected/ff528591-e783-400f-acde-26b35d3af80e-kube-api-access-nhv8k\") pod \"redhat-operators-csvs9\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:30 crc kubenswrapper[4917]: I0318 08:59:30.611465 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:31 crc kubenswrapper[4917]: I0318 08:59:31.091614 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csvs9"] Mar 18 08:59:31 crc kubenswrapper[4917]: I0318 08:59:31.353166 4917 generic.go:334] "Generic (PLEG): container finished" podID="ff528591-e783-400f-acde-26b35d3af80e" containerID="08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976" exitCode=0 Mar 18 08:59:31 crc kubenswrapper[4917]: I0318 08:59:31.353355 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csvs9" event={"ID":"ff528591-e783-400f-acde-26b35d3af80e","Type":"ContainerDied","Data":"08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976"} Mar 18 08:59:31 crc kubenswrapper[4917]: I0318 08:59:31.353820 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csvs9" event={"ID":"ff528591-e783-400f-acde-26b35d3af80e","Type":"ContainerStarted","Data":"1042a454b4aeb373fc0133df516021ed6372d7c8ca55c34a670e8485d989ab53"} Mar 18 08:59:31 crc kubenswrapper[4917]: I0318 08:59:31.355576 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 08:59:33 crc kubenswrapper[4917]: I0318 08:59:33.373974 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csvs9" event={"ID":"ff528591-e783-400f-acde-26b35d3af80e","Type":"ContainerStarted","Data":"60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8"} Mar 18 08:59:38 crc 
kubenswrapper[4917]: I0318 08:59:38.424107 4917 generic.go:334] "Generic (PLEG): container finished" podID="ff528591-e783-400f-acde-26b35d3af80e" containerID="60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8" exitCode=0 Mar 18 08:59:38 crc kubenswrapper[4917]: I0318 08:59:38.424175 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csvs9" event={"ID":"ff528591-e783-400f-acde-26b35d3af80e","Type":"ContainerDied","Data":"60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8"} Mar 18 08:59:39 crc kubenswrapper[4917]: I0318 08:59:39.437786 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csvs9" event={"ID":"ff528591-e783-400f-acde-26b35d3af80e","Type":"ContainerStarted","Data":"469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e"} Mar 18 08:59:39 crc kubenswrapper[4917]: I0318 08:59:39.460577 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-csvs9" podStartSLOduration=1.8411781999999999 podStartE2EDuration="9.460553031s" podCreationTimestamp="2026-03-18 08:59:30 +0000 UTC" firstStartedPulling="2026-03-18 08:59:31.355313191 +0000 UTC m=+7956.296467915" lastFinishedPulling="2026-03-18 08:59:38.974688022 +0000 UTC m=+7963.915842746" observedRunningTime="2026-03-18 08:59:39.454539745 +0000 UTC m=+7964.395694479" watchObservedRunningTime="2026-03-18 08:59:39.460553031 +0000 UTC m=+7964.401707745" Mar 18 08:59:40 crc kubenswrapper[4917]: I0318 08:59:40.612268 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:40 crc kubenswrapper[4917]: I0318 08:59:40.613528 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:41 crc kubenswrapper[4917]: I0318 08:59:41.659324 4917 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/redhat-operators-csvs9" podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="registry-server" probeResult="failure" output=< Mar 18 08:59:41 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 08:59:41 crc kubenswrapper[4917]: > Mar 18 08:59:50 crc kubenswrapper[4917]: I0318 08:59:50.711137 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:50 crc kubenswrapper[4917]: I0318 08:59:50.799359 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:50 crc kubenswrapper[4917]: I0318 08:59:50.961263 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csvs9"] Mar 18 08:59:52 crc kubenswrapper[4917]: I0318 08:59:52.566484 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-csvs9" podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="registry-server" containerID="cri-o://469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e" gracePeriod=2 Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.095635 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.234526 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-catalog-content\") pod \"ff528591-e783-400f-acde-26b35d3af80e\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.234785 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-utilities\") pod \"ff528591-e783-400f-acde-26b35d3af80e\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.234892 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhv8k\" (UniqueName: \"kubernetes.io/projected/ff528591-e783-400f-acde-26b35d3af80e-kube-api-access-nhv8k\") pod \"ff528591-e783-400f-acde-26b35d3af80e\" (UID: \"ff528591-e783-400f-acde-26b35d3af80e\") " Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.235544 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-utilities" (OuterVolumeSpecName: "utilities") pod "ff528591-e783-400f-acde-26b35d3af80e" (UID: "ff528591-e783-400f-acde-26b35d3af80e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.241634 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff528591-e783-400f-acde-26b35d3af80e-kube-api-access-nhv8k" (OuterVolumeSpecName: "kube-api-access-nhv8k") pod "ff528591-e783-400f-acde-26b35d3af80e" (UID: "ff528591-e783-400f-acde-26b35d3af80e"). InnerVolumeSpecName "kube-api-access-nhv8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.337292 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.337336 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhv8k\" (UniqueName: \"kubernetes.io/projected/ff528591-e783-400f-acde-26b35d3af80e-kube-api-access-nhv8k\") on node \"crc\" DevicePath \"\"" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.374954 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff528591-e783-400f-acde-26b35d3af80e" (UID: "ff528591-e783-400f-acde-26b35d3af80e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.439065 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff528591-e783-400f-acde-26b35d3af80e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.580556 4917 generic.go:334] "Generic (PLEG): container finished" podID="ff528591-e783-400f-acde-26b35d3af80e" containerID="469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e" exitCode=0 Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.580660 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csvs9" event={"ID":"ff528591-e783-400f-acde-26b35d3af80e","Type":"ContainerDied","Data":"469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e"} Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.580711 4917 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-csvs9" event={"ID":"ff528591-e783-400f-acde-26b35d3af80e","Type":"ContainerDied","Data":"1042a454b4aeb373fc0133df516021ed6372d7c8ca55c34a670e8485d989ab53"} Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.580730 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csvs9" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.580749 4917 scope.go:117] "RemoveContainer" containerID="469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.612563 4917 scope.go:117] "RemoveContainer" containerID="60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.625505 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csvs9"] Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.634674 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-csvs9"] Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.667004 4917 scope.go:117] "RemoveContainer" containerID="08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.707132 4917 scope.go:117] "RemoveContainer" containerID="469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e" Mar 18 08:59:53 crc kubenswrapper[4917]: E0318 08:59:53.707655 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e\": container with ID starting with 469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e not found: ID does not exist" containerID="469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.707704 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e"} err="failed to get container status \"469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e\": rpc error: code = NotFound desc = could not find container \"469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e\": container with ID starting with 469b996285680f95927de706f53449bddf2342be8875142d2a996281343fa41e not found: ID does not exist" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.707736 4917 scope.go:117] "RemoveContainer" containerID="60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8" Mar 18 08:59:53 crc kubenswrapper[4917]: E0318 08:59:53.708102 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8\": container with ID starting with 60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8 not found: ID does not exist" containerID="60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.708127 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8"} err="failed to get container status \"60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8\": rpc error: code = NotFound desc = could not find container \"60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8\": container with ID starting with 60a7a04f4b7d5ac0c48c0faae58f09a2ee4a12aef9735ad0fa2ea07512a1c8e8 not found: ID does not exist" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.708142 4917 scope.go:117] "RemoveContainer" containerID="08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976" Mar 18 08:59:53 crc kubenswrapper[4917]: E0318 
08:59:53.708383 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976\": container with ID starting with 08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976 not found: ID does not exist" containerID="08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.708408 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976"} err="failed to get container status \"08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976\": rpc error: code = NotFound desc = could not find container \"08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976\": container with ID starting with 08850d6f8e54399bf725403b7df1b3ea0395b964396e1b9d1f839174e0f98976 not found: ID does not exist" Mar 18 08:59:53 crc kubenswrapper[4917]: I0318 08:59:53.785115 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff528591-e783-400f-acde-26b35d3af80e" path="/var/lib/kubelet/pods/ff528591-e783-400f-acde-26b35d3af80e/volumes" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.164309 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563740-klzkp"] Mar 18 09:00:00 crc kubenswrapper[4917]: E0318 09:00:00.165563 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="extract-utilities" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.165598 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="extract-utilities" Mar 18 09:00:00 crc kubenswrapper[4917]: E0318 09:00:00.165631 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="extract-content" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.165640 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="extract-content" Mar 18 09:00:00 crc kubenswrapper[4917]: E0318 09:00:00.165664 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="registry-server" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.165674 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="registry-server" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.165924 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff528591-e783-400f-acde-26b35d3af80e" containerName="registry-server" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.166884 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563740-klzkp" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.169508 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.169855 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.179131 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.185513 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4"] Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.186862 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.188852 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.198783 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.207374 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563740-klzkp"] Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.217658 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4"] Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.280673 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0af091ff-13bc-4112-a010-f90e3d0aba49-secret-volume\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.281126 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j82cw\" (UniqueName: \"kubernetes.io/projected/0af091ff-13bc-4112-a010-f90e3d0aba49-kube-api-access-j82cw\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.281422 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6dnl\" (UniqueName: 
\"kubernetes.io/projected/4d28518f-85ba-4cf0-adc4-f72e31386ad3-kube-api-access-c6dnl\") pod \"auto-csr-approver-29563740-klzkp\" (UID: \"4d28518f-85ba-4cf0-adc4-f72e31386ad3\") " pod="openshift-infra/auto-csr-approver-29563740-klzkp" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.281573 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0af091ff-13bc-4112-a010-f90e3d0aba49-config-volume\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.384603 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0af091ff-13bc-4112-a010-f90e3d0aba49-config-volume\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.384742 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0af091ff-13bc-4112-a010-f90e3d0aba49-secret-volume\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.384843 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j82cw\" (UniqueName: \"kubernetes.io/projected/0af091ff-13bc-4112-a010-f90e3d0aba49-kube-api-access-j82cw\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 
09:00:00.384930 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6dnl\" (UniqueName: \"kubernetes.io/projected/4d28518f-85ba-4cf0-adc4-f72e31386ad3-kube-api-access-c6dnl\") pod \"auto-csr-approver-29563740-klzkp\" (UID: \"4d28518f-85ba-4cf0-adc4-f72e31386ad3\") " pod="openshift-infra/auto-csr-approver-29563740-klzkp" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.385531 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0af091ff-13bc-4112-a010-f90e3d0aba49-config-volume\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.391553 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0af091ff-13bc-4112-a010-f90e3d0aba49-secret-volume\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.408999 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6dnl\" (UniqueName: \"kubernetes.io/projected/4d28518f-85ba-4cf0-adc4-f72e31386ad3-kube-api-access-c6dnl\") pod \"auto-csr-approver-29563740-klzkp\" (UID: \"4d28518f-85ba-4cf0-adc4-f72e31386ad3\") " pod="openshift-infra/auto-csr-approver-29563740-klzkp" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.409993 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j82cw\" (UniqueName: \"kubernetes.io/projected/0af091ff-13bc-4112-a010-f90e3d0aba49-kube-api-access-j82cw\") pod \"collect-profiles-29563740-86hr4\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.521813 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563740-klzkp" Mar 18 09:00:00 crc kubenswrapper[4917]: I0318 09:00:00.530141 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:01 crc kubenswrapper[4917]: I0318 09:00:01.054761 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563740-klzkp"] Mar 18 09:00:01 crc kubenswrapper[4917]: W0318 09:00:01.064318 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d28518f_85ba_4cf0_adc4_f72e31386ad3.slice/crio-f4b1482a5f6dc5e5062cec342b87c212da4cab9ab2c3adc5b4075cb74bfa15b4 WatchSource:0}: Error finding container f4b1482a5f6dc5e5062cec342b87c212da4cab9ab2c3adc5b4075cb74bfa15b4: Status 404 returned error can't find the container with id f4b1482a5f6dc5e5062cec342b87c212da4cab9ab2c3adc5b4075cb74bfa15b4 Mar 18 09:00:01 crc kubenswrapper[4917]: I0318 09:00:01.070618 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4"] Mar 18 09:00:01 crc kubenswrapper[4917]: I0318 09:00:01.661501 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" event={"ID":"0af091ff-13bc-4112-a010-f90e3d0aba49","Type":"ContainerStarted","Data":"329c3a74061f57365fad1ffb6ec8686d5892482281503168789b768236e9e0ed"} Mar 18 09:00:01 crc kubenswrapper[4917]: I0318 09:00:01.661972 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" 
event={"ID":"0af091ff-13bc-4112-a010-f90e3d0aba49","Type":"ContainerStarted","Data":"36e51b49b8517e9586fe35d7a7caa5d80ccb763f01284c7d231900c4f7c4bcd3"} Mar 18 09:00:01 crc kubenswrapper[4917]: I0318 09:00:01.663915 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563740-klzkp" event={"ID":"4d28518f-85ba-4cf0-adc4-f72e31386ad3","Type":"ContainerStarted","Data":"f4b1482a5f6dc5e5062cec342b87c212da4cab9ab2c3adc5b4075cb74bfa15b4"} Mar 18 09:00:01 crc kubenswrapper[4917]: I0318 09:00:01.689602 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" podStartSLOduration=1.689559589 podStartE2EDuration="1.689559589s" podCreationTimestamp="2026-03-18 09:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:00:01.680635423 +0000 UTC m=+7986.621790137" watchObservedRunningTime="2026-03-18 09:00:01.689559589 +0000 UTC m=+7986.630714323" Mar 18 09:00:02 crc kubenswrapper[4917]: I0318 09:00:02.675920 4917 generic.go:334] "Generic (PLEG): container finished" podID="0af091ff-13bc-4112-a010-f90e3d0aba49" containerID="329c3a74061f57365fad1ffb6ec8686d5892482281503168789b768236e9e0ed" exitCode=0 Mar 18 09:00:02 crc kubenswrapper[4917]: I0318 09:00:02.676151 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" event={"ID":"0af091ff-13bc-4112-a010-f90e3d0aba49","Type":"ContainerDied","Data":"329c3a74061f57365fad1ffb6ec8686d5892482281503168789b768236e9e0ed"} Mar 18 09:00:02 crc kubenswrapper[4917]: I0318 09:00:02.929488 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 18 09:00:02 crc kubenswrapper[4917]: I0318 09:00:02.929614 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.097355 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.178168 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j82cw\" (UniqueName: \"kubernetes.io/projected/0af091ff-13bc-4112-a010-f90e3d0aba49-kube-api-access-j82cw\") pod \"0af091ff-13bc-4112-a010-f90e3d0aba49\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.178212 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0af091ff-13bc-4112-a010-f90e3d0aba49-config-volume\") pod \"0af091ff-13bc-4112-a010-f90e3d0aba49\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.178516 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0af091ff-13bc-4112-a010-f90e3d0aba49-secret-volume\") pod \"0af091ff-13bc-4112-a010-f90e3d0aba49\" (UID: \"0af091ff-13bc-4112-a010-f90e3d0aba49\") " Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.179473 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0af091ff-13bc-4112-a010-f90e3d0aba49-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"0af091ff-13bc-4112-a010-f90e3d0aba49" (UID: "0af091ff-13bc-4112-a010-f90e3d0aba49"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.185285 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0af091ff-13bc-4112-a010-f90e3d0aba49-kube-api-access-j82cw" (OuterVolumeSpecName: "kube-api-access-j82cw") pod "0af091ff-13bc-4112-a010-f90e3d0aba49" (UID: "0af091ff-13bc-4112-a010-f90e3d0aba49"). InnerVolumeSpecName "kube-api-access-j82cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.186866 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0af091ff-13bc-4112-a010-f90e3d0aba49-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0af091ff-13bc-4112-a010-f90e3d0aba49" (UID: "0af091ff-13bc-4112-a010-f90e3d0aba49"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.281016 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0af091ff-13bc-4112-a010-f90e3d0aba49-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.281056 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j82cw\" (UniqueName: \"kubernetes.io/projected/0af091ff-13bc-4112-a010-f90e3d0aba49-kube-api-access-j82cw\") on node \"crc\" DevicePath \"\"" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.281071 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0af091ff-13bc-4112-a010-f90e3d0aba49-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.701098 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" event={"ID":"0af091ff-13bc-4112-a010-f90e3d0aba49","Type":"ContainerDied","Data":"36e51b49b8517e9586fe35d7a7caa5d80ccb763f01284c7d231900c4f7c4bcd3"} Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.701149 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36e51b49b8517e9586fe35d7a7caa5d80ccb763f01284c7d231900c4f7c4bcd3" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.701196 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563740-86hr4" Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.788805 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k"] Mar 18 09:00:04 crc kubenswrapper[4917]: I0318 09:00:04.805416 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563695-b954k"] Mar 18 09:00:05 crc kubenswrapper[4917]: I0318 09:00:05.710642 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563740-klzkp" event={"ID":"4d28518f-85ba-4cf0-adc4-f72e31386ad3","Type":"ContainerStarted","Data":"14c61bb9a27dbd77cb16aa0f74c3ac9f18c1a61d10be01982aa4a2710c585da0"} Mar 18 09:00:05 crc kubenswrapper[4917]: I0318 09:00:05.735472 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563740-klzkp" podStartSLOduration=1.59434269 podStartE2EDuration="5.735450126s" podCreationTimestamp="2026-03-18 09:00:00 +0000 UTC" firstStartedPulling="2026-03-18 09:00:01.069777262 +0000 UTC m=+7986.010931976" lastFinishedPulling="2026-03-18 09:00:05.210884648 +0000 UTC m=+7990.152039412" observedRunningTime="2026-03-18 09:00:05.724199453 +0000 UTC m=+7990.665354177" watchObservedRunningTime="2026-03-18 09:00:05.735450126 +0000 UTC m=+7990.676604850" Mar 18 09:00:05 crc kubenswrapper[4917]: I0318 09:00:05.784203 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a612ff9-5b00-44d7-9966-5fba72c2963b" path="/var/lib/kubelet/pods/2a612ff9-5b00-44d7-9966-5fba72c2963b/volumes" Mar 18 09:00:06 crc kubenswrapper[4917]: I0318 09:00:06.724312 4917 generic.go:334] "Generic (PLEG): container finished" podID="4d28518f-85ba-4cf0-adc4-f72e31386ad3" containerID="14c61bb9a27dbd77cb16aa0f74c3ac9f18c1a61d10be01982aa4a2710c585da0" exitCode=0 Mar 18 09:00:06 crc kubenswrapper[4917]: 
I0318 09:00:06.724370 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563740-klzkp" event={"ID":"4d28518f-85ba-4cf0-adc4-f72e31386ad3","Type":"ContainerDied","Data":"14c61bb9a27dbd77cb16aa0f74c3ac9f18c1a61d10be01982aa4a2710c585da0"} Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.186679 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563740-klzkp" Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.272836 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6dnl\" (UniqueName: \"kubernetes.io/projected/4d28518f-85ba-4cf0-adc4-f72e31386ad3-kube-api-access-c6dnl\") pod \"4d28518f-85ba-4cf0-adc4-f72e31386ad3\" (UID: \"4d28518f-85ba-4cf0-adc4-f72e31386ad3\") " Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.280110 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d28518f-85ba-4cf0-adc4-f72e31386ad3-kube-api-access-c6dnl" (OuterVolumeSpecName: "kube-api-access-c6dnl") pod "4d28518f-85ba-4cf0-adc4-f72e31386ad3" (UID: "4d28518f-85ba-4cf0-adc4-f72e31386ad3"). InnerVolumeSpecName "kube-api-access-c6dnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.375165 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6dnl\" (UniqueName: \"kubernetes.io/projected/4d28518f-85ba-4cf0-adc4-f72e31386ad3-kube-api-access-c6dnl\") on node \"crc\" DevicePath \"\"" Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.750773 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563740-klzkp" event={"ID":"4d28518f-85ba-4cf0-adc4-f72e31386ad3","Type":"ContainerDied","Data":"f4b1482a5f6dc5e5062cec342b87c212da4cab9ab2c3adc5b4075cb74bfa15b4"} Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.750809 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b1482a5f6dc5e5062cec342b87c212da4cab9ab2c3adc5b4075cb74bfa15b4" Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.750863 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563740-klzkp" Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.794632 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563734-vnh84"] Mar 18 09:00:08 crc kubenswrapper[4917]: I0318 09:00:08.804739 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563734-vnh84"] Mar 18 09:00:09 crc kubenswrapper[4917]: I0318 09:00:09.786636 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2614c412-c2de-4410-bf1e-a6b01eecaaff" path="/var/lib/kubelet/pods/2614c412-c2de-4410-bf1e-a6b01eecaaff/volumes" Mar 18 09:00:15 crc kubenswrapper[4917]: I0318 09:00:15.794173 4917 scope.go:117] "RemoveContainer" containerID="4d0274c6dd4cbda94d8d055eca4e35e8ea9d6f36db2ba77958742562e530e48d" Mar 18 09:00:15 crc kubenswrapper[4917]: I0318 09:00:15.829575 4917 scope.go:117] "RemoveContainer" 
containerID="0debe370d574331b34a0547cd908f07e50c5fcd819a01116bcfdf837737af576" Mar 18 09:00:32 crc kubenswrapper[4917]: I0318 09:00:32.929104 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:00:32 crc kubenswrapper[4917]: I0318 09:00:32.929881 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.165349 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29563741-qfnz6"] Mar 18 09:01:00 crc kubenswrapper[4917]: E0318 09:01:00.166491 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0af091ff-13bc-4112-a010-f90e3d0aba49" containerName="collect-profiles" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.166511 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0af091ff-13bc-4112-a010-f90e3d0aba49" containerName="collect-profiles" Mar 18 09:01:00 crc kubenswrapper[4917]: E0318 09:01:00.166558 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d28518f-85ba-4cf0-adc4-f72e31386ad3" containerName="oc" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.166568 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d28518f-85ba-4cf0-adc4-f72e31386ad3" containerName="oc" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.166812 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0af091ff-13bc-4112-a010-f90e3d0aba49" containerName="collect-profiles" Mar 18 09:01:00 crc 
kubenswrapper[4917]: I0318 09:01:00.166851 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d28518f-85ba-4cf0-adc4-f72e31386ad3" containerName="oc" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.167616 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.194197 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563741-qfnz6"] Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.254761 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-config-data\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.254843 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-fernet-keys\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.254928 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22rv7\" (UniqueName: \"kubernetes.io/projected/1a111eec-671e-4a20-9e9a-5ce32fd77146-kube-api-access-22rv7\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.255231 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-combined-ca-bundle\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.356266 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-combined-ca-bundle\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.356326 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-config-data\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.356352 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-fernet-keys\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.356383 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22rv7\" (UniqueName: \"kubernetes.io/projected/1a111eec-671e-4a20-9e9a-5ce32fd77146-kube-api-access-22rv7\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.363878 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-config-data\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.364609 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-combined-ca-bundle\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.364874 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-fernet-keys\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.374390 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22rv7\" (UniqueName: \"kubernetes.io/projected/1a111eec-671e-4a20-9e9a-5ce32fd77146-kube-api-access-22rv7\") pod \"keystone-cron-29563741-qfnz6\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:00 crc kubenswrapper[4917]: I0318 09:01:00.501407 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:01 crc kubenswrapper[4917]: I0318 09:01:01.003359 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29563741-qfnz6"] Mar 18 09:01:01 crc kubenswrapper[4917]: I0318 09:01:01.332279 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563741-qfnz6" event={"ID":"1a111eec-671e-4a20-9e9a-5ce32fd77146","Type":"ContainerStarted","Data":"9a589d19616fccb55fc14d2470053f4b3eeed128eeec5772359d530f37135e44"} Mar 18 09:01:01 crc kubenswrapper[4917]: I0318 09:01:01.332641 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563741-qfnz6" event={"ID":"1a111eec-671e-4a20-9e9a-5ce32fd77146","Type":"ContainerStarted","Data":"273be3ad79b8a03f94a62b4de654c6d302bfeab852152dea4eacb868b2f22185"} Mar 18 09:01:01 crc kubenswrapper[4917]: I0318 09:01:01.357606 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29563741-qfnz6" podStartSLOduration=1.357567134 podStartE2EDuration="1.357567134s" podCreationTimestamp="2026-03-18 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:01:01.351803374 +0000 UTC m=+8046.292958108" watchObservedRunningTime="2026-03-18 09:01:01.357567134 +0000 UTC m=+8046.298721848" Mar 18 09:01:02 crc kubenswrapper[4917]: I0318 09:01:02.929293 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:01:02 crc kubenswrapper[4917]: I0318 09:01:02.929779 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:01:02 crc kubenswrapper[4917]: I0318 09:01:02.929841 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 09:01:02 crc kubenswrapper[4917]: I0318 09:01:02.931164 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a573aaa8431dbcec8312c697a1f722048d6ff3b34cc7b982e4853b7a5ec2a1fe"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:01:02 crc kubenswrapper[4917]: I0318 09:01:02.931260 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://a573aaa8431dbcec8312c697a1f722048d6ff3b34cc7b982e4853b7a5ec2a1fe" gracePeriod=600 Mar 18 09:01:03 crc kubenswrapper[4917]: I0318 09:01:03.354690 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="a573aaa8431dbcec8312c697a1f722048d6ff3b34cc7b982e4853b7a5ec2a1fe" exitCode=0 Mar 18 09:01:03 crc kubenswrapper[4917]: I0318 09:01:03.354786 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"a573aaa8431dbcec8312c697a1f722048d6ff3b34cc7b982e4853b7a5ec2a1fe"} Mar 18 09:01:03 crc kubenswrapper[4917]: I0318 09:01:03.355397 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73"} Mar 18 09:01:03 crc kubenswrapper[4917]: I0318 09:01:03.355451 4917 scope.go:117] "RemoveContainer" containerID="e4c2cb4010dba676f1ca911457aff7b3e3552d8a89fcad2986703a8a08761369" Mar 18 09:01:04 crc kubenswrapper[4917]: I0318 09:01:04.368424 4917 generic.go:334] "Generic (PLEG): container finished" podID="1a111eec-671e-4a20-9e9a-5ce32fd77146" containerID="9a589d19616fccb55fc14d2470053f4b3eeed128eeec5772359d530f37135e44" exitCode=0 Mar 18 09:01:04 crc kubenswrapper[4917]: I0318 09:01:04.368495 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563741-qfnz6" event={"ID":"1a111eec-671e-4a20-9e9a-5ce32fd77146","Type":"ContainerDied","Data":"9a589d19616fccb55fc14d2470053f4b3eeed128eeec5772359d530f37135e44"} Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.760387 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.779735 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-combined-ca-bundle\") pod \"1a111eec-671e-4a20-9e9a-5ce32fd77146\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.780068 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-config-data\") pod \"1a111eec-671e-4a20-9e9a-5ce32fd77146\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.780225 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22rv7\" (UniqueName: \"kubernetes.io/projected/1a111eec-671e-4a20-9e9a-5ce32fd77146-kube-api-access-22rv7\") pod \"1a111eec-671e-4a20-9e9a-5ce32fd77146\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.780343 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-fernet-keys\") pod \"1a111eec-671e-4a20-9e9a-5ce32fd77146\" (UID: \"1a111eec-671e-4a20-9e9a-5ce32fd77146\") " Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.787798 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1a111eec-671e-4a20-9e9a-5ce32fd77146" (UID: "1a111eec-671e-4a20-9e9a-5ce32fd77146"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.793995 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a111eec-671e-4a20-9e9a-5ce32fd77146-kube-api-access-22rv7" (OuterVolumeSpecName: "kube-api-access-22rv7") pod "1a111eec-671e-4a20-9e9a-5ce32fd77146" (UID: "1a111eec-671e-4a20-9e9a-5ce32fd77146"). InnerVolumeSpecName "kube-api-access-22rv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.816975 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a111eec-671e-4a20-9e9a-5ce32fd77146" (UID: "1a111eec-671e-4a20-9e9a-5ce32fd77146"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.842296 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-config-data" (OuterVolumeSpecName: "config-data") pod "1a111eec-671e-4a20-9e9a-5ce32fd77146" (UID: "1a111eec-671e-4a20-9e9a-5ce32fd77146"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.882856 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.883006 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22rv7\" (UniqueName: \"kubernetes.io/projected/1a111eec-671e-4a20-9e9a-5ce32fd77146-kube-api-access-22rv7\") on node \"crc\" DevicePath \"\"" Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.883095 4917 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 18 09:01:05 crc kubenswrapper[4917]: I0318 09:01:05.883161 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a111eec-671e-4a20-9e9a-5ce32fd77146-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:01:06 crc kubenswrapper[4917]: I0318 09:01:06.391680 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29563741-qfnz6" event={"ID":"1a111eec-671e-4a20-9e9a-5ce32fd77146","Type":"ContainerDied","Data":"273be3ad79b8a03f94a62b4de654c6d302bfeab852152dea4eacb868b2f22185"} Mar 18 09:01:06 crc kubenswrapper[4917]: I0318 09:01:06.391994 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="273be3ad79b8a03f94a62b4de654c6d302bfeab852152dea4eacb868b2f22185" Mar 18 09:01:06 crc kubenswrapper[4917]: I0318 09:01:06.392064 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29563741-qfnz6" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.146597 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563742-7zlwq"] Mar 18 09:02:00 crc kubenswrapper[4917]: E0318 09:02:00.147579 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a111eec-671e-4a20-9e9a-5ce32fd77146" containerName="keystone-cron" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.147618 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a111eec-671e-4a20-9e9a-5ce32fd77146" containerName="keystone-cron" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.147908 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a111eec-671e-4a20-9e9a-5ce32fd77146" containerName="keystone-cron" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.148807 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563742-7zlwq" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.152249 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.152301 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.152312 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.160404 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563742-7zlwq"] Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.248776 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbk6s\" (UniqueName: 
\"kubernetes.io/projected/0fb94b62-26ed-48b5-b514-1aa766ad7265-kube-api-access-zbk6s\") pod \"auto-csr-approver-29563742-7zlwq\" (UID: \"0fb94b62-26ed-48b5-b514-1aa766ad7265\") " pod="openshift-infra/auto-csr-approver-29563742-7zlwq" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.352272 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbk6s\" (UniqueName: \"kubernetes.io/projected/0fb94b62-26ed-48b5-b514-1aa766ad7265-kube-api-access-zbk6s\") pod \"auto-csr-approver-29563742-7zlwq\" (UID: \"0fb94b62-26ed-48b5-b514-1aa766ad7265\") " pod="openshift-infra/auto-csr-approver-29563742-7zlwq" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.375260 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbk6s\" (UniqueName: \"kubernetes.io/projected/0fb94b62-26ed-48b5-b514-1aa766ad7265-kube-api-access-zbk6s\") pod \"auto-csr-approver-29563742-7zlwq\" (UID: \"0fb94b62-26ed-48b5-b514-1aa766ad7265\") " pod="openshift-infra/auto-csr-approver-29563742-7zlwq" Mar 18 09:02:00 crc kubenswrapper[4917]: I0318 09:02:00.477109 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563742-7zlwq" Mar 18 09:02:01 crc kubenswrapper[4917]: I0318 09:02:01.008499 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563742-7zlwq"] Mar 18 09:02:02 crc kubenswrapper[4917]: I0318 09:02:02.012804 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563742-7zlwq" event={"ID":"0fb94b62-26ed-48b5-b514-1aa766ad7265","Type":"ContainerStarted","Data":"ebf75e7d0cc0a323de8d701819b3c2313f77b7f7c2a0efaca78de047cac0ef22"} Mar 18 09:02:04 crc kubenswrapper[4917]: I0318 09:02:04.035490 4917 generic.go:334] "Generic (PLEG): container finished" podID="0fb94b62-26ed-48b5-b514-1aa766ad7265" containerID="fc52184b6832c596b2d8c941cf28507e2eda437800fe6abd54f59bf47ad3076a" exitCode=0 Mar 18 09:02:04 crc kubenswrapper[4917]: I0318 09:02:04.035535 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563742-7zlwq" event={"ID":"0fb94b62-26ed-48b5-b514-1aa766ad7265","Type":"ContainerDied","Data":"fc52184b6832c596b2d8c941cf28507e2eda437800fe6abd54f59bf47ad3076a"} Mar 18 09:02:05 crc kubenswrapper[4917]: I0318 09:02:05.412366 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563742-7zlwq" Mar 18 09:02:05 crc kubenswrapper[4917]: I0318 09:02:05.562139 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbk6s\" (UniqueName: \"kubernetes.io/projected/0fb94b62-26ed-48b5-b514-1aa766ad7265-kube-api-access-zbk6s\") pod \"0fb94b62-26ed-48b5-b514-1aa766ad7265\" (UID: \"0fb94b62-26ed-48b5-b514-1aa766ad7265\") " Mar 18 09:02:05 crc kubenswrapper[4917]: I0318 09:02:05.571563 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fb94b62-26ed-48b5-b514-1aa766ad7265-kube-api-access-zbk6s" (OuterVolumeSpecName: "kube-api-access-zbk6s") pod "0fb94b62-26ed-48b5-b514-1aa766ad7265" (UID: "0fb94b62-26ed-48b5-b514-1aa766ad7265"). InnerVolumeSpecName "kube-api-access-zbk6s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:02:05 crc kubenswrapper[4917]: I0318 09:02:05.665124 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbk6s\" (UniqueName: \"kubernetes.io/projected/0fb94b62-26ed-48b5-b514-1aa766ad7265-kube-api-access-zbk6s\") on node \"crc\" DevicePath \"\"" Mar 18 09:02:06 crc kubenswrapper[4917]: I0318 09:02:06.092283 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563742-7zlwq" event={"ID":"0fb94b62-26ed-48b5-b514-1aa766ad7265","Type":"ContainerDied","Data":"ebf75e7d0cc0a323de8d701819b3c2313f77b7f7c2a0efaca78de047cac0ef22"} Mar 18 09:02:06 crc kubenswrapper[4917]: I0318 09:02:06.092880 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebf75e7d0cc0a323de8d701819b3c2313f77b7f7c2a0efaca78de047cac0ef22" Mar 18 09:02:06 crc kubenswrapper[4917]: I0318 09:02:06.093251 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563742-7zlwq" Mar 18 09:02:06 crc kubenswrapper[4917]: I0318 09:02:06.507393 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563736-wqf6d"] Mar 18 09:02:06 crc kubenswrapper[4917]: I0318 09:02:06.521786 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563736-wqf6d"] Mar 18 09:02:07 crc kubenswrapper[4917]: I0318 09:02:07.101892 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc4c5b61-1b45-4d87-aa89-88daf0227751" containerID="c0b62786110cb52392af6c91547c5562695f0e079b4f51945de1cd9761885e5f" exitCode=0 Mar 18 09:02:07 crc kubenswrapper[4917]: I0318 09:02:07.101939 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" event={"ID":"cc4c5b61-1b45-4d87-aa89-88daf0227751","Type":"ContainerDied","Data":"c0b62786110cb52392af6c91547c5562695f0e079b4f51945de1cd9761885e5f"} Mar 18 09:02:07 crc kubenswrapper[4917]: I0318 09:02:07.787294 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1" path="/var/lib/kubelet/pods/a150ab76-b6d8-4160-a5b8-e6ee9ca68ba1/volumes" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.576370 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.741486 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-2\") pod \"cc4c5b61-1b45-4d87-aa89-88daf0227751\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.741564 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-1\") pod \"cc4c5b61-1b45-4d87-aa89-88daf0227751\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.741617 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-0\") pod \"cc4c5b61-1b45-4d87-aa89-88daf0227751\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.741637 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vrnx\" (UniqueName: \"kubernetes.io/projected/cc4c5b61-1b45-4d87-aa89-88daf0227751-kube-api-access-9vrnx\") pod \"cc4c5b61-1b45-4d87-aa89-88daf0227751\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.741774 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-telemetry-combined-ca-bundle\") pod \"cc4c5b61-1b45-4d87-aa89-88daf0227751\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " Mar 18 
09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.741793 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ssh-key-openstack-cell1\") pod \"cc4c5b61-1b45-4d87-aa89-88daf0227751\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.741876 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-inventory\") pod \"cc4c5b61-1b45-4d87-aa89-88daf0227751\" (UID: \"cc4c5b61-1b45-4d87-aa89-88daf0227751\") " Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.747833 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "cc4c5b61-1b45-4d87-aa89-88daf0227751" (UID: "cc4c5b61-1b45-4d87-aa89-88daf0227751"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.766881 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc4c5b61-1b45-4d87-aa89-88daf0227751-kube-api-access-9vrnx" (OuterVolumeSpecName: "kube-api-access-9vrnx") pod "cc4c5b61-1b45-4d87-aa89-88daf0227751" (UID: "cc4c5b61-1b45-4d87-aa89-88daf0227751"). InnerVolumeSpecName "kube-api-access-9vrnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.774144 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "cc4c5b61-1b45-4d87-aa89-88daf0227751" (UID: "cc4c5b61-1b45-4d87-aa89-88daf0227751"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.774476 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "cc4c5b61-1b45-4d87-aa89-88daf0227751" (UID: "cc4c5b61-1b45-4d87-aa89-88daf0227751"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.776541 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-inventory" (OuterVolumeSpecName: "inventory") pod "cc4c5b61-1b45-4d87-aa89-88daf0227751" (UID: "cc4c5b61-1b45-4d87-aa89-88daf0227751"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.778183 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "cc4c5b61-1b45-4d87-aa89-88daf0227751" (UID: "cc4c5b61-1b45-4d87-aa89-88daf0227751"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.788110 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cc4c5b61-1b45-4d87-aa89-88daf0227751" (UID: "cc4c5b61-1b45-4d87-aa89-88daf0227751"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.844133 4917 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.844164 4917 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.844173 4917 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.844183 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vrnx\" (UniqueName: \"kubernetes.io/projected/cc4c5b61-1b45-4d87-aa89-88daf0227751-kube-api-access-9vrnx\") on node \"crc\" DevicePath \"\"" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.844192 4917 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 
18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.844202 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 09:02:08 crc kubenswrapper[4917]: I0318 09:02:08.844211 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc4c5b61-1b45-4d87-aa89-88daf0227751-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.125379 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" event={"ID":"cc4c5b61-1b45-4d87-aa89-88daf0227751","Type":"ContainerDied","Data":"e1c55305deec41c719dca7f3f673585689215e2cc7647b1301087d6ebce4f59b"} Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.125446 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1c55305deec41c719dca7f3f673585689215e2cc7647b1301087d6ebce4f59b" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.125992 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-48kxh" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.219301 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-c2fcp"] Mar 18 09:02:09 crc kubenswrapper[4917]: E0318 09:02:09.219767 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc4c5b61-1b45-4d87-aa89-88daf0227751" containerName="telemetry-openstack-openstack-cell1" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.219785 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc4c5b61-1b45-4d87-aa89-88daf0227751" containerName="telemetry-openstack-openstack-cell1" Mar 18 09:02:09 crc kubenswrapper[4917]: E0318 09:02:09.219808 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fb94b62-26ed-48b5-b514-1aa766ad7265" containerName="oc" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.219815 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fb94b62-26ed-48b5-b514-1aa766ad7265" containerName="oc" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.220052 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fb94b62-26ed-48b5-b514-1aa766ad7265" containerName="oc" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.220071 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc4c5b61-1b45-4d87-aa89-88daf0227751" containerName="telemetry-openstack-openstack-cell1" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.220817 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.223378 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.223404 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.223708 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.224208 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.229325 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.243398 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-c2fcp"] Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.362224 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.362936 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " 
pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.363423 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.363968 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.364265 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pgr7\" (UniqueName: \"kubernetes.io/projected/4d740554-2999-4fcd-89ec-f5377c3db109-kube-api-access-8pgr7\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.467283 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.467697 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.467738 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.467769 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pgr7\" (UniqueName: \"kubernetes.io/projected/4d740554-2999-4fcd-89ec-f5377c3db109-kube-api-access-8pgr7\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.467877 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.473784 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: 
\"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.474291 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.474298 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.474914 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.484327 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pgr7\" (UniqueName: \"kubernetes.io/projected/4d740554-2999-4fcd-89ec-f5377c3db109-kube-api-access-8pgr7\") pod \"neutron-sriov-openstack-openstack-cell1-c2fcp\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:09 crc kubenswrapper[4917]: I0318 09:02:09.541488 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:02:10 crc kubenswrapper[4917]: I0318 09:02:10.112612 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-c2fcp"] Mar 18 09:02:10 crc kubenswrapper[4917]: I0318 09:02:10.139316 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" event={"ID":"4d740554-2999-4fcd-89ec-f5377c3db109","Type":"ContainerStarted","Data":"785f8345cb7ec9b36de0b30fd9aa17b2b22b6bfde4f7da91e3afcf6c55b16c86"} Mar 18 09:02:11 crc kubenswrapper[4917]: I0318 09:02:11.150304 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" event={"ID":"4d740554-2999-4fcd-89ec-f5377c3db109","Type":"ContainerStarted","Data":"df03dee2a4cd14ec8994822ab22ce5d2120121c205a20f99ef185a1f2b20fb60"} Mar 18 09:02:11 crc kubenswrapper[4917]: I0318 09:02:11.173411 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" podStartSLOduration=1.6085963699999999 podStartE2EDuration="2.173392784s" podCreationTimestamp="2026-03-18 09:02:09 +0000 UTC" firstStartedPulling="2026-03-18 09:02:10.112229466 +0000 UTC m=+8115.053384180" lastFinishedPulling="2026-03-18 09:02:10.67702588 +0000 UTC m=+8115.618180594" observedRunningTime="2026-03-18 09:02:11.16827182 +0000 UTC m=+8116.109426524" watchObservedRunningTime="2026-03-18 09:02:11.173392784 +0000 UTC m=+8116.114547498" Mar 18 09:02:16 crc kubenswrapper[4917]: I0318 09:02:16.006247 4917 scope.go:117] "RemoveContainer" containerID="434c9e72d2908e8db3cdb3c80494091728bc164d6a0a963c23317c797d345177" Mar 18 09:03:08 crc kubenswrapper[4917]: I0318 09:03:08.872462 4917 generic.go:334] "Generic (PLEG): container finished" podID="4d740554-2999-4fcd-89ec-f5377c3db109" 
containerID="df03dee2a4cd14ec8994822ab22ce5d2120121c205a20f99ef185a1f2b20fb60" exitCode=0 Mar 18 09:03:08 crc kubenswrapper[4917]: I0318 09:03:08.872529 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" event={"ID":"4d740554-2999-4fcd-89ec-f5377c3db109","Type":"ContainerDied","Data":"df03dee2a4cd14ec8994822ab22ce5d2120121c205a20f99ef185a1f2b20fb60"} Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.377271 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.472312 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-inventory\") pod \"4d740554-2999-4fcd-89ec-f5377c3db109\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.472453 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-combined-ca-bundle\") pod \"4d740554-2999-4fcd-89ec-f5377c3db109\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.472541 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-agent-neutron-config-0\") pod \"4d740554-2999-4fcd-89ec-f5377c3db109\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.472709 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-ssh-key-openstack-cell1\") pod \"4d740554-2999-4fcd-89ec-f5377c3db109\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.472936 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pgr7\" (UniqueName: \"kubernetes.io/projected/4d740554-2999-4fcd-89ec-f5377c3db109-kube-api-access-8pgr7\") pod \"4d740554-2999-4fcd-89ec-f5377c3db109\" (UID: \"4d740554-2999-4fcd-89ec-f5377c3db109\") " Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.477964 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "4d740554-2999-4fcd-89ec-f5377c3db109" (UID: "4d740554-2999-4fcd-89ec-f5377c3db109"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.479041 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d740554-2999-4fcd-89ec-f5377c3db109-kube-api-access-8pgr7" (OuterVolumeSpecName: "kube-api-access-8pgr7") pod "4d740554-2999-4fcd-89ec-f5377c3db109" (UID: "4d740554-2999-4fcd-89ec-f5377c3db109"). InnerVolumeSpecName "kube-api-access-8pgr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.499349 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-inventory" (OuterVolumeSpecName: "inventory") pod "4d740554-2999-4fcd-89ec-f5377c3db109" (UID: "4d740554-2999-4fcd-89ec-f5377c3db109"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.507028 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "4d740554-2999-4fcd-89ec-f5377c3db109" (UID: "4d740554-2999-4fcd-89ec-f5377c3db109"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.515790 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "4d740554-2999-4fcd-89ec-f5377c3db109" (UID: "4d740554-2999-4fcd-89ec-f5377c3db109"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.575285 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.575326 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.575342 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.575356 4917 reconciler_common.go:293] "Volume detached for 
volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/4d740554-2999-4fcd-89ec-f5377c3db109-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.575370 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pgr7\" (UniqueName: \"kubernetes.io/projected/4d740554-2999-4fcd-89ec-f5377c3db109-kube-api-access-8pgr7\") on node \"crc\" DevicePath \"\"" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.894087 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" event={"ID":"4d740554-2999-4fcd-89ec-f5377c3db109","Type":"ContainerDied","Data":"785f8345cb7ec9b36de0b30fd9aa17b2b22b6bfde4f7da91e3afcf6c55b16c86"} Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.894148 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="785f8345cb7ec9b36de0b30fd9aa17b2b22b6bfde4f7da91e3afcf6c55b16c86" Mar 18 09:03:10 crc kubenswrapper[4917]: I0318 09:03:10.894174 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-c2fcp" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.111139 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-ttm74"] Mar 18 09:03:11 crc kubenswrapper[4917]: E0318 09:03:11.111887 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d740554-2999-4fcd-89ec-f5377c3db109" containerName="neutron-sriov-openstack-openstack-cell1" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.111920 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d740554-2999-4fcd-89ec-f5377c3db109" containerName="neutron-sriov-openstack-openstack-cell1" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.112216 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d740554-2999-4fcd-89ec-f5377c3db109" containerName="neutron-sriov-openstack-openstack-cell1" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.113319 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.118021 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.118870 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.119027 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.120127 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.122037 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.129224 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-ttm74"] Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.190380 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.190712 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.190773 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg25g\" (UniqueName: \"kubernetes.io/projected/e81deaf1-264a-4b25-b851-b95a06067a3a-kube-api-access-pg25g\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.190849 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.190906 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.293375 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.293469 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.293575 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.293778 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.293836 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg25g\" (UniqueName: \"kubernetes.io/projected/e81deaf1-264a-4b25-b851-b95a06067a3a-kube-api-access-pg25g\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.297810 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.299807 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.303627 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.305120 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.318648 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg25g\" (UniqueName: \"kubernetes.io/projected/e81deaf1-264a-4b25-b851-b95a06067a3a-kube-api-access-pg25g\") pod \"neutron-dhcp-openstack-openstack-cell1-ttm74\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:11 crc kubenswrapper[4917]: I0318 09:03:11.473453 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:03:12 crc kubenswrapper[4917]: I0318 09:03:12.056534 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-ttm74"] Mar 18 09:03:12 crc kubenswrapper[4917]: I0318 09:03:12.914376 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" event={"ID":"e81deaf1-264a-4b25-b851-b95a06067a3a","Type":"ContainerStarted","Data":"edc46c2a2db2ba7b942c86a82779f1aa1542fa4ba4e20e915f6c8361d77a0423"} Mar 18 09:03:12 crc kubenswrapper[4917]: I0318 09:03:12.914775 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" event={"ID":"e81deaf1-264a-4b25-b851-b95a06067a3a","Type":"ContainerStarted","Data":"0f9184c99bd757e515a72df57bd438561c5edee3410555d179ccdca70ecdf0f5"} Mar 18 09:03:12 crc kubenswrapper[4917]: I0318 09:03:12.959779 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" podStartSLOduration=1.482666479 podStartE2EDuration="1.959750035s" podCreationTimestamp="2026-03-18 09:03:11 +0000 UTC" firstStartedPulling="2026-03-18 09:03:12.06562655 +0000 UTC m=+8177.006781264" lastFinishedPulling="2026-03-18 09:03:12.542710106 +0000 UTC m=+8177.483864820" observedRunningTime="2026-03-18 09:03:12.9455071 +0000 UTC m=+8177.886661834" watchObservedRunningTime="2026-03-18 09:03:12.959750035 +0000 UTC m=+8177.900904809" Mar 18 09:03:32 crc kubenswrapper[4917]: I0318 09:03:32.928692 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:03:32 crc kubenswrapper[4917]: I0318 09:03:32.929527 4917 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.162440 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563744-69l8h"] Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.164813 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-69l8h" Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.169488 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.169817 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.170026 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.187788 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-69l8h"] Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.222905 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgfpr\" (UniqueName: \"kubernetes.io/projected/a4235a41-9c05-481f-a1b6-2b95d083027f-kube-api-access-wgfpr\") pod \"auto-csr-approver-29563744-69l8h\" (UID: \"a4235a41-9c05-481f-a1b6-2b95d083027f\") " pod="openshift-infra/auto-csr-approver-29563744-69l8h" Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.325343 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgfpr\" 
(UniqueName: \"kubernetes.io/projected/a4235a41-9c05-481f-a1b6-2b95d083027f-kube-api-access-wgfpr\") pod \"auto-csr-approver-29563744-69l8h\" (UID: \"a4235a41-9c05-481f-a1b6-2b95d083027f\") " pod="openshift-infra/auto-csr-approver-29563744-69l8h" Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.352174 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgfpr\" (UniqueName: \"kubernetes.io/projected/a4235a41-9c05-481f-a1b6-2b95d083027f-kube-api-access-wgfpr\") pod \"auto-csr-approver-29563744-69l8h\" (UID: \"a4235a41-9c05-481f-a1b6-2b95d083027f\") " pod="openshift-infra/auto-csr-approver-29563744-69l8h" Mar 18 09:04:00 crc kubenswrapper[4917]: I0318 09:04:00.498176 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-69l8h" Mar 18 09:04:01 crc kubenswrapper[4917]: I0318 09:04:01.011931 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-69l8h"] Mar 18 09:04:01 crc kubenswrapper[4917]: W0318 09:04:01.022700 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4235a41_9c05_481f_a1b6_2b95d083027f.slice/crio-06cd3ea60dbc6570c3450f0cd93ffb7f45bd75c9560381c730919d3abe77c77f WatchSource:0}: Error finding container 06cd3ea60dbc6570c3450f0cd93ffb7f45bd75c9560381c730919d3abe77c77f: Status 404 returned error can't find the container with id 06cd3ea60dbc6570c3450f0cd93ffb7f45bd75c9560381c730919d3abe77c77f Mar 18 09:04:01 crc kubenswrapper[4917]: I0318 09:04:01.481561 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-69l8h" event={"ID":"a4235a41-9c05-481f-a1b6-2b95d083027f","Type":"ContainerStarted","Data":"06cd3ea60dbc6570c3450f0cd93ffb7f45bd75c9560381c730919d3abe77c77f"} Mar 18 09:04:02 crc kubenswrapper[4917]: I0318 09:04:02.929327 4917 patch_prober.go:28] interesting 
pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:04:02 crc kubenswrapper[4917]: I0318 09:04:02.929797 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:04:03 crc kubenswrapper[4917]: I0318 09:04:03.510761 4917 generic.go:334] "Generic (PLEG): container finished" podID="a4235a41-9c05-481f-a1b6-2b95d083027f" containerID="ca630a3de1c7610996d21582a17f3b5e3d2a0ebdccfb3b752865b73ccdcdc930" exitCode=0 Mar 18 09:04:03 crc kubenswrapper[4917]: I0318 09:04:03.510839 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-69l8h" event={"ID":"a4235a41-9c05-481f-a1b6-2b95d083027f","Type":"ContainerDied","Data":"ca630a3de1c7610996d21582a17f3b5e3d2a0ebdccfb3b752865b73ccdcdc930"} Mar 18 09:04:04 crc kubenswrapper[4917]: I0318 09:04:04.900485 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-69l8h" Mar 18 09:04:04 crc kubenswrapper[4917]: I0318 09:04:04.953810 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgfpr\" (UniqueName: \"kubernetes.io/projected/a4235a41-9c05-481f-a1b6-2b95d083027f-kube-api-access-wgfpr\") pod \"a4235a41-9c05-481f-a1b6-2b95d083027f\" (UID: \"a4235a41-9c05-481f-a1b6-2b95d083027f\") " Mar 18 09:04:04 crc kubenswrapper[4917]: I0318 09:04:04.976786 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4235a41-9c05-481f-a1b6-2b95d083027f-kube-api-access-wgfpr" (OuterVolumeSpecName: "kube-api-access-wgfpr") pod "a4235a41-9c05-481f-a1b6-2b95d083027f" (UID: "a4235a41-9c05-481f-a1b6-2b95d083027f"). InnerVolumeSpecName "kube-api-access-wgfpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:04:05 crc kubenswrapper[4917]: I0318 09:04:05.056479 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgfpr\" (UniqueName: \"kubernetes.io/projected/a4235a41-9c05-481f-a1b6-2b95d083027f-kube-api-access-wgfpr\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:05 crc kubenswrapper[4917]: I0318 09:04:05.543111 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563744-69l8h" event={"ID":"a4235a41-9c05-481f-a1b6-2b95d083027f","Type":"ContainerDied","Data":"06cd3ea60dbc6570c3450f0cd93ffb7f45bd75c9560381c730919d3abe77c77f"} Mar 18 09:04:05 crc kubenswrapper[4917]: I0318 09:04:05.543175 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06cd3ea60dbc6570c3450f0cd93ffb7f45bd75c9560381c730919d3abe77c77f" Mar 18 09:04:05 crc kubenswrapper[4917]: I0318 09:04:05.543182 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563744-69l8h" Mar 18 09:04:05 crc kubenswrapper[4917]: I0318 09:04:05.983175 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563738-f8pln"] Mar 18 09:04:05 crc kubenswrapper[4917]: I0318 09:04:05.994091 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563738-f8pln"] Mar 18 09:04:07 crc kubenswrapper[4917]: I0318 09:04:07.786918 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0" path="/var/lib/kubelet/pods/23ee4cf1-0e0d-48cb-a49e-ba90fc4f24b0/volumes" Mar 18 09:04:16 crc kubenswrapper[4917]: I0318 09:04:16.118007 4917 scope.go:117] "RemoveContainer" containerID="bc032706519eed172391d6b4e007f9991f6db6e91c0db146c38cf0083773161f" Mar 18 09:04:29 crc kubenswrapper[4917]: I0318 09:04:29.815672 4917 generic.go:334] "Generic (PLEG): container finished" podID="e81deaf1-264a-4b25-b851-b95a06067a3a" containerID="edc46c2a2db2ba7b942c86a82779f1aa1542fa4ba4e20e915f6c8361d77a0423" exitCode=0 Mar 18 09:04:29 crc kubenswrapper[4917]: I0318 09:04:29.816125 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" event={"ID":"e81deaf1-264a-4b25-b851-b95a06067a3a","Type":"ContainerDied","Data":"edc46c2a2db2ba7b942c86a82779f1aa1542fa4ba4e20e915f6c8361d77a0423"} Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.445074 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.550869 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg25g\" (UniqueName: \"kubernetes.io/projected/e81deaf1-264a-4b25-b851-b95a06067a3a-kube-api-access-pg25g\") pod \"e81deaf1-264a-4b25-b851-b95a06067a3a\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.550976 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-ssh-key-openstack-cell1\") pod \"e81deaf1-264a-4b25-b851-b95a06067a3a\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.551077 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-inventory\") pod \"e81deaf1-264a-4b25-b851-b95a06067a3a\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.551136 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-agent-neutron-config-0\") pod \"e81deaf1-264a-4b25-b851-b95a06067a3a\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.551177 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-combined-ca-bundle\") pod \"e81deaf1-264a-4b25-b851-b95a06067a3a\" (UID: \"e81deaf1-264a-4b25-b851-b95a06067a3a\") " Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 
09:04:31.574853 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "e81deaf1-264a-4b25-b851-b95a06067a3a" (UID: "e81deaf1-264a-4b25-b851-b95a06067a3a"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.581914 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81deaf1-264a-4b25-b851-b95a06067a3a-kube-api-access-pg25g" (OuterVolumeSpecName: "kube-api-access-pg25g") pod "e81deaf1-264a-4b25-b851-b95a06067a3a" (UID: "e81deaf1-264a-4b25-b851-b95a06067a3a"). InnerVolumeSpecName "kube-api-access-pg25g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.590693 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-inventory" (OuterVolumeSpecName: "inventory") pod "e81deaf1-264a-4b25-b851-b95a06067a3a" (UID: "e81deaf1-264a-4b25-b851-b95a06067a3a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.592643 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "e81deaf1-264a-4b25-b851-b95a06067a3a" (UID: "e81deaf1-264a-4b25-b851-b95a06067a3a"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.601039 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e81deaf1-264a-4b25-b851-b95a06067a3a" (UID: "e81deaf1-264a-4b25-b851-b95a06067a3a"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.653461 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pg25g\" (UniqueName: \"kubernetes.io/projected/e81deaf1-264a-4b25-b851-b95a06067a3a-kube-api-access-pg25g\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.653500 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.653510 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.653529 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.653541 4917 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81deaf1-264a-4b25-b851-b95a06067a3a-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 
09:04:31.854814 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" event={"ID":"e81deaf1-264a-4b25-b851-b95a06067a3a","Type":"ContainerDied","Data":"0f9184c99bd757e515a72df57bd438561c5edee3410555d179ccdca70ecdf0f5"} Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.854860 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9184c99bd757e515a72df57bd438561c5edee3410555d179ccdca70ecdf0f5" Mar 18 09:04:31 crc kubenswrapper[4917]: I0318 09:04:31.854902 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-ttm74" Mar 18 09:04:32 crc kubenswrapper[4917]: I0318 09:04:32.928813 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:04:32 crc kubenswrapper[4917]: I0318 09:04:32.929203 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:04:32 crc kubenswrapper[4917]: I0318 09:04:32.929269 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 09:04:32 crc kubenswrapper[4917]: I0318 09:04:32.930235 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73"} 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:04:32 crc kubenswrapper[4917]: I0318 09:04:32.930317 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" gracePeriod=600 Mar 18 09:04:33 crc kubenswrapper[4917]: E0318 09:04:33.066331 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:04:33 crc kubenswrapper[4917]: I0318 09:04:33.879905 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" exitCode=0 Mar 18 09:04:33 crc kubenswrapper[4917]: I0318 09:04:33.879956 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73"} Mar 18 09:04:33 crc kubenswrapper[4917]: I0318 09:04:33.879993 4917 scope.go:117] "RemoveContainer" containerID="a573aaa8431dbcec8312c697a1f722048d6ff3b34cc7b982e4853b7a5ec2a1fe" Mar 18 09:04:33 crc kubenswrapper[4917]: I0318 09:04:33.880810 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 
18 09:04:33 crc kubenswrapper[4917]: E0318 09:04:33.881274 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:04:37 crc kubenswrapper[4917]: I0318 09:04:37.617622 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 09:04:37 crc kubenswrapper[4917]: I0318 09:04:37.618237 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="fb39efb5-af75-4cea-a6c7-68f10cf1732c" containerName="nova-cell0-conductor-conductor" containerID="cri-o://57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4" gracePeriod=30 Mar 18 09:04:37 crc kubenswrapper[4917]: I0318 09:04:37.647772 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 18 09:04:37 crc kubenswrapper[4917]: I0318 09:04:37.648029 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="4b871c34-192d-4fe7-84ca-d2728b5f3900" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d6958e39261689a5e8457d30ee0c9a46f632a159b5914fabb8a8b980ae467217" gracePeriod=30 Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.305424 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"] Mar 18 09:04:38 crc kubenswrapper[4917]: E0318 09:04:38.307712 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81deaf1-264a-4b25-b851-b95a06067a3a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 18 09:04:38 crc 
kubenswrapper[4917]: I0318 09:04:38.307746 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81deaf1-264a-4b25-b851-b95a06067a3a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 18 09:04:38 crc kubenswrapper[4917]: E0318 09:04:38.307764 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4235a41-9c05-481f-a1b6-2b95d083027f" containerName="oc" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.307773 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4235a41-9c05-481f-a1b6-2b95d083027f" containerName="oc" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.308028 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81deaf1-264a-4b25-b851-b95a06067a3a" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.308060 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4235a41-9c05-481f-a1b6-2b95d083027f" containerName="oc" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.308938 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.314374 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.314458 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.314780 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.315855 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-ddvxp" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.317386 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.318020 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.318280 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.321312 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"] Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.398426 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.398504 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.398710 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.398958 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.399047 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vcvv\" (UniqueName: \"kubernetes.io/projected/68b25068-0099-413b-8a35-d218abba4a8c-kube-api-access-8vcvv\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.399148 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/68b25068-0099-413b-8a35-d218abba4a8c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.399220 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.399262 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.399298 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") 
" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.399388 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.399657 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.473784 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.474009 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-log" containerID="cri-o://438f26f42f32838aafb20712ca0eac3c5b39291fc19fc6bc965d2d5fb9b1359d" gracePeriod=30 Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.474141 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-api" containerID="cri-o://7fc041ac2395b2c811e5ffcfa31fb4c9ae542eadaba3a16769fb5c397fc14505" gracePeriod=30 Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.492795 4917 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.493009 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4c71fad2-8395-413b-8028-b10578eb45e5" containerName="nova-scheduler-scheduler" containerID="cri-o://56729c5d3ea1726d96d4077b5709e46532c09fde7bfad3870041bb8fde4dd330" gracePeriod=30 Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502004 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502107 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502138 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vcvv\" (UniqueName: \"kubernetes.io/projected/68b25068-0099-413b-8a35-d218abba4a8c-kube-api-access-8vcvv\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502166 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/68b25068-0099-413b-8a35-d218abba4a8c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502212 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502241 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502270 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502302 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502356 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502423 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.502493 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.509027 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.509229 4917 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/nova-metadata-0" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-log" containerID="cri-o://af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd" gracePeriod=30 Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.509668 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.509695 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-metadata" containerID="cri-o://0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a" gracePeriod=30 Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.510023 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.510836 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:04:38 crc 
kubenswrapper[4917]: I0318 09:04:38.511748 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.513158 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/68b25068-0099-413b-8a35-d218abba4a8c-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.514960 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.516731 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.517086 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.531941 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.533369 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.550961 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vcvv\" (UniqueName: \"kubernetes.io/projected/68b25068-0099-413b-8a35-d218abba4a8c-kube-api-access-8vcvv\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.640708 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.934439 4917 generic.go:334] "Generic (PLEG): container finished" podID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerID="438f26f42f32838aafb20712ca0eac3c5b39291fc19fc6bc965d2d5fb9b1359d" exitCode=143
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.934750 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af","Type":"ContainerDied","Data":"438f26f42f32838aafb20712ca0eac3c5b39291fc19fc6bc965d2d5fb9b1359d"}
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.938840 4917 generic.go:334] "Generic (PLEG): container finished" podID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerID="af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd" exitCode=143
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.938930 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c74b75-1b23-437f-bdbf-6888f4087bbb","Type":"ContainerDied","Data":"af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd"}
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.942520 4917 generic.go:334] "Generic (PLEG): container finished" podID="4b871c34-192d-4fe7-84ca-d2728b5f3900" containerID="d6958e39261689a5e8457d30ee0c9a46f632a159b5914fabb8a8b980ae467217" exitCode=0
Mar 18 09:04:38 crc kubenswrapper[4917]: I0318 09:04:38.942549 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4b871c34-192d-4fe7-84ca-d2728b5f3900","Type":"ContainerDied","Data":"d6958e39261689a5e8457d30ee0c9a46f632a159b5914fabb8a8b980ae467217"}
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.294872 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.411700 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w"]
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.420126 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.427637 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-config-data\") pod \"4b871c34-192d-4fe7-84ca-d2728b5f3900\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") "
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.427874 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c88tg\" (UniqueName: \"kubernetes.io/projected/4b871c34-192d-4fe7-84ca-d2728b5f3900-kube-api-access-c88tg\") pod \"4b871c34-192d-4fe7-84ca-d2728b5f3900\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") "
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.427972 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-combined-ca-bundle\") pod \"4b871c34-192d-4fe7-84ca-d2728b5f3900\" (UID: \"4b871c34-192d-4fe7-84ca-d2728b5f3900\") "
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.434814 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b871c34-192d-4fe7-84ca-d2728b5f3900-kube-api-access-c88tg" (OuterVolumeSpecName: "kube-api-access-c88tg") pod "4b871c34-192d-4fe7-84ca-d2728b5f3900" (UID: "4b871c34-192d-4fe7-84ca-d2728b5f3900"). InnerVolumeSpecName "kube-api-access-c88tg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.465845 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-config-data" (OuterVolumeSpecName: "config-data") pod "4b871c34-192d-4fe7-84ca-d2728b5f3900" (UID: "4b871c34-192d-4fe7-84ca-d2728b5f3900"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:04:39 crc kubenswrapper[4917]: E0318 09:04:39.480287 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56729c5d3ea1726d96d4077b5709e46532c09fde7bfad3870041bb8fde4dd330" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.481475 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b871c34-192d-4fe7-84ca-d2728b5f3900" (UID: "4b871c34-192d-4fe7-84ca-d2728b5f3900"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:04:39 crc kubenswrapper[4917]: E0318 09:04:39.483074 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56729c5d3ea1726d96d4077b5709e46532c09fde7bfad3870041bb8fde4dd330" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 09:04:39 crc kubenswrapper[4917]: E0318 09:04:39.484704 4917 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="56729c5d3ea1726d96d4077b5709e46532c09fde7bfad3870041bb8fde4dd330" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 18 09:04:39 crc kubenswrapper[4917]: E0318 09:04:39.484745 4917 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4c71fad2-8395-413b-8028-b10578eb45e5" containerName="nova-scheduler-scheduler"
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.530697 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c88tg\" (UniqueName: \"kubernetes.io/projected/4b871c34-192d-4fe7-84ca-d2728b5f3900-kube-api-access-c88tg\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.531029 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.531044 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b871c34-192d-4fe7-84ca-d2728b5f3900-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.961762 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.961759 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4b871c34-192d-4fe7-84ca-d2728b5f3900","Type":"ContainerDied","Data":"8e87d7dfd210988cb5e80527460bb96a736e2242d9b6629d3049f8878d9b9692"}
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.961964 4917 scope.go:117] "RemoveContainer" containerID="d6958e39261689a5e8457d30ee0c9a46f632a159b5914fabb8a8b980ae467217"
Mar 18 09:04:39 crc kubenswrapper[4917]: I0318 09:04:39.966289 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" event={"ID":"68b25068-0099-413b-8a35-d218abba4a8c","Type":"ContainerStarted","Data":"c638f4e7c4ac2a97db57f46c3d9dc674b5d7ca0bad8b024ac47606b4938728d5"}
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.000879 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.031818 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.046198 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 09:04:40 crc kubenswrapper[4917]: E0318 09:04:40.046960 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b871c34-192d-4fe7-84ca-d2728b5f3900" containerName="nova-cell1-conductor-conductor"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.046982 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b871c34-192d-4fe7-84ca-d2728b5f3900" containerName="nova-cell1-conductor-conductor"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.047313 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b871c34-192d-4fe7-84ca-d2728b5f3900" containerName="nova-cell1-conductor-conductor"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.048424 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.051275 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.061941 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.145284 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa5863-119c-460a-8ba2-e02c385319a9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.145383 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdmsx\" (UniqueName: \"kubernetes.io/projected/4faa5863-119c-460a-8ba2-e02c385319a9-kube-api-access-tdmsx\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.145748 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa5863-119c-460a-8ba2-e02c385319a9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.249050 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa5863-119c-460a-8ba2-e02c385319a9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.249130 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdmsx\" (UniqueName: \"kubernetes.io/projected/4faa5863-119c-460a-8ba2-e02c385319a9-kube-api-access-tdmsx\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.249283 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa5863-119c-460a-8ba2-e02c385319a9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.257371 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa5863-119c-460a-8ba2-e02c385319a9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.259932 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa5863-119c-460a-8ba2-e02c385319a9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.275140 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdmsx\" (UniqueName: \"kubernetes.io/projected/4faa5863-119c-460a-8ba2-e02c385319a9-kube-api-access-tdmsx\") pod \"nova-cell1-conductor-0\" (UID: \"4faa5863-119c-460a-8ba2-e02c385319a9\") " pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.374648 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:40 crc kubenswrapper[4917]: W0318 09:04:40.887915 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4faa5863_119c_460a_8ba2_e02c385319a9.slice/crio-2b29accffe98e7564ae84756dcad80948b87852834b7643371cd89b5a52e215e WatchSource:0}: Error finding container 2b29accffe98e7564ae84756dcad80948b87852834b7643371cd89b5a52e215e: Status 404 returned error can't find the container with id 2b29accffe98e7564ae84756dcad80948b87852834b7643371cd89b5a52e215e
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.890623 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 18 09:04:40 crc kubenswrapper[4917]: I0318 09:04:40.981845 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4faa5863-119c-460a-8ba2-e02c385319a9","Type":"ContainerStarted","Data":"2b29accffe98e7564ae84756dcad80948b87852834b7643371cd89b5a52e215e"}
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.409953 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.577018 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgtrt\" (UniqueName: \"kubernetes.io/projected/fb39efb5-af75-4cea-a6c7-68f10cf1732c-kube-api-access-pgtrt\") pod \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") "
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.577826 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-combined-ca-bundle\") pod \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") "
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.578179 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-config-data\") pod \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\" (UID: \"fb39efb5-af75-4cea-a6c7-68f10cf1732c\") "
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.582528 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb39efb5-af75-4cea-a6c7-68f10cf1732c-kube-api-access-pgtrt" (OuterVolumeSpecName: "kube-api-access-pgtrt") pod "fb39efb5-af75-4cea-a6c7-68f10cf1732c" (UID: "fb39efb5-af75-4cea-a6c7-68f10cf1732c"). InnerVolumeSpecName "kube-api-access-pgtrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.603768 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-config-data" (OuterVolumeSpecName: "config-data") pod "fb39efb5-af75-4cea-a6c7-68f10cf1732c" (UID: "fb39efb5-af75-4cea-a6c7-68f10cf1732c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.625828 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb39efb5-af75-4cea-a6c7-68f10cf1732c" (UID: "fb39efb5-af75-4cea-a6c7-68f10cf1732c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.681722 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.681764 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgtrt\" (UniqueName: \"kubernetes.io/projected/fb39efb5-af75-4cea-a6c7-68f10cf1732c-kube-api-access-pgtrt\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.681782 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb39efb5-af75-4cea-a6c7-68f10cf1732c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:41 crc kubenswrapper[4917]: I0318 09:04:41.786861 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b871c34-192d-4fe7-84ca-d2728b5f3900" path="/var/lib/kubelet/pods/4b871c34-192d-4fe7-84ca-d2728b5f3900/volumes"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.021250 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4faa5863-119c-460a-8ba2-e02c385319a9","Type":"ContainerStarted","Data":"87be822d62ef5e066ffc4ca0e5d30181cd349e57c1d6f0ba822b1100b3d262e3"}
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.023613 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.025420 4917 generic.go:334] "Generic (PLEG): container finished" podID="fb39efb5-af75-4cea-a6c7-68f10cf1732c" containerID="57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4" exitCode=0
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.025554 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.025566 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fb39efb5-af75-4cea-a6c7-68f10cf1732c","Type":"ContainerDied","Data":"57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4"}
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.025637 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fb39efb5-af75-4cea-a6c7-68f10cf1732c","Type":"ContainerDied","Data":"617e4b560411148dab62f735a40f03ea9fcbbd4213545cfc85a843a1c0b9766c"}
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.025671 4917 scope.go:117] "RemoveContainer" containerID="57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.033219 4917 generic.go:334] "Generic (PLEG): container finished" podID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerID="7fc041ac2395b2c811e5ffcfa31fb4c9ae542eadaba3a16769fb5c397fc14505" exitCode=0
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.033272 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af","Type":"ContainerDied","Data":"7fc041ac2395b2c811e5ffcfa31fb4c9ae542eadaba3a16769fb5c397fc14505"}
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.061977 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.061955137 podStartE2EDuration="3.061955137s" podCreationTimestamp="2026-03-18 09:04:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:04:42.059290103 +0000 UTC m=+8267.000444817" watchObservedRunningTime="2026-03-18 09:04:42.061955137 +0000 UTC m=+8267.003109851"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.205932 4917 scope.go:117] "RemoveContainer" containerID="57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4"
Mar 18 09:04:42 crc kubenswrapper[4917]: E0318 09:04:42.210169 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4\": container with ID starting with 57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4 not found: ID does not exist" containerID="57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.210222 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4"} err="failed to get container status \"57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4\": rpc error: code = NotFound desc = could not find container \"57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4\": container with ID starting with 57d6162e903ed434f275bb7c860074b2c190030b3692451febb86b72228c84a4 not found: ID does not exist"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.218513 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.222647 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.236454 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.259709 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 09:04:42 crc kubenswrapper[4917]: E0318 09:04:42.260177 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-api"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.260195 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-api"
Mar 18 09:04:42 crc kubenswrapper[4917]: E0318 09:04:42.260210 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb39efb5-af75-4cea-a6c7-68f10cf1732c" containerName="nova-cell0-conductor-conductor"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.260217 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb39efb5-af75-4cea-a6c7-68f10cf1732c" containerName="nova-cell0-conductor-conductor"
Mar 18 09:04:42 crc kubenswrapper[4917]: E0318 09:04:42.260243 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-log"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.260249 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-log"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.260458 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb39efb5-af75-4cea-a6c7-68f10cf1732c" containerName="nova-cell0-conductor-conductor"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.260489 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-log"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.260500 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" containerName="nova-api-api"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.261171 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.261254 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.262901 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.305494 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-combined-ca-bundle\") pod \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.305559 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-config-data\") pod \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.305707 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-public-tls-certs\") pod \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.305785 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qfq8\" (UniqueName: \"kubernetes.io/projected/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-kube-api-access-5qfq8\") pod \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.305904 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-logs\") pod \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.305980 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-internal-tls-certs\") pod \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\" (UID: \"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.308218 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-logs" (OuterVolumeSpecName: "logs") pod "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" (UID: "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.321794 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-kube-api-access-5qfq8" (OuterVolumeSpecName: "kube-api-access-5qfq8") pod "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" (UID: "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af"). InnerVolumeSpecName "kube-api-access-5qfq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.333840 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" (UID: "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.339216 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-config-data" (OuterVolumeSpecName: "config-data") pod "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" (UID: "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.380255 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" (UID: "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.389685 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" (UID: "3c9c11ff-fe6c-4b81-9414-aa5fa66e21af"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.393102 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.407877 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8b2c27-8097-4e77-a187-bcb38a11b60f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.408084 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8b2c27-8097-4e77-a187-bcb38a11b60f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.408425 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gk45\" (UniqueName: \"kubernetes.io/projected/5a8b2c27-8097-4e77-a187-bcb38a11b60f-kube-api-access-4gk45\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0"
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.408552 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-config-data\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.408638 4917 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.408707 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qfq8\" (UniqueName: \"kubernetes.io/projected/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-kube-api-access-5qfq8\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.408763 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-logs\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.408832 4917 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.408902 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.512954 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-nova-metadata-tls-certs\") pod \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.513350 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c74b75-1b23-437f-bdbf-6888f4087bbb-logs\") pod \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.513492 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdr56\" (UniqueName: \"kubernetes.io/projected/e5c74b75-1b23-437f-bdbf-6888f4087bbb-kube-api-access-hdr56\") pod \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.513550 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-combined-ca-bundle\") pod \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.513630 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-config-data\") pod \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\" (UID: \"e5c74b75-1b23-437f-bdbf-6888f4087bbb\") "
Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.513709 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5c74b75-1b23-437f-bdbf-6888f4087bbb-logs" (OuterVolumeSpecName: "logs") pod "e5c74b75-1b23-437f-bdbf-6888f4087bbb" (UID: "e5c74b75-1b23-437f-bdbf-6888f4087bbb"). InnerVolumeSpecName "logs".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.514032 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8b2c27-8097-4e77-a187-bcb38a11b60f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.514211 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8b2c27-8097-4e77-a187-bcb38a11b60f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.514326 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gk45\" (UniqueName: \"kubernetes.io/projected/5a8b2c27-8097-4e77-a187-bcb38a11b60f-kube-api-access-4gk45\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.514505 4917 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5c74b75-1b23-437f-bdbf-6888f4087bbb-logs\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.517405 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c74b75-1b23-437f-bdbf-6888f4087bbb-kube-api-access-hdr56" (OuterVolumeSpecName: "kube-api-access-hdr56") pod "e5c74b75-1b23-437f-bdbf-6888f4087bbb" (UID: "e5c74b75-1b23-437f-bdbf-6888f4087bbb"). InnerVolumeSpecName "kube-api-access-hdr56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.518051 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a8b2c27-8097-4e77-a187-bcb38a11b60f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.522466 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a8b2c27-8097-4e77-a187-bcb38a11b60f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.539718 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gk45\" (UniqueName: \"kubernetes.io/projected/5a8b2c27-8097-4e77-a187-bcb38a11b60f-kube-api-access-4gk45\") pod \"nova-cell0-conductor-0\" (UID: \"5a8b2c27-8097-4e77-a187-bcb38a11b60f\") " pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.556663 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-config-data" (OuterVolumeSpecName: "config-data") pod "e5c74b75-1b23-437f-bdbf-6888f4087bbb" (UID: "e5c74b75-1b23-437f-bdbf-6888f4087bbb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.579322 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5c74b75-1b23-437f-bdbf-6888f4087bbb" (UID: "e5c74b75-1b23-437f-bdbf-6888f4087bbb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.593886 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e5c74b75-1b23-437f-bdbf-6888f4087bbb" (UID: "e5c74b75-1b23-437f-bdbf-6888f4087bbb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.600136 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.615848 4917 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.615882 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdr56\" (UniqueName: \"kubernetes.io/projected/e5c74b75-1b23-437f-bdbf-6888f4087bbb-kube-api-access-hdr56\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.615895 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:42 crc kubenswrapper[4917]: I0318 09:04:42.615910 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c74b75-1b23-437f-bdbf-6888f4087bbb-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.055061 4917 generic.go:334] "Generic (PLEG): container finished" podID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" 
containerID="0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a" exitCode=0 Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.055120 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.055140 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c74b75-1b23-437f-bdbf-6888f4087bbb","Type":"ContainerDied","Data":"0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a"} Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.057830 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5c74b75-1b23-437f-bdbf-6888f4087bbb","Type":"ContainerDied","Data":"b070cc99e96f703a2548a0ab380d8a8b20c85ba95cbd6dd9e4698d8b03b0dc29"} Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.057864 4917 scope.go:117] "RemoveContainer" containerID="0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.076396 4917 generic.go:334] "Generic (PLEG): container finished" podID="4c71fad2-8395-413b-8028-b10578eb45e5" containerID="56729c5d3ea1726d96d4077b5709e46532c09fde7bfad3870041bb8fde4dd330" exitCode=0 Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.076498 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c71fad2-8395-413b-8028-b10578eb45e5","Type":"ContainerDied","Data":"56729c5d3ea1726d96d4077b5709e46532c09fde7bfad3870041bb8fde4dd330"} Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.080003 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" event={"ID":"68b25068-0099-413b-8a35-d218abba4a8c","Type":"ContainerStarted","Data":"b58887cb3dc11904d0a88979bacbc9ed68ebbcbdfaf99dc52e972df4928897c4"} Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 
09:04:43.093368 4917 scope.go:117] "RemoveContainer" containerID="af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.108152 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.112122 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3c9c11ff-fe6c-4b81-9414-aa5fa66e21af","Type":"ContainerDied","Data":"52a2c202febb5a69b862b1c09325f15d64703bb73f7869f33db702a20467a465"} Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.119747 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.132555 4917 scope.go:117] "RemoveContainer" containerID="0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a" Mar 18 09:04:43 crc kubenswrapper[4917]: E0318 09:04:43.135960 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a\": container with ID starting with 0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a not found: ID does not exist" containerID="0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.136003 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a"} err="failed to get container status \"0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a\": rpc error: code = NotFound desc = could not find container \"0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a\": container with ID starting with 0a60b94c942b85da880e0652635d634d2cfdbf873777b262b05ef0eff89fbb2a not found: ID does not exist" Mar 18 
09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.136026 4917 scope.go:117] "RemoveContainer" containerID="af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd" Mar 18 09:04:43 crc kubenswrapper[4917]: E0318 09:04:43.138955 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd\": container with ID starting with af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd not found: ID does not exist" containerID="af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.138997 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd"} err="failed to get container status \"af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd\": rpc error: code = NotFound desc = could not find container \"af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd\": container with ID starting with af9a0b07a7e74ab43a8529162580a04db822c8a7b8dbf0fae7d242cf34f74abd not found: ID does not exist" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.139018 4917 scope.go:117] "RemoveContainer" containerID="7fc041ac2395b2c811e5ffcfa31fb4c9ae542eadaba3a16769fb5c397fc14505" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.175989 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" podStartSLOduration=2.226075642 podStartE2EDuration="5.175964597s" podCreationTimestamp="2026-03-18 09:04:38 +0000 UTC" firstStartedPulling="2026-03-18 09:04:39.419860861 +0000 UTC m=+8264.361015585" lastFinishedPulling="2026-03-18 09:04:42.369749826 +0000 UTC m=+8267.310904540" observedRunningTime="2026-03-18 09:04:43.099070082 +0000 UTC m=+8268.040224806" 
watchObservedRunningTime="2026-03-18 09:04:43.175964597 +0000 UTC m=+8268.117119321" Mar 18 09:04:43 crc kubenswrapper[4917]: W0318 09:04:43.200907 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a8b2c27_8097_4e77_a187_bcb38a11b60f.slice/crio-df2e9d701e0e19cd72772eb5fe17934e6be672940966ae2a7569fc1499e61755 WatchSource:0}: Error finding container df2e9d701e0e19cd72772eb5fe17934e6be672940966ae2a7569fc1499e61755: Status 404 returned error can't find the container with id df2e9d701e0e19cd72772eb5fe17934e6be672940966ae2a7569fc1499e61755 Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.213191 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.222371 4917 scope.go:117] "RemoveContainer" containerID="438f26f42f32838aafb20712ca0eac3c5b39291fc19fc6bc965d2d5fb9b1359d" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.234363 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.244793 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: E0318 09:04:43.245458 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-metadata" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.245470 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-metadata" Mar 18 09:04:43 crc kubenswrapper[4917]: E0318 09:04:43.245488 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-log" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.245494 4917 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-log" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.245733 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-metadata" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.245753 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" containerName="nova-metadata-log" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.250778 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.254041 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.254336 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.258903 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.276979 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.297887 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.303701 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.306163 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.310211 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.310752 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.310805 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.317689 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.332626 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-config-data\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.332698 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.332729 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.332791 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a66cf9-c416-4b82-99f9-7f389cb14e79-logs\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.333061 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bplsf\" (UniqueName: \"kubernetes.io/projected/f6a66cf9-c416-4b82-99f9-7f389cb14e79-kube-api-access-bplsf\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.388333 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.435525 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.435579 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.435662 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a66cf9-c416-4b82-99f9-7f389cb14e79-logs\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.435708 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.435739 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nngr4\" (UniqueName: \"kubernetes.io/projected/b0b28978-b862-4cb1-9af3-bf8e71479244-kube-api-access-nngr4\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.435869 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-config-data\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.435901 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bplsf\" (UniqueName: \"kubernetes.io/projected/f6a66cf9-c416-4b82-99f9-7f389cb14e79-kube-api-access-bplsf\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.435980 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.436043 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b0b28978-b862-4cb1-9af3-bf8e71479244-logs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.436118 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-public-tls-certs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.436680 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-config-data\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.439924 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a66cf9-c416-4b82-99f9-7f389cb14e79-logs\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.441402 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.444639 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-config-data\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 
09:04:43.444975 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a66cf9-c416-4b82-99f9-7f389cb14e79-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.453287 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bplsf\" (UniqueName: \"kubernetes.io/projected/f6a66cf9-c416-4b82-99f9-7f389cb14e79-kube-api-access-bplsf\") pod \"nova-metadata-0\" (UID: \"f6a66cf9-c416-4b82-99f9-7f389cb14e79\") " pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.538179 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szsqs\" (UniqueName: \"kubernetes.io/projected/4c71fad2-8395-413b-8028-b10578eb45e5-kube-api-access-szsqs\") pod \"4c71fad2-8395-413b-8028-b10578eb45e5\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.538257 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-config-data\") pod \"4c71fad2-8395-413b-8028-b10578eb45e5\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.538447 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-combined-ca-bundle\") pod \"4c71fad2-8395-413b-8028-b10578eb45e5\" (UID: \"4c71fad2-8395-413b-8028-b10578eb45e5\") " Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.538968 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.539016 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nngr4\" (UniqueName: \"kubernetes.io/projected/b0b28978-b862-4cb1-9af3-bf8e71479244-kube-api-access-nngr4\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.539098 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-config-data\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.539157 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.539205 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0b28978-b862-4cb1-9af3-bf8e71479244-logs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.539233 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-public-tls-certs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.541640 
4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0b28978-b862-4cb1-9af3-bf8e71479244-logs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.542244 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c71fad2-8395-413b-8028-b10578eb45e5-kube-api-access-szsqs" (OuterVolumeSpecName: "kube-api-access-szsqs") pod "4c71fad2-8395-413b-8028-b10578eb45e5" (UID: "4c71fad2-8395-413b-8028-b10578eb45e5"). InnerVolumeSpecName "kube-api-access-szsqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.544232 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.544700 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-config-data\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.545426 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-public-tls-certs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.545814 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b0b28978-b862-4cb1-9af3-bf8e71479244-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.562648 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nngr4\" (UniqueName: \"kubernetes.io/projected/b0b28978-b862-4cb1-9af3-bf8e71479244-kube-api-access-nngr4\") pod \"nova-api-0\" (UID: \"b0b28978-b862-4cb1-9af3-bf8e71479244\") " pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.566901 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-config-data" (OuterVolumeSpecName: "config-data") pod "4c71fad2-8395-413b-8028-b10578eb45e5" (UID: "4c71fad2-8395-413b-8028-b10578eb45e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.571372 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c71fad2-8395-413b-8028-b10578eb45e5" (UID: "4c71fad2-8395-413b-8028-b10578eb45e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.641730 4917 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.641773 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szsqs\" (UniqueName: \"kubernetes.io/projected/4c71fad2-8395-413b-8028-b10578eb45e5-kube-api-access-szsqs\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.641788 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c71fad2-8395-413b-8028-b10578eb45e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.669730 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.685194 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.798060 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c9c11ff-fe6c-4b81-9414-aa5fa66e21af" path="/var/lib/kubelet/pods/3c9c11ff-fe6c-4b81-9414-aa5fa66e21af/volumes" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.804635 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c74b75-1b23-437f-bdbf-6888f4087bbb" path="/var/lib/kubelet/pods/e5c74b75-1b23-437f-bdbf-6888f4087bbb/volumes" Mar 18 09:04:43 crc kubenswrapper[4917]: I0318 09:04:43.805630 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb39efb5-af75-4cea-a6c7-68f10cf1732c" path="/var/lib/kubelet/pods/fb39efb5-af75-4cea-a6c7-68f10cf1732c/volumes" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.122887 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5a8b2c27-8097-4e77-a187-bcb38a11b60f","Type":"ContainerStarted","Data":"bb1117341edb8e2b3ff2954e82fad34bc9aaed7847b7fc4c0227854dd900d4d1"} Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.123008 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5a8b2c27-8097-4e77-a187-bcb38a11b60f","Type":"ContainerStarted","Data":"df2e9d701e0e19cd72772eb5fe17934e6be672940966ae2a7569fc1499e61755"} Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.123041 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.127165 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.127243 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4c71fad2-8395-413b-8028-b10578eb45e5","Type":"ContainerDied","Data":"79eee33b579434d1e3c51a87416451de3d11bf8e0abd9cc4d612b942978d53cb"} Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.127305 4917 scope.go:117] "RemoveContainer" containerID="56729c5d3ea1726d96d4077b5709e46532c09fde7bfad3870041bb8fde4dd330" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.151955 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.151931638 podStartE2EDuration="2.151931638s" podCreationTimestamp="2026-03-18 09:04:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:04:44.13927567 +0000 UTC m=+8269.080430384" watchObservedRunningTime="2026-03-18 09:04:44.151931638 +0000 UTC m=+8269.093086362" Mar 18 09:04:44 crc kubenswrapper[4917]: W0318 09:04:44.190834 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6a66cf9_c416_4b82_99f9_7f389cb14e79.slice/crio-adc1d999cae6b5b2271105e0feb71c47e7cdc69a1fcbbdc502c53b0e82c5a91e WatchSource:0}: Error finding container adc1d999cae6b5b2271105e0feb71c47e7cdc69a1fcbbdc502c53b0e82c5a91e: Status 404 returned error can't find the container with id adc1d999cae6b5b2271105e0feb71c47e7cdc69a1fcbbdc502c53b0e82c5a91e Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.209809 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.231384 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 
09:04:44.274492 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.290501 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:04:44 crc kubenswrapper[4917]: E0318 09:04:44.290932 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c71fad2-8395-413b-8028-b10578eb45e5" containerName="nova-scheduler-scheduler" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.290949 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c71fad2-8395-413b-8028-b10578eb45e5" containerName="nova-scheduler-scheduler" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.291153 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c71fad2-8395-413b-8028-b10578eb45e5" containerName="nova-scheduler-scheduler" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.291903 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.293563 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.301283 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.313668 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.370079 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-config-data\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.370246 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.370321 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdcs\" (UniqueName: \"kubernetes.io/projected/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-kube-api-access-hqdcs\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.472005 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-config-data\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.472770 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.473030 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdcs\" (UniqueName: \"kubernetes.io/projected/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-kube-api-access-hqdcs\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.478432 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.480521 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-config-data\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.493133 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdcs\" (UniqueName: \"kubernetes.io/projected/3b3ccdd1-1e4c-4671-b8cd-c950889b24be-kube-api-access-hqdcs\") pod \"nova-scheduler-0\" (UID: \"3b3ccdd1-1e4c-4671-b8cd-c950889b24be\") " pod="openstack/nova-scheduler-0" Mar 18 09:04:44 crc kubenswrapper[4917]: I0318 09:04:44.566327 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.085367 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.149365 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b0b28978-b862-4cb1-9af3-bf8e71479244","Type":"ContainerStarted","Data":"5ca54cfca049aa20a7f3c358fa7ac7ed1ac1987ad76cb633aa1ce35f6663e6ca"} Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.149485 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b0b28978-b862-4cb1-9af3-bf8e71479244","Type":"ContainerStarted","Data":"12dd231e3ac719a7bdc7188eb2b2e14acd9b029c08d5541dc753b03066ad400e"} Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.149502 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b0b28978-b862-4cb1-9af3-bf8e71479244","Type":"ContainerStarted","Data":"da451cd0a438852f9c9bd8ffd40a2b237b3853b3610069abfa0d17df4dfacd84"} Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.151055 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3b3ccdd1-1e4c-4671-b8cd-c950889b24be","Type":"ContainerStarted","Data":"f2409fe4a0a053c7f19a7b3fc4b5281f632ff44f8ac71e65ea939db71146110c"} Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.154798 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6a66cf9-c416-4b82-99f9-7f389cb14e79","Type":"ContainerStarted","Data":"206fe4cf918abcc79fbc12eff8d9ec9904fe236980a3d720d1471ae7f28a0008"} Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.154837 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f6a66cf9-c416-4b82-99f9-7f389cb14e79","Type":"ContainerStarted","Data":"c77d0b008dff2280504880cfb58330905e522436a7959b83356b396fae7e595c"} Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.154847 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f6a66cf9-c416-4b82-99f9-7f389cb14e79","Type":"ContainerStarted","Data":"adc1d999cae6b5b2271105e0feb71c47e7cdc69a1fcbbdc502c53b0e82c5a91e"} Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.183690 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.183668361 podStartE2EDuration="2.183668361s" podCreationTimestamp="2026-03-18 09:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:04:45.169838025 +0000 UTC m=+8270.110992749" watchObservedRunningTime="2026-03-18 09:04:45.183668361 +0000 UTC m=+8270.124823085" Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.191478 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.19146392 podStartE2EDuration="2.19146392s" podCreationTimestamp="2026-03-18 09:04:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:04:45.190931217 +0000 UTC m=+8270.132085921" watchObservedRunningTime="2026-03-18 09:04:45.19146392 +0000 UTC m=+8270.132618634" Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.775347 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:04:45 crc kubenswrapper[4917]: E0318 09:04:45.776276 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:04:45 crc kubenswrapper[4917]: I0318 09:04:45.787422 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c71fad2-8395-413b-8028-b10578eb45e5" path="/var/lib/kubelet/pods/4c71fad2-8395-413b-8028-b10578eb45e5/volumes" Mar 18 09:04:46 crc kubenswrapper[4917]: I0318 09:04:46.166863 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3b3ccdd1-1e4c-4671-b8cd-c950889b24be","Type":"ContainerStarted","Data":"b3a2f9a162e40851726c250095144226a22fc63bb5430d42e6b2b0b69e11eea9"} Mar 18 09:04:46 crc kubenswrapper[4917]: I0318 09:04:46.193550 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.193531793 podStartE2EDuration="2.193531793s" podCreationTimestamp="2026-03-18 09:04:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:04:46.186990304 +0000 UTC m=+8271.128145018" watchObservedRunningTime="2026-03-18 09:04:46.193531793 +0000 UTC m=+8271.134686497" Mar 18 09:04:49 crc kubenswrapper[4917]: I0318 09:04:49.567070 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 18 09:04:50 crc kubenswrapper[4917]: I0318 09:04:50.423967 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 18 09:04:52 crc kubenswrapper[4917]: I0318 09:04:52.654092 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 18 09:04:53 crc kubenswrapper[4917]: I0318 09:04:53.670921 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 09:04:53 crc kubenswrapper[4917]: I0318 09:04:53.670991 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 18 09:04:53 crc kubenswrapper[4917]: I0318 09:04:53.685663 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:04:53 crc kubenswrapper[4917]: I0318 09:04:53.685732 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 18 09:04:54 crc kubenswrapper[4917]: I0318 09:04:54.567425 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 18 09:04:54 crc kubenswrapper[4917]: I0318 09:04:54.596438 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 18 09:04:54 crc kubenswrapper[4917]: I0318 09:04:54.683802 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f6a66cf9-c416-4b82-99f9-7f389cb14e79" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.238:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:04:54 crc kubenswrapper[4917]: I0318 09:04:54.683917 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f6a66cf9-c416-4b82-99f9-7f389cb14e79" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.238:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:04:54 crc kubenswrapper[4917]: I0318 09:04:54.697744 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b0b28978-b862-4cb1-9af3-bf8e71479244" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.239:8774/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Mar 18 09:04:54 crc kubenswrapper[4917]: I0318 09:04:54.697766 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b0b28978-b862-4cb1-9af3-bf8e71479244" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.239:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 18 09:04:55 crc kubenswrapper[4917]: I0318 09:04:55.300644 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 18 09:04:58 crc kubenswrapper[4917]: I0318 09:04:58.774123 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:04:58 crc kubenswrapper[4917]: E0318 09:04:58.775041 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:05:01 crc kubenswrapper[4917]: I0318 09:05:01.670872 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 09:05:01 crc kubenswrapper[4917]: I0318 09:05:01.671431 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 18 09:05:01 crc kubenswrapper[4917]: I0318 09:05:01.685832 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 09:05:01 crc kubenswrapper[4917]: I0318 09:05:01.685901 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 18 09:05:03 crc kubenswrapper[4917]: I0318 09:05:03.683861 4917 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 09:05:03 crc kubenswrapper[4917]: I0318 09:05:03.685568 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 18 09:05:03 crc kubenswrapper[4917]: I0318 09:05:03.692522 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 09:05:03 crc kubenswrapper[4917]: I0318 09:05:03.693668 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 09:05:03 crc kubenswrapper[4917]: I0318 09:05:03.698901 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 18 09:05:03 crc kubenswrapper[4917]: I0318 09:05:03.702799 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 09:05:04 crc kubenswrapper[4917]: I0318 09:05:04.377082 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 18 09:05:04 crc kubenswrapper[4917]: I0318 09:05:04.389850 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 18 09:05:11 crc kubenswrapper[4917]: I0318 09:05:11.772897 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:05:11 crc kubenswrapper[4917]: E0318 09:05:11.773977 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:05:23 crc kubenswrapper[4917]: I0318 09:05:23.773116 4917 scope.go:117] "RemoveContainer" 
containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:05:23 crc kubenswrapper[4917]: E0318 09:05:23.773936 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:05:36 crc kubenswrapper[4917]: I0318 09:05:36.773119 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:05:36 crc kubenswrapper[4917]: E0318 09:05:36.773936 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:05:43 crc kubenswrapper[4917]: I0318 09:05:43.734053 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-czmn9"] Mar 18 09:05:43 crc kubenswrapper[4917]: I0318 09:05:43.738864 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:43 crc kubenswrapper[4917]: I0318 09:05:43.750122 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-czmn9"] Mar 18 09:05:43 crc kubenswrapper[4917]: I0318 09:05:43.864789 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdtsg\" (UniqueName: \"kubernetes.io/projected/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-kube-api-access-sdtsg\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:43 crc kubenswrapper[4917]: I0318 09:05:43.865121 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-utilities\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:43 crc kubenswrapper[4917]: I0318 09:05:43.865178 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-catalog-content\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:44 crc kubenswrapper[4917]: I0318 09:05:44.685252 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-utilities\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:44 crc kubenswrapper[4917]: I0318 09:05:44.685566 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-catalog-content\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:44 crc kubenswrapper[4917]: I0318 09:05:44.685715 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdtsg\" (UniqueName: \"kubernetes.io/projected/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-kube-api-access-sdtsg\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:44 crc kubenswrapper[4917]: I0318 09:05:44.686990 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-utilities\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:44 crc kubenswrapper[4917]: I0318 09:05:44.687788 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-catalog-content\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:44 crc kubenswrapper[4917]: I0318 09:05:44.743625 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdtsg\" (UniqueName: \"kubernetes.io/projected/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-kube-api-access-sdtsg\") pod \"certified-operators-czmn9\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:44 crc kubenswrapper[4917]: I0318 09:05:44.973519 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:45 crc kubenswrapper[4917]: I0318 09:05:45.451099 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-czmn9"] Mar 18 09:05:45 crc kubenswrapper[4917]: I0318 09:05:45.885605 4917 generic.go:334] "Generic (PLEG): container finished" podID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerID="4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5" exitCode=0 Mar 18 09:05:45 crc kubenswrapper[4917]: I0318 09:05:45.885650 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czmn9" event={"ID":"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0","Type":"ContainerDied","Data":"4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5"} Mar 18 09:05:45 crc kubenswrapper[4917]: I0318 09:05:45.887021 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czmn9" event={"ID":"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0","Type":"ContainerStarted","Data":"32c74277fa00716ae2c91a30eab99a9f8b4678a2242140c6d41c6fbf540b19e2"} Mar 18 09:05:47 crc kubenswrapper[4917]: I0318 09:05:47.912090 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czmn9" event={"ID":"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0","Type":"ContainerStarted","Data":"9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2"} Mar 18 09:05:48 crc kubenswrapper[4917]: E0318 09:05:48.707496 4917 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaab33a3f_ee0c_4bec_8d31_3c090a3b43e0.slice/crio-9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2.scope\": RecentStats: unable to find data in memory cache]" Mar 18 09:05:48 crc kubenswrapper[4917]: I0318 09:05:48.773411 4917 scope.go:117] "RemoveContainer" 
containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:05:48 crc kubenswrapper[4917]: E0318 09:05:48.773986 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:05:48 crc kubenswrapper[4917]: I0318 09:05:48.923438 4917 generic.go:334] "Generic (PLEG): container finished" podID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerID="9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2" exitCode=0 Mar 18 09:05:48 crc kubenswrapper[4917]: I0318 09:05:48.923498 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czmn9" event={"ID":"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0","Type":"ContainerDied","Data":"9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2"} Mar 18 09:05:49 crc kubenswrapper[4917]: I0318 09:05:49.935531 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czmn9" event={"ID":"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0","Type":"ContainerStarted","Data":"1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2"} Mar 18 09:05:49 crc kubenswrapper[4917]: I0318 09:05:49.958769 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-czmn9" podStartSLOduration=3.403499069 podStartE2EDuration="6.958750861s" podCreationTimestamp="2026-03-18 09:05:43 +0000 UTC" firstStartedPulling="2026-03-18 09:05:45.887378526 +0000 UTC m=+8330.828533240" lastFinishedPulling="2026-03-18 09:05:49.442630318 +0000 UTC m=+8334.383785032" observedRunningTime="2026-03-18 09:05:49.955425001 +0000 
UTC m=+8334.896579715" watchObservedRunningTime="2026-03-18 09:05:49.958750861 +0000 UTC m=+8334.899905575" Mar 18 09:05:54 crc kubenswrapper[4917]: I0318 09:05:54.973792 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:54 crc kubenswrapper[4917]: I0318 09:05:54.974428 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:55 crc kubenswrapper[4917]: I0318 09:05:55.075714 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:55 crc kubenswrapper[4917]: I0318 09:05:55.146308 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:55 crc kubenswrapper[4917]: I0318 09:05:55.315722 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-czmn9"] Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.022492 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-czmn9" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerName="registry-server" containerID="cri-o://1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2" gracePeriod=2 Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.547551 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.690251 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-utilities\") pod \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.690340 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdtsg\" (UniqueName: \"kubernetes.io/projected/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-kube-api-access-sdtsg\") pod \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.690366 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-catalog-content\") pod \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\" (UID: \"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0\") " Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.691192 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-utilities" (OuterVolumeSpecName: "utilities") pod "aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" (UID: "aab33a3f-ee0c-4bec-8d31-3c090a3b43e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.697938 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-kube-api-access-sdtsg" (OuterVolumeSpecName: "kube-api-access-sdtsg") pod "aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" (UID: "aab33a3f-ee0c-4bec-8d31-3c090a3b43e0"). InnerVolumeSpecName "kube-api-access-sdtsg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.793288 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.793452 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdtsg\" (UniqueName: \"kubernetes.io/projected/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-kube-api-access-sdtsg\") on node \"crc\" DevicePath \"\"" Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.931775 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" (UID: "aab33a3f-ee0c-4bec-8d31-3c090a3b43e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:05:57 crc kubenswrapper[4917]: I0318 09:05:57.997822 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.036379 4917 generic.go:334] "Generic (PLEG): container finished" podID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerID="1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2" exitCode=0 Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.036422 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-czmn9" event={"ID":"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0","Type":"ContainerDied","Data":"1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2"} Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.036448 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-czmn9" event={"ID":"aab33a3f-ee0c-4bec-8d31-3c090a3b43e0","Type":"ContainerDied","Data":"32c74277fa00716ae2c91a30eab99a9f8b4678a2242140c6d41c6fbf540b19e2"} Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.036464 4917 scope.go:117] "RemoveContainer" containerID="1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.036613 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-czmn9" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.090552 4917 scope.go:117] "RemoveContainer" containerID="9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.094733 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-czmn9"] Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.117157 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-czmn9"] Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.132213 4917 scope.go:117] "RemoveContainer" containerID="4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.184105 4917 scope.go:117] "RemoveContainer" containerID="1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2" Mar 18 09:05:58 crc kubenswrapper[4917]: E0318 09:05:58.184605 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2\": container with ID starting with 1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2 not found: ID does not exist" containerID="1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 
09:05:58.184655 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2"} err="failed to get container status \"1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2\": rpc error: code = NotFound desc = could not find container \"1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2\": container with ID starting with 1914da2838d0107c2004e55f05c766f0649eafeac53e307d5e9f6e136bfe9ef2 not found: ID does not exist" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.184685 4917 scope.go:117] "RemoveContainer" containerID="9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2" Mar 18 09:05:58 crc kubenswrapper[4917]: E0318 09:05:58.185109 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2\": container with ID starting with 9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2 not found: ID does not exist" containerID="9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.185165 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2"} err="failed to get container status \"9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2\": rpc error: code = NotFound desc = could not find container \"9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2\": container with ID starting with 9cd25c3a52842ca77d7fea2c7375652cdf58a2fed79c7825d717c0ae4d10d9c2 not found: ID does not exist" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.185203 4917 scope.go:117] "RemoveContainer" containerID="4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5" Mar 18 09:05:58 crc 
kubenswrapper[4917]: E0318 09:05:58.185529 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5\": container with ID starting with 4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5 not found: ID does not exist" containerID="4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5" Mar 18 09:05:58 crc kubenswrapper[4917]: I0318 09:05:58.185567 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5"} err="failed to get container status \"4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5\": rpc error: code = NotFound desc = could not find container \"4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5\": container with ID starting with 4734d367a255f6b6b3ec2bdbdfe0411aa7c8b29e6543820876be4ae2c3932df5 not found: ID does not exist" Mar 18 09:05:59 crc kubenswrapper[4917]: I0318 09:05:59.772399 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:05:59 crc kubenswrapper[4917]: E0318 09:05:59.772912 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:05:59 crc kubenswrapper[4917]: I0318 09:05:59.785986 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" path="/var/lib/kubelet/pods/aab33a3f-ee0c-4bec-8d31-3c090a3b43e0/volumes" Mar 18 09:06:00 crc 
kubenswrapper[4917]: I0318 09:06:00.168967 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563746-fr5st"] Mar 18 09:06:00 crc kubenswrapper[4917]: E0318 09:06:00.170028 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerName="extract-utilities" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.170155 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerName="extract-utilities" Mar 18 09:06:00 crc kubenswrapper[4917]: E0318 09:06:00.170243 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerName="extract-content" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.170321 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerName="extract-content" Mar 18 09:06:00 crc kubenswrapper[4917]: E0318 09:06:00.170409 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerName="registry-server" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.170477 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerName="registry-server" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.170854 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab33a3f-ee0c-4bec-8d31-3c090a3b43e0" containerName="registry-server" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.171870 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-fr5st" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.175439 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.175789 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.175869 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.183661 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-fr5st"] Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.353045 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhxfh\" (UniqueName: \"kubernetes.io/projected/4eeb8e32-2533-486b-91e4-d733eaafb70d-kube-api-access-rhxfh\") pod \"auto-csr-approver-29563746-fr5st\" (UID: \"4eeb8e32-2533-486b-91e4-d733eaafb70d\") " pod="openshift-infra/auto-csr-approver-29563746-fr5st" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.455264 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhxfh\" (UniqueName: \"kubernetes.io/projected/4eeb8e32-2533-486b-91e4-d733eaafb70d-kube-api-access-rhxfh\") pod \"auto-csr-approver-29563746-fr5st\" (UID: \"4eeb8e32-2533-486b-91e4-d733eaafb70d\") " pod="openshift-infra/auto-csr-approver-29563746-fr5st" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.475298 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhxfh\" (UniqueName: \"kubernetes.io/projected/4eeb8e32-2533-486b-91e4-d733eaafb70d-kube-api-access-rhxfh\") pod \"auto-csr-approver-29563746-fr5st\" (UID: \"4eeb8e32-2533-486b-91e4-d733eaafb70d\") " 
pod="openshift-infra/auto-csr-approver-29563746-fr5st" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.494839 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-fr5st" Mar 18 09:06:00 crc kubenswrapper[4917]: I0318 09:06:00.998929 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-fr5st"] Mar 18 09:06:01 crc kubenswrapper[4917]: W0318 09:06:01.005881 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4eeb8e32_2533_486b_91e4_d733eaafb70d.slice/crio-e6787dea5aa09aadcb66cfb8b968fd0bd91939b537c62220da4e1119bd18558b WatchSource:0}: Error finding container e6787dea5aa09aadcb66cfb8b968fd0bd91939b537c62220da4e1119bd18558b: Status 404 returned error can't find the container with id e6787dea5aa09aadcb66cfb8b968fd0bd91939b537c62220da4e1119bd18558b Mar 18 09:06:01 crc kubenswrapper[4917]: I0318 09:06:01.077870 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-fr5st" event={"ID":"4eeb8e32-2533-486b-91e4-d733eaafb70d","Type":"ContainerStarted","Data":"e6787dea5aa09aadcb66cfb8b968fd0bd91939b537c62220da4e1119bd18558b"} Mar 18 09:06:03 crc kubenswrapper[4917]: I0318 09:06:03.098508 4917 generic.go:334] "Generic (PLEG): container finished" podID="4eeb8e32-2533-486b-91e4-d733eaafb70d" containerID="c763888a30b0ead848dcecfeefb574af94d79233adeda08bb15c47d5aa9ff3a4" exitCode=0 Mar 18 09:06:03 crc kubenswrapper[4917]: I0318 09:06:03.098638 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-fr5st" event={"ID":"4eeb8e32-2533-486b-91e4-d733eaafb70d","Type":"ContainerDied","Data":"c763888a30b0ead848dcecfeefb574af94d79233adeda08bb15c47d5aa9ff3a4"} Mar 18 09:06:04 crc kubenswrapper[4917]: I0318 09:06:04.464379 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-fr5st" Mar 18 09:06:04 crc kubenswrapper[4917]: I0318 09:06:04.647526 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhxfh\" (UniqueName: \"kubernetes.io/projected/4eeb8e32-2533-486b-91e4-d733eaafb70d-kube-api-access-rhxfh\") pod \"4eeb8e32-2533-486b-91e4-d733eaafb70d\" (UID: \"4eeb8e32-2533-486b-91e4-d733eaafb70d\") " Mar 18 09:06:04 crc kubenswrapper[4917]: I0318 09:06:04.667601 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eeb8e32-2533-486b-91e4-d733eaafb70d-kube-api-access-rhxfh" (OuterVolumeSpecName: "kube-api-access-rhxfh") pod "4eeb8e32-2533-486b-91e4-d733eaafb70d" (UID: "4eeb8e32-2533-486b-91e4-d733eaafb70d"). InnerVolumeSpecName "kube-api-access-rhxfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:06:04 crc kubenswrapper[4917]: I0318 09:06:04.749765 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhxfh\" (UniqueName: \"kubernetes.io/projected/4eeb8e32-2533-486b-91e4-d733eaafb70d-kube-api-access-rhxfh\") on node \"crc\" DevicePath \"\"" Mar 18 09:06:05 crc kubenswrapper[4917]: I0318 09:06:05.120004 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563746-fr5st" event={"ID":"4eeb8e32-2533-486b-91e4-d733eaafb70d","Type":"ContainerDied","Data":"e6787dea5aa09aadcb66cfb8b968fd0bd91939b537c62220da4e1119bd18558b"} Mar 18 09:06:05 crc kubenswrapper[4917]: I0318 09:06:05.120039 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6787dea5aa09aadcb66cfb8b968fd0bd91939b537c62220da4e1119bd18558b" Mar 18 09:06:05 crc kubenswrapper[4917]: I0318 09:06:05.120090 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563746-fr5st" Mar 18 09:06:05 crc kubenswrapper[4917]: I0318 09:06:05.585203 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563740-klzkp"] Mar 18 09:06:05 crc kubenswrapper[4917]: I0318 09:06:05.600993 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563740-klzkp"] Mar 18 09:06:05 crc kubenswrapper[4917]: I0318 09:06:05.787242 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d28518f-85ba-4cf0-adc4-f72e31386ad3" path="/var/lib/kubelet/pods/4d28518f-85ba-4cf0-adc4-f72e31386ad3/volumes" Mar 18 09:06:11 crc kubenswrapper[4917]: I0318 09:06:11.774802 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:06:11 crc kubenswrapper[4917]: E0318 09:06:11.775623 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:06:16 crc kubenswrapper[4917]: I0318 09:06:16.309187 4917 scope.go:117] "RemoveContainer" containerID="14c61bb9a27dbd77cb16aa0f74c3ac9f18c1a61d10be01982aa4a2710c585da0" Mar 18 09:06:25 crc kubenswrapper[4917]: I0318 09:06:25.779934 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:06:25 crc kubenswrapper[4917]: E0318 09:06:25.780844 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:06:37 crc kubenswrapper[4917]: I0318 09:06:37.772864 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:06:37 crc kubenswrapper[4917]: E0318 09:06:37.773870 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:06:49 crc kubenswrapper[4917]: I0318 09:06:49.773001 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:06:49 crc kubenswrapper[4917]: E0318 09:06:49.773988 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:07:04 crc kubenswrapper[4917]: I0318 09:07:04.772906 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:07:04 crc kubenswrapper[4917]: E0318 09:07:04.776397 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:07:15 crc kubenswrapper[4917]: I0318 09:07:15.786862 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:07:15 crc kubenswrapper[4917]: E0318 09:07:15.788370 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:07:29 crc kubenswrapper[4917]: I0318 09:07:29.773153 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:07:29 crc kubenswrapper[4917]: E0318 09:07:29.774041 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:07:43 crc kubenswrapper[4917]: I0318 09:07:43.772697 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:07:43 crc kubenswrapper[4917]: E0318 09:07:43.773519 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:07:54 crc kubenswrapper[4917]: I0318 09:07:54.774565 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:07:54 crc kubenswrapper[4917]: E0318 09:07:54.776464 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.162937 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563748-t66f7"] Mar 18 09:08:00 crc kubenswrapper[4917]: E0318 09:08:00.165009 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eeb8e32-2533-486b-91e4-d733eaafb70d" containerName="oc" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.165213 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eeb8e32-2533-486b-91e4-d733eaafb70d" containerName="oc" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.165537 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eeb8e32-2533-486b-91e4-d733eaafb70d" containerName="oc" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.166495 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-t66f7" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.169139 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.171086 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.176849 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.184632 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-t66f7"] Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.331839 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkpkh\" (UniqueName: \"kubernetes.io/projected/a739d2ec-8a23-4347-b084-b57bd50a6774-kube-api-access-wkpkh\") pod \"auto-csr-approver-29563748-t66f7\" (UID: \"a739d2ec-8a23-4347-b084-b57bd50a6774\") " pod="openshift-infra/auto-csr-approver-29563748-t66f7" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.434239 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkpkh\" (UniqueName: \"kubernetes.io/projected/a739d2ec-8a23-4347-b084-b57bd50a6774-kube-api-access-wkpkh\") pod \"auto-csr-approver-29563748-t66f7\" (UID: \"a739d2ec-8a23-4347-b084-b57bd50a6774\") " pod="openshift-infra/auto-csr-approver-29563748-t66f7" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.460094 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkpkh\" (UniqueName: \"kubernetes.io/projected/a739d2ec-8a23-4347-b084-b57bd50a6774-kube-api-access-wkpkh\") pod \"auto-csr-approver-29563748-t66f7\" (UID: \"a739d2ec-8a23-4347-b084-b57bd50a6774\") " 
pod="openshift-infra/auto-csr-approver-29563748-t66f7" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.488120 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-t66f7" Mar 18 09:08:00 crc kubenswrapper[4917]: I0318 09:08:00.970620 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-t66f7"] Mar 18 09:08:01 crc kubenswrapper[4917]: I0318 09:08:01.549789 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-t66f7" event={"ID":"a739d2ec-8a23-4347-b084-b57bd50a6774","Type":"ContainerStarted","Data":"d783ec28c3b33d5d7cd7a58d0561530d79094309631aceafebeb9b027eed0ac4"} Mar 18 09:08:02 crc kubenswrapper[4917]: I0318 09:08:02.564741 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-t66f7" event={"ID":"a739d2ec-8a23-4347-b084-b57bd50a6774","Type":"ContainerStarted","Data":"3fa8be2dfbdb8810d1b8408b3697f40a4ca30f403cf626a788294554a7360529"} Mar 18 09:08:02 crc kubenswrapper[4917]: I0318 09:08:02.586174 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563748-t66f7" podStartSLOduration=1.556064527 podStartE2EDuration="2.58615399s" podCreationTimestamp="2026-03-18 09:08:00 +0000 UTC" firstStartedPulling="2026-03-18 09:08:00.977575911 +0000 UTC m=+8465.918730625" lastFinishedPulling="2026-03-18 09:08:02.007665344 +0000 UTC m=+8466.948820088" observedRunningTime="2026-03-18 09:08:02.578957896 +0000 UTC m=+8467.520112650" watchObservedRunningTime="2026-03-18 09:08:02.58615399 +0000 UTC m=+8467.527308714" Mar 18 09:08:03 crc kubenswrapper[4917]: I0318 09:08:03.576711 4917 generic.go:334] "Generic (PLEG): container finished" podID="a739d2ec-8a23-4347-b084-b57bd50a6774" containerID="3fa8be2dfbdb8810d1b8408b3697f40a4ca30f403cf626a788294554a7360529" exitCode=0 Mar 18 09:08:03 crc 
kubenswrapper[4917]: I0318 09:08:03.576756 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-t66f7" event={"ID":"a739d2ec-8a23-4347-b084-b57bd50a6774","Type":"ContainerDied","Data":"3fa8be2dfbdb8810d1b8408b3697f40a4ca30f403cf626a788294554a7360529"} Mar 18 09:08:04 crc kubenswrapper[4917]: I0318 09:08:04.998291 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-t66f7" Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.145483 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkpkh\" (UniqueName: \"kubernetes.io/projected/a739d2ec-8a23-4347-b084-b57bd50a6774-kube-api-access-wkpkh\") pod \"a739d2ec-8a23-4347-b084-b57bd50a6774\" (UID: \"a739d2ec-8a23-4347-b084-b57bd50a6774\") " Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.152223 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a739d2ec-8a23-4347-b084-b57bd50a6774-kube-api-access-wkpkh" (OuterVolumeSpecName: "kube-api-access-wkpkh") pod "a739d2ec-8a23-4347-b084-b57bd50a6774" (UID: "a739d2ec-8a23-4347-b084-b57bd50a6774"). InnerVolumeSpecName "kube-api-access-wkpkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.247890 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkpkh\" (UniqueName: \"kubernetes.io/projected/a739d2ec-8a23-4347-b084-b57bd50a6774-kube-api-access-wkpkh\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.605153 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563748-t66f7" event={"ID":"a739d2ec-8a23-4347-b084-b57bd50a6774","Type":"ContainerDied","Data":"d783ec28c3b33d5d7cd7a58d0561530d79094309631aceafebeb9b027eed0ac4"} Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.605209 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d783ec28c3b33d5d7cd7a58d0561530d79094309631aceafebeb9b027eed0ac4" Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.605229 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563748-t66f7" Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.676946 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563742-7zlwq"] Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.688921 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563742-7zlwq"] Mar 18 09:08:05 crc kubenswrapper[4917]: I0318 09:08:05.786779 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fb94b62-26ed-48b5-b514-1aa766ad7265" path="/var/lib/kubelet/pods/0fb94b62-26ed-48b5-b514-1aa766ad7265/volumes" Mar 18 09:08:06 crc kubenswrapper[4917]: I0318 09:08:06.773872 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:08:06 crc kubenswrapper[4917]: E0318 09:08:06.774912 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:08:08 crc kubenswrapper[4917]: I0318 09:08:08.651469 4917 generic.go:334] "Generic (PLEG): container finished" podID="68b25068-0099-413b-8a35-d218abba4a8c" containerID="b58887cb3dc11904d0a88979bacbc9ed68ebbcbdfaf99dc52e972df4928897c4" exitCode=0 Mar 18 09:08:08 crc kubenswrapper[4917]: I0318 09:08:08.651570 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" event={"ID":"68b25068-0099-413b-8a35-d218abba4a8c","Type":"ContainerDied","Data":"b58887cb3dc11904d0a88979bacbc9ed68ebbcbdfaf99dc52e972df4928897c4"} Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.219736 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.391545 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-combined-ca-bundle\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.391699 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-1\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.391808 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-1\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.391851 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-0\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.391933 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-0\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 
crc kubenswrapper[4917]: I0318 09:08:10.392075 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-2\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.392123 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-inventory\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.392171 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-ssh-key-openstack-cell1\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.392279 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/68b25068-0099-413b-8a35-d218abba4a8c-nova-cells-global-config-0\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.392334 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-3\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.392369 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vcvv\" 
(UniqueName: \"kubernetes.io/projected/68b25068-0099-413b-8a35-d218abba4a8c-kube-api-access-8vcvv\") pod \"68b25068-0099-413b-8a35-d218abba4a8c\" (UID: \"68b25068-0099-413b-8a35-d218abba4a8c\") " Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.400310 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.417270 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b25068-0099-413b-8a35-d218abba4a8c-kube-api-access-8vcvv" (OuterVolumeSpecName: "kube-api-access-8vcvv") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "kube-api-access-8vcvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.431469 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.431931 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). 
InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.434593 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.434957 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68b25068-0099-413b-8a35-d218abba4a8c-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.444879 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.445890 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.448532 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-inventory" (OuterVolumeSpecName: "inventory") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.455790 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.457692 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "68b25068-0099-413b-8a35-d218abba4a8c" (UID: "68b25068-0099-413b-8a35-d218abba4a8c"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495500 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495546 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495567 4917 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495609 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495630 4917 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-inventory\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495650 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495670 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/68b25068-0099-413b-8a35-d218abba4a8c-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495689 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495709 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vcvv\" (UniqueName: \"kubernetes.io/projected/68b25068-0099-413b-8a35-d218abba4a8c-kube-api-access-8vcvv\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495728 4917 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.495748 4917 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/68b25068-0099-413b-8a35-d218abba4a8c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.684211 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" event={"ID":"68b25068-0099-413b-8a35-d218abba4a8c","Type":"ContainerDied","Data":"c638f4e7c4ac2a97db57f46c3d9dc674b5d7ca0bad8b024ac47606b4938728d5"} Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.684292 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w" Mar 18 09:08:10 crc kubenswrapper[4917]: I0318 09:08:10.684299 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c638f4e7c4ac2a97db57f46c3d9dc674b5d7ca0bad8b024ac47606b4938728d5" Mar 18 09:08:16 crc kubenswrapper[4917]: I0318 09:08:16.470175 4917 scope.go:117] "RemoveContainer" containerID="fc52184b6832c596b2d8c941cf28507e2eda437800fe6abd54f59bf47ad3076a" Mar 18 09:08:19 crc kubenswrapper[4917]: I0318 09:08:19.772351 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:08:19 crc kubenswrapper[4917]: E0318 09:08:19.772992 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:08:23 crc kubenswrapper[4917]: I0318 09:08:23.900078 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jddmh"] Mar 18 09:08:23 crc kubenswrapper[4917]: E0318 09:08:23.901111 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a739d2ec-8a23-4347-b084-b57bd50a6774" containerName="oc" Mar 18 09:08:23 crc kubenswrapper[4917]: I0318 09:08:23.901125 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a739d2ec-8a23-4347-b084-b57bd50a6774" containerName="oc" Mar 18 09:08:23 crc kubenswrapper[4917]: E0318 09:08:23.901141 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b25068-0099-413b-8a35-d218abba4a8c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 18 09:08:23 crc 
kubenswrapper[4917]: I0318 09:08:23.901148 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b25068-0099-413b-8a35-d218abba4a8c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 18 09:08:23 crc kubenswrapper[4917]: I0318 09:08:23.901337 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a739d2ec-8a23-4347-b084-b57bd50a6774" containerName="oc" Mar 18 09:08:23 crc kubenswrapper[4917]: I0318 09:08:23.901359 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b25068-0099-413b-8a35-d218abba4a8c" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 18 09:08:23 crc kubenswrapper[4917]: I0318 09:08:23.902766 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:23 crc kubenswrapper[4917]: I0318 09:08:23.914094 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jddmh"] Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.016908 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77k2x\" (UniqueName: \"kubernetes.io/projected/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-kube-api-access-77k2x\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.017282 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-catalog-content\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.017367 4917 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-utilities\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.119341 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-utilities\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.119429 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77k2x\" (UniqueName: \"kubernetes.io/projected/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-kube-api-access-77k2x\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.119502 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-catalog-content\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.119932 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-utilities\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.119962 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-catalog-content\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.141153 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77k2x\" (UniqueName: \"kubernetes.io/projected/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-kube-api-access-77k2x\") pod \"redhat-marketplace-jddmh\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.240323 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.718998 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jddmh"] Mar 18 09:08:24 crc kubenswrapper[4917]: I0318 09:08:24.830692 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jddmh" event={"ID":"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb","Type":"ContainerStarted","Data":"767d1fd98333aa06b2892343f02bc2c160cc3d59bc4b9408f34462dc2deedd93"} Mar 18 09:08:25 crc kubenswrapper[4917]: I0318 09:08:25.842730 4917 generic.go:334] "Generic (PLEG): container finished" podID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerID="5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a" exitCode=0 Mar 18 09:08:25 crc kubenswrapper[4917]: I0318 09:08:25.842809 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jddmh" event={"ID":"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb","Type":"ContainerDied","Data":"5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a"} Mar 18 09:08:26 crc kubenswrapper[4917]: I0318 09:08:26.856243 4917 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jddmh" event={"ID":"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb","Type":"ContainerStarted","Data":"deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af"} Mar 18 09:08:27 crc kubenswrapper[4917]: I0318 09:08:27.875254 4917 generic.go:334] "Generic (PLEG): container finished" podID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerID="deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af" exitCode=0 Mar 18 09:08:27 crc kubenswrapper[4917]: I0318 09:08:27.875317 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jddmh" event={"ID":"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb","Type":"ContainerDied","Data":"deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af"} Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.091472 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zm9lb"] Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.096508 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.109254 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zm9lb"] Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.234571 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkjts\" (UniqueName: \"kubernetes.io/projected/c092598c-1de0-415f-ad43-ccb67d28a488-kube-api-access-mkjts\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.234797 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-catalog-content\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.235036 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-utilities\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.336705 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkjts\" (UniqueName: \"kubernetes.io/projected/c092598c-1de0-415f-ad43-ccb67d28a488-kube-api-access-mkjts\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.336774 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-catalog-content\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.336849 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-utilities\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.337362 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-catalog-content\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.337382 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-utilities\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.360415 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkjts\" (UniqueName: \"kubernetes.io/projected/c092598c-1de0-415f-ad43-ccb67d28a488-kube-api-access-mkjts\") pod \"community-operators-zm9lb\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.460132 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.885308 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jddmh" event={"ID":"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb","Type":"ContainerStarted","Data":"c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce"} Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.975362 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jddmh" podStartSLOduration=3.420626453 podStartE2EDuration="5.975341916s" podCreationTimestamp="2026-03-18 09:08:23 +0000 UTC" firstStartedPulling="2026-03-18 09:08:25.844647137 +0000 UTC m=+8490.785801851" lastFinishedPulling="2026-03-18 09:08:28.39936261 +0000 UTC m=+8493.340517314" observedRunningTime="2026-03-18 09:08:28.90496887 +0000 UTC m=+8493.846123584" watchObservedRunningTime="2026-03-18 09:08:28.975341916 +0000 UTC m=+8493.916496630" Mar 18 09:08:28 crc kubenswrapper[4917]: I0318 09:08:28.982291 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zm9lb"] Mar 18 09:08:28 crc kubenswrapper[4917]: W0318 09:08:28.996393 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc092598c_1de0_415f_ad43_ccb67d28a488.slice/crio-6f35ebcd3065a9d3f28e09afb2122d2fcc0db95737eeeac77805864ba15849c9 WatchSource:0}: Error finding container 6f35ebcd3065a9d3f28e09afb2122d2fcc0db95737eeeac77805864ba15849c9: Status 404 returned error can't find the container with id 6f35ebcd3065a9d3f28e09afb2122d2fcc0db95737eeeac77805864ba15849c9 Mar 18 09:08:29 crc kubenswrapper[4917]: I0318 09:08:29.899542 4917 generic.go:334] "Generic (PLEG): container finished" podID="c092598c-1de0-415f-ad43-ccb67d28a488" containerID="bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de" 
exitCode=0 Mar 18 09:08:29 crc kubenswrapper[4917]: I0318 09:08:29.899663 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm9lb" event={"ID":"c092598c-1de0-415f-ad43-ccb67d28a488","Type":"ContainerDied","Data":"bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de"} Mar 18 09:08:29 crc kubenswrapper[4917]: I0318 09:08:29.900033 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm9lb" event={"ID":"c092598c-1de0-415f-ad43-ccb67d28a488","Type":"ContainerStarted","Data":"6f35ebcd3065a9d3f28e09afb2122d2fcc0db95737eeeac77805864ba15849c9"} Mar 18 09:08:30 crc kubenswrapper[4917]: I0318 09:08:30.773234 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:08:30 crc kubenswrapper[4917]: E0318 09:08:30.773803 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:08:31 crc kubenswrapper[4917]: I0318 09:08:31.925271 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm9lb" event={"ID":"c092598c-1de0-415f-ad43-ccb67d28a488","Type":"ContainerStarted","Data":"9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a"} Mar 18 09:08:32 crc kubenswrapper[4917]: I0318 09:08:32.941283 4917 generic.go:334] "Generic (PLEG): container finished" podID="c092598c-1de0-415f-ad43-ccb67d28a488" containerID="9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a" exitCode=0 Mar 18 09:08:32 crc kubenswrapper[4917]: I0318 09:08:32.941707 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm9lb" event={"ID":"c092598c-1de0-415f-ad43-ccb67d28a488","Type":"ContainerDied","Data":"9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a"} Mar 18 09:08:33 crc kubenswrapper[4917]: I0318 09:08:33.958497 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm9lb" event={"ID":"c092598c-1de0-415f-ad43-ccb67d28a488","Type":"ContainerStarted","Data":"933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4"} Mar 18 09:08:33 crc kubenswrapper[4917]: I0318 09:08:33.996651 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zm9lb" podStartSLOduration=2.553678085 podStartE2EDuration="5.996634934s" podCreationTimestamp="2026-03-18 09:08:28 +0000 UTC" firstStartedPulling="2026-03-18 09:08:29.90266187 +0000 UTC m=+8494.843816624" lastFinishedPulling="2026-03-18 09:08:33.345618749 +0000 UTC m=+8498.286773473" observedRunningTime="2026-03-18 09:08:33.991223083 +0000 UTC m=+8498.932377847" watchObservedRunningTime="2026-03-18 09:08:33.996634934 +0000 UTC m=+8498.937789648" Mar 18 09:08:34 crc kubenswrapper[4917]: I0318 09:08:34.240789 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:34 crc kubenswrapper[4917]: I0318 09:08:34.240846 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:34 crc kubenswrapper[4917]: I0318 09:08:34.327115 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:35 crc kubenswrapper[4917]: I0318 09:08:35.019588 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:36 crc kubenswrapper[4917]: I0318 
09:08:36.471953 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jddmh"] Mar 18 09:08:36 crc kubenswrapper[4917]: I0318 09:08:36.984678 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jddmh" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerName="registry-server" containerID="cri-o://c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce" gracePeriod=2 Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.477196 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.671246 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-utilities\") pod \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.671836 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77k2x\" (UniqueName: \"kubernetes.io/projected/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-kube-api-access-77k2x\") pod \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.671946 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-catalog-content\") pod \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\" (UID: \"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb\") " Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.672063 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-utilities" (OuterVolumeSpecName: 
"utilities") pod "3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" (UID: "3c200cc5-6c15-49b3-89d9-1c77ba9e99bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.672452 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.685829 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-kube-api-access-77k2x" (OuterVolumeSpecName: "kube-api-access-77k2x") pod "3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" (UID: "3c200cc5-6c15-49b3-89d9-1c77ba9e99bb"). InnerVolumeSpecName "kube-api-access-77k2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.697649 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" (UID: "3c200cc5-6c15-49b3-89d9-1c77ba9e99bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.774632 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77k2x\" (UniqueName: \"kubernetes.io/projected/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-kube-api-access-77k2x\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.774684 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.996158 4917 generic.go:334] "Generic (PLEG): container finished" podID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerID="c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce" exitCode=0 Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.996200 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jddmh" event={"ID":"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb","Type":"ContainerDied","Data":"c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce"} Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.996226 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jddmh" event={"ID":"3c200cc5-6c15-49b3-89d9-1c77ba9e99bb","Type":"ContainerDied","Data":"767d1fd98333aa06b2892343f02bc2c160cc3d59bc4b9408f34462dc2deedd93"} Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.996243 4917 scope.go:117] "RemoveContainer" containerID="c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce" Mar 18 09:08:37 crc kubenswrapper[4917]: I0318 09:08:37.996369 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jddmh" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.026826 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jddmh"] Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.032060 4917 scope.go:117] "RemoveContainer" containerID="deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.039288 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jddmh"] Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.057097 4917 scope.go:117] "RemoveContainer" containerID="5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.141966 4917 scope.go:117] "RemoveContainer" containerID="c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce" Mar 18 09:08:38 crc kubenswrapper[4917]: E0318 09:08:38.142420 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce\": container with ID starting with c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce not found: ID does not exist" containerID="c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.142480 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce"} err="failed to get container status \"c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce\": rpc error: code = NotFound desc = could not find container \"c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce\": container with ID starting with c04b957fd985f753c0344d262e600cab90866e8aa539da13ef12adc38f736bce not found: 
ID does not exist" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.142525 4917 scope.go:117] "RemoveContainer" containerID="deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af" Mar 18 09:08:38 crc kubenswrapper[4917]: E0318 09:08:38.143519 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af\": container with ID starting with deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af not found: ID does not exist" containerID="deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.143596 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af"} err="failed to get container status \"deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af\": rpc error: code = NotFound desc = could not find container \"deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af\": container with ID starting with deeb8a7342d7d67cfb443dc03ea37a0c4f397dc8ed72dd182d60ff7eca7064af not found: ID does not exist" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.143621 4917 scope.go:117] "RemoveContainer" containerID="5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a" Mar 18 09:08:38 crc kubenswrapper[4917]: E0318 09:08:38.144037 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a\": container with ID starting with 5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a not found: ID does not exist" containerID="5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.144083 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a"} err="failed to get container status \"5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a\": rpc error: code = NotFound desc = could not find container \"5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a\": container with ID starting with 5ddf8106102cb105f6fe9f8a94c26edc28f5ef2cd7bad866cff7f2c6a55b364a not found: ID does not exist" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.460546 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.461169 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:38 crc kubenswrapper[4917]: I0318 09:08:38.511996 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:39 crc kubenswrapper[4917]: I0318 09:08:39.077778 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:39 crc kubenswrapper[4917]: I0318 09:08:39.788329 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" path="/var/lib/kubelet/pods/3c200cc5-6c15-49b3-89d9-1c77ba9e99bb/volumes" Mar 18 09:08:40 crc kubenswrapper[4917]: I0318 09:08:40.874133 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zm9lb"] Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.054483 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zm9lb" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" containerName="registry-server" 
containerID="cri-o://933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4" gracePeriod=2 Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.551722 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.680260 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkjts\" (UniqueName: \"kubernetes.io/projected/c092598c-1de0-415f-ad43-ccb67d28a488-kube-api-access-mkjts\") pod \"c092598c-1de0-415f-ad43-ccb67d28a488\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.680586 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-catalog-content\") pod \"c092598c-1de0-415f-ad43-ccb67d28a488\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.680684 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-utilities\") pod \"c092598c-1de0-415f-ad43-ccb67d28a488\" (UID: \"c092598c-1de0-415f-ad43-ccb67d28a488\") " Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.682236 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-utilities" (OuterVolumeSpecName: "utilities") pod "c092598c-1de0-415f-ad43-ccb67d28a488" (UID: "c092598c-1de0-415f-ad43-ccb67d28a488"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.692751 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c092598c-1de0-415f-ad43-ccb67d28a488-kube-api-access-mkjts" (OuterVolumeSpecName: "kube-api-access-mkjts") pod "c092598c-1de0-415f-ad43-ccb67d28a488" (UID: "c092598c-1de0-415f-ad43-ccb67d28a488"). InnerVolumeSpecName "kube-api-access-mkjts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.745624 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c092598c-1de0-415f-ad43-ccb67d28a488" (UID: "c092598c-1de0-415f-ad43-ccb67d28a488"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.783545 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkjts\" (UniqueName: \"kubernetes.io/projected/c092598c-1de0-415f-ad43-ccb67d28a488-kube-api-access-mkjts\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.783577 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:42 crc kubenswrapper[4917]: I0318 09:08:42.783601 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c092598c-1de0-415f-ad43-ccb67d28a488-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.077136 4917 generic.go:334] "Generic (PLEG): container finished" podID="c092598c-1de0-415f-ad43-ccb67d28a488" 
containerID="933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4" exitCode=0 Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.077192 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm9lb" event={"ID":"c092598c-1de0-415f-ad43-ccb67d28a488","Type":"ContainerDied","Data":"933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4"} Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.077221 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zm9lb" event={"ID":"c092598c-1de0-415f-ad43-ccb67d28a488","Type":"ContainerDied","Data":"6f35ebcd3065a9d3f28e09afb2122d2fcc0db95737eeeac77805864ba15849c9"} Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.077252 4917 scope.go:117] "RemoveContainer" containerID="933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.078038 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zm9lb" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.107142 4917 scope.go:117] "RemoveContainer" containerID="9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.146414 4917 scope.go:117] "RemoveContainer" containerID="bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.149247 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zm9lb"] Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.157622 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zm9lb"] Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.207001 4917 scope.go:117] "RemoveContainer" containerID="933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4" Mar 18 09:08:43 crc kubenswrapper[4917]: E0318 09:08:43.207362 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4\": container with ID starting with 933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4 not found: ID does not exist" containerID="933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.207392 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4"} err="failed to get container status \"933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4\": rpc error: code = NotFound desc = could not find container \"933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4\": container with ID starting with 933faeec4bd111b6025b8abd2ac77e65ded2eb3092781d30e707e8f46eef8bc4 not 
found: ID does not exist" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.207433 4917 scope.go:117] "RemoveContainer" containerID="9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a" Mar 18 09:08:43 crc kubenswrapper[4917]: E0318 09:08:43.207631 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a\": container with ID starting with 9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a not found: ID does not exist" containerID="9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.207671 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a"} err="failed to get container status \"9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a\": rpc error: code = NotFound desc = could not find container \"9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a\": container with ID starting with 9702525de8b2f8c733e58040e0e2116eb864680b1547cf13e3bd511fa82c940a not found: ID does not exist" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.207686 4917 scope.go:117] "RemoveContainer" containerID="bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de" Mar 18 09:08:43 crc kubenswrapper[4917]: E0318 09:08:43.207873 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de\": container with ID starting with bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de not found: ID does not exist" containerID="bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.207893 4917 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de"} err="failed to get container status \"bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de\": rpc error: code = NotFound desc = could not find container \"bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de\": container with ID starting with bcd658a6b02ebecf8c691417537e913c5151fb7ab2342f6edda96249bdfa83de not found: ID does not exist" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.772905 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:08:43 crc kubenswrapper[4917]: E0318 09:08:43.773923 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:08:43 crc kubenswrapper[4917]: I0318 09:08:43.792149 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" path="/var/lib/kubelet/pods/c092598c-1de0-415f-ad43-ccb67d28a488/volumes" Mar 18 09:08:58 crc kubenswrapper[4917]: I0318 09:08:58.772330 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:08:58 crc kubenswrapper[4917]: E0318 09:08:58.773185 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:09:08 crc kubenswrapper[4917]: E0318 09:09:08.978093 4917 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.184:40794->38.102.83.184:33891: write tcp 38.102.83.184:40794->38.102.83.184:33891: write: broken pipe Mar 18 09:09:10 crc kubenswrapper[4917]: I0318 09:09:10.773162 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:09:10 crc kubenswrapper[4917]: E0318 09:09:10.774400 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:09:25 crc kubenswrapper[4917]: I0318 09:09:25.779788 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:09:25 crc kubenswrapper[4917]: E0318 09:09:25.780446 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:09:40 crc kubenswrapper[4917]: I0318 09:09:40.772541 4917 scope.go:117] "RemoveContainer" 
containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:09:41 crc kubenswrapper[4917]: I0318 09:09:41.753886 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"67765209a4f153d01c0e93247cdedd237f55738ba9f38d37cf02a4ba6359a70d"} Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.156828 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563750-6sqbc"] Mar 18 09:10:00 crc kubenswrapper[4917]: E0318 09:10:00.157820 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" containerName="registry-server" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.157833 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" containerName="registry-server" Mar 18 09:10:00 crc kubenswrapper[4917]: E0318 09:10:00.157857 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" containerName="extract-content" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.157863 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" containerName="extract-content" Mar 18 09:10:00 crc kubenswrapper[4917]: E0318 09:10:00.157874 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerName="extract-utilities" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.157880 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerName="extract-utilities" Mar 18 09:10:00 crc kubenswrapper[4917]: E0318 09:10:00.157895 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerName="registry-server" Mar 18 
09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.157900 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerName="registry-server" Mar 18 09:10:00 crc kubenswrapper[4917]: E0318 09:10:00.157913 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" containerName="extract-utilities" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.157920 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" containerName="extract-utilities" Mar 18 09:10:00 crc kubenswrapper[4917]: E0318 09:10:00.157935 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerName="extract-content" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.157941 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerName="extract-content" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.158120 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c092598c-1de0-415f-ad43-ccb67d28a488" containerName="registry-server" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.158141 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c200cc5-6c15-49b3-89d9-1c77ba9e99bb" containerName="registry-server" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.158840 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-6sqbc" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.160885 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.161426 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.161652 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.171071 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-6sqbc"] Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.322280 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x7nt\" (UniqueName: \"kubernetes.io/projected/d621211b-6956-4999-b866-77cfc9b5daf7-kube-api-access-5x7nt\") pod \"auto-csr-approver-29563750-6sqbc\" (UID: \"d621211b-6956-4999-b866-77cfc9b5daf7\") " pod="openshift-infra/auto-csr-approver-29563750-6sqbc" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.424480 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x7nt\" (UniqueName: \"kubernetes.io/projected/d621211b-6956-4999-b866-77cfc9b5daf7-kube-api-access-5x7nt\") pod \"auto-csr-approver-29563750-6sqbc\" (UID: \"d621211b-6956-4999-b866-77cfc9b5daf7\") " pod="openshift-infra/auto-csr-approver-29563750-6sqbc" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.446920 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x7nt\" (UniqueName: \"kubernetes.io/projected/d621211b-6956-4999-b866-77cfc9b5daf7-kube-api-access-5x7nt\") pod \"auto-csr-approver-29563750-6sqbc\" (UID: \"d621211b-6956-4999-b866-77cfc9b5daf7\") " 
pod="openshift-infra/auto-csr-approver-29563750-6sqbc" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.484865 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-6sqbc" Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.970045 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.990471 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-6sqbc"] Mar 18 09:10:00 crc kubenswrapper[4917]: I0318 09:10:00.999055 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-6sqbc" event={"ID":"d621211b-6956-4999-b866-77cfc9b5daf7","Type":"ContainerStarted","Data":"22c4ed5be32bcfbd9777fbcf80bd457f135f73442c499e26ca8f9ac8958f4ceb"} Mar 18 09:10:03 crc kubenswrapper[4917]: I0318 09:10:03.018343 4917 generic.go:334] "Generic (PLEG): container finished" podID="d621211b-6956-4999-b866-77cfc9b5daf7" containerID="7cd285717773f60f5b9f90204f0d37c67408b622a06305ab5e34c0b3606b1b31" exitCode=0 Mar 18 09:10:03 crc kubenswrapper[4917]: I0318 09:10:03.018837 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-6sqbc" event={"ID":"d621211b-6956-4999-b866-77cfc9b5daf7","Type":"ContainerDied","Data":"7cd285717773f60f5b9f90204f0d37c67408b622a06305ab5e34c0b3606b1b31"} Mar 18 09:10:04 crc kubenswrapper[4917]: I0318 09:10:04.479123 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-6sqbc" Mar 18 09:10:04 crc kubenswrapper[4917]: I0318 09:10:04.616477 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x7nt\" (UniqueName: \"kubernetes.io/projected/d621211b-6956-4999-b866-77cfc9b5daf7-kube-api-access-5x7nt\") pod \"d621211b-6956-4999-b866-77cfc9b5daf7\" (UID: \"d621211b-6956-4999-b866-77cfc9b5daf7\") " Mar 18 09:10:04 crc kubenswrapper[4917]: I0318 09:10:04.623909 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d621211b-6956-4999-b866-77cfc9b5daf7-kube-api-access-5x7nt" (OuterVolumeSpecName: "kube-api-access-5x7nt") pod "d621211b-6956-4999-b866-77cfc9b5daf7" (UID: "d621211b-6956-4999-b866-77cfc9b5daf7"). InnerVolumeSpecName "kube-api-access-5x7nt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:10:04 crc kubenswrapper[4917]: I0318 09:10:04.719390 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x7nt\" (UniqueName: \"kubernetes.io/projected/d621211b-6956-4999-b866-77cfc9b5daf7-kube-api-access-5x7nt\") on node \"crc\" DevicePath \"\"" Mar 18 09:10:05 crc kubenswrapper[4917]: I0318 09:10:05.039154 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563750-6sqbc" event={"ID":"d621211b-6956-4999-b866-77cfc9b5daf7","Type":"ContainerDied","Data":"22c4ed5be32bcfbd9777fbcf80bd457f135f73442c499e26ca8f9ac8958f4ceb"} Mar 18 09:10:05 crc kubenswrapper[4917]: I0318 09:10:05.039516 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22c4ed5be32bcfbd9777fbcf80bd457f135f73442c499e26ca8f9ac8958f4ceb" Mar 18 09:10:05 crc kubenswrapper[4917]: I0318 09:10:05.039212 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563750-6sqbc" Mar 18 09:10:05 crc kubenswrapper[4917]: I0318 09:10:05.552242 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-69l8h"] Mar 18 09:10:05 crc kubenswrapper[4917]: I0318 09:10:05.565467 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563744-69l8h"] Mar 18 09:10:05 crc kubenswrapper[4917]: I0318 09:10:05.753280 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 09:10:05 crc kubenswrapper[4917]: I0318 09:10:05.753564 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="8b77e881-8795-4449-bedf-625bdc184ff2" containerName="adoption" containerID="cri-o://572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559" gracePeriod=30 Mar 18 09:10:05 crc kubenswrapper[4917]: I0318 09:10:05.788637 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4235a41-9c05-481f-a1b6-2b95d083027f" path="/var/lib/kubelet/pods/a4235a41-9c05-481f-a1b6-2b95d083027f/volumes" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.571107 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d7c8w"] Mar 18 09:10:14 crc kubenswrapper[4917]: E0318 09:10:14.572402 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d621211b-6956-4999-b866-77cfc9b5daf7" containerName="oc" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.572427 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d621211b-6956-4999-b866-77cfc9b5daf7" containerName="oc" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.572786 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d621211b-6956-4999-b866-77cfc9b5daf7" containerName="oc" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.575003 4917 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.613158 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d7c8w"] Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.737860 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-catalog-content\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.737917 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmcgb\" (UniqueName: \"kubernetes.io/projected/796f3eb8-4dde-46fb-b922-655cc9269e7a-kube-api-access-bmcgb\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.738201 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-utilities\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.839781 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmcgb\" (UniqueName: \"kubernetes.io/projected/796f3eb8-4dde-46fb-b922-655cc9269e7a-kube-api-access-bmcgb\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.839900 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-utilities\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.840029 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-catalog-content\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.840502 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-catalog-content\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.841115 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-utilities\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.860726 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmcgb\" (UniqueName: \"kubernetes.io/projected/796f3eb8-4dde-46fb-b922-655cc9269e7a-kube-api-access-bmcgb\") pod \"redhat-operators-d7c8w\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:14 crc kubenswrapper[4917]: I0318 09:10:14.915600 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:15 crc kubenswrapper[4917]: I0318 09:10:15.396819 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d7c8w"] Mar 18 09:10:16 crc kubenswrapper[4917]: I0318 09:10:16.173199 4917 generic.go:334] "Generic (PLEG): container finished" podID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerID="4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455" exitCode=0 Mar 18 09:10:16 crc kubenswrapper[4917]: I0318 09:10:16.173258 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7c8w" event={"ID":"796f3eb8-4dde-46fb-b922-655cc9269e7a","Type":"ContainerDied","Data":"4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455"} Mar 18 09:10:16 crc kubenswrapper[4917]: I0318 09:10:16.173487 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7c8w" event={"ID":"796f3eb8-4dde-46fb-b922-655cc9269e7a","Type":"ContainerStarted","Data":"c418c53e0aa8d04202efa43dc7bb2805e975496cc281d38e4302c6f653b7eb2e"} Mar 18 09:10:16 crc kubenswrapper[4917]: I0318 09:10:16.646929 4917 scope.go:117] "RemoveContainer" containerID="ca630a3de1c7610996d21582a17f3b5e3d2a0ebdccfb3b752865b73ccdcdc930" Mar 18 09:10:17 crc kubenswrapper[4917]: I0318 09:10:17.190087 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7c8w" event={"ID":"796f3eb8-4dde-46fb-b922-655cc9269e7a","Type":"ContainerStarted","Data":"feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149"} Mar 18 09:10:20 crc kubenswrapper[4917]: I0318 09:10:20.229518 4917 generic.go:334] "Generic (PLEG): container finished" podID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerID="feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149" exitCode=0 Mar 18 09:10:20 crc kubenswrapper[4917]: I0318 09:10:20.229669 4917 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-d7c8w" event={"ID":"796f3eb8-4dde-46fb-b922-655cc9269e7a","Type":"ContainerDied","Data":"feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149"} Mar 18 09:10:21 crc kubenswrapper[4917]: I0318 09:10:21.243062 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7c8w" event={"ID":"796f3eb8-4dde-46fb-b922-655cc9269e7a","Type":"ContainerStarted","Data":"314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9"} Mar 18 09:10:21 crc kubenswrapper[4917]: I0318 09:10:21.280257 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d7c8w" podStartSLOduration=2.7369078719999997 podStartE2EDuration="7.280239632s" podCreationTimestamp="2026-03-18 09:10:14 +0000 UTC" firstStartedPulling="2026-03-18 09:10:16.17513932 +0000 UTC m=+8601.116294024" lastFinishedPulling="2026-03-18 09:10:20.71847103 +0000 UTC m=+8605.659625784" observedRunningTime="2026-03-18 09:10:21.268509757 +0000 UTC m=+8606.209664471" watchObservedRunningTime="2026-03-18 09:10:21.280239632 +0000 UTC m=+8606.221394346" Mar 18 09:10:24 crc kubenswrapper[4917]: I0318 09:10:24.916531 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:24 crc kubenswrapper[4917]: I0318 09:10:24.917259 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:25 crc kubenswrapper[4917]: I0318 09:10:25.984879 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d7c8w" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="registry-server" probeResult="failure" output=< Mar 18 09:10:25 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 09:10:25 crc kubenswrapper[4917]: > Mar 18 09:10:35 crc 
kubenswrapper[4917]: I0318 09:10:35.047130 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:35 crc kubenswrapper[4917]: I0318 09:10:35.092215 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:35 crc kubenswrapper[4917]: I0318 09:10:35.284025 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d7c8w"] Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.287031 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.311420 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\") pod \"8b77e881-8795-4449-bedf-625bdc184ff2\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") " Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.311958 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znh9b\" (UniqueName: \"kubernetes.io/projected/8b77e881-8795-4449-bedf-625bdc184ff2-kube-api-access-znh9b\") pod \"8b77e881-8795-4449-bedf-625bdc184ff2\" (UID: \"8b77e881-8795-4449-bedf-625bdc184ff2\") " Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.350794 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b77e881-8795-4449-bedf-625bdc184ff2-kube-api-access-znh9b" (OuterVolumeSpecName: "kube-api-access-znh9b") pod "8b77e881-8795-4449-bedf-625bdc184ff2" (UID: "8b77e881-8795-4449-bedf-625bdc184ff2"). InnerVolumeSpecName "kube-api-access-znh9b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.365896 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6" (OuterVolumeSpecName: "mariadb-data") pod "8b77e881-8795-4449-bedf-625bdc184ff2" (UID: "8b77e881-8795-4449-bedf-625bdc184ff2"). InnerVolumeSpecName "pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.413846 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znh9b\" (UniqueName: \"kubernetes.io/projected/8b77e881-8795-4449-bedf-625bdc184ff2-kube-api-access-znh9b\") on node \"crc\" DevicePath \"\"" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.413887 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\") on node \"crc\" " Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.416211 4917 generic.go:334] "Generic (PLEG): container finished" podID="8b77e881-8795-4449-bedf-625bdc184ff2" containerID="572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559" exitCode=137 Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.416273 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.416269 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"8b77e881-8795-4449-bedf-625bdc184ff2","Type":"ContainerDied","Data":"572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559"} Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.416321 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"8b77e881-8795-4449-bedf-625bdc184ff2","Type":"ContainerDied","Data":"4ee370d0eae53aef4c597b8f76da6d72e0fd7406a13dc61fc452f98bd7715a1e"} Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.416345 4917 scope.go:117] "RemoveContainer" containerID="572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.416470 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d7c8w" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="registry-server" containerID="cri-o://314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9" gracePeriod=2 Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.440927 4917 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.441478 4917 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6") on node "crc" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.466684 4917 scope.go:117] "RemoveContainer" containerID="572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559" Mar 18 09:10:36 crc kubenswrapper[4917]: E0318 09:10:36.467118 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559\": container with ID starting with 572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559 not found: ID does not exist" containerID="572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.467154 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559"} err="failed to get container status \"572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559\": rpc error: code = NotFound desc = could not find container \"572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559\": container with ID starting with 572512b0cdbc94b7784e3e664cd4b4f4b0816a71820b8a3c301164983ef66559 not found: ID does not exist" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.468552 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.480066 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.515190 4917 reconciler_common.go:293] "Volume detached for volume 
\"pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e1197f15-dec6-4b80-8ba8-6238b94928b6\") on node \"crc\" DevicePath \"\"" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.804310 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.924172 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-catalog-content\") pod \"796f3eb8-4dde-46fb-b922-655cc9269e7a\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.924247 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmcgb\" (UniqueName: \"kubernetes.io/projected/796f3eb8-4dde-46fb-b922-655cc9269e7a-kube-api-access-bmcgb\") pod \"796f3eb8-4dde-46fb-b922-655cc9269e7a\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.924399 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-utilities\") pod \"796f3eb8-4dde-46fb-b922-655cc9269e7a\" (UID: \"796f3eb8-4dde-46fb-b922-655cc9269e7a\") " Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.925157 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-utilities" (OuterVolumeSpecName: "utilities") pod "796f3eb8-4dde-46fb-b922-655cc9269e7a" (UID: "796f3eb8-4dde-46fb-b922-655cc9269e7a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.929701 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/796f3eb8-4dde-46fb-b922-655cc9269e7a-kube-api-access-bmcgb" (OuterVolumeSpecName: "kube-api-access-bmcgb") pod "796f3eb8-4dde-46fb-b922-655cc9269e7a" (UID: "796f3eb8-4dde-46fb-b922-655cc9269e7a"). InnerVolumeSpecName "kube-api-access-bmcgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.997527 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 09:10:36 crc kubenswrapper[4917]: I0318 09:10:36.997746 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="3251a7ec-9729-4af2-9d5d-28812567f353" containerName="adoption" containerID="cri-o://72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c" gracePeriod=30 Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.027335 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmcgb\" (UniqueName: \"kubernetes.io/projected/796f3eb8-4dde-46fb-b922-655cc9269e7a-kube-api-access-bmcgb\") on node \"crc\" DevicePath \"\"" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.027374 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.053708 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "796f3eb8-4dde-46fb-b922-655cc9269e7a" (UID: "796f3eb8-4dde-46fb-b922-655cc9269e7a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.129548 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/796f3eb8-4dde-46fb-b922-655cc9269e7a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.433959 4917 generic.go:334] "Generic (PLEG): container finished" podID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerID="314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9" exitCode=0 Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.434048 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d7c8w" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.434057 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7c8w" event={"ID":"796f3eb8-4dde-46fb-b922-655cc9269e7a","Type":"ContainerDied","Data":"314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9"} Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.434256 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d7c8w" event={"ID":"796f3eb8-4dde-46fb-b922-655cc9269e7a","Type":"ContainerDied","Data":"c418c53e0aa8d04202efa43dc7bb2805e975496cc281d38e4302c6f653b7eb2e"} Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.434319 4917 scope.go:117] "RemoveContainer" containerID="314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.472967 4917 scope.go:117] "RemoveContainer" containerID="feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.508573 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d7c8w"] Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 
09:10:37.517913 4917 scope.go:117] "RemoveContainer" containerID="4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.528435 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d7c8w"] Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.577448 4917 scope.go:117] "RemoveContainer" containerID="314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9" Mar 18 09:10:37 crc kubenswrapper[4917]: E0318 09:10:37.578011 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9\": container with ID starting with 314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9 not found: ID does not exist" containerID="314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.578073 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9"} err="failed to get container status \"314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9\": rpc error: code = NotFound desc = could not find container \"314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9\": container with ID starting with 314f3f04767b193462faa8eafb287c335dcac76d5708dd650d8cef4825b8b0f9 not found: ID does not exist" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.578113 4917 scope.go:117] "RemoveContainer" containerID="feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149" Mar 18 09:10:37 crc kubenswrapper[4917]: E0318 09:10:37.578581 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149\": container with ID 
starting with feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149 not found: ID does not exist" containerID="feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.578638 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149"} err="failed to get container status \"feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149\": rpc error: code = NotFound desc = could not find container \"feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149\": container with ID starting with feab3e9b0f6da1b8e1f8d6516eb88efdeed057770a80b800919ae05aedc83149 not found: ID does not exist" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.578665 4917 scope.go:117] "RemoveContainer" containerID="4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455" Mar 18 09:10:37 crc kubenswrapper[4917]: E0318 09:10:37.579042 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455\": container with ID starting with 4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455 not found: ID does not exist" containerID="4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.579082 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455"} err="failed to get container status \"4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455\": rpc error: code = NotFound desc = could not find container \"4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455\": container with ID starting with 4528f8e926c827c48383540b4873fc8d5e8dc8d59b20b9cd82138e788d0d9455 not found: 
ID does not exist" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.786247 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" path="/var/lib/kubelet/pods/796f3eb8-4dde-46fb-b922-655cc9269e7a/volumes" Mar 18 09:10:37 crc kubenswrapper[4917]: I0318 09:10:37.787216 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b77e881-8795-4449-bedf-625bdc184ff2" path="/var/lib/kubelet/pods/8b77e881-8795-4449-bedf-625bdc184ff2/volumes" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.572352 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.710621 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr59h\" (UniqueName: \"kubernetes.io/projected/3251a7ec-9729-4af2-9d5d-28812567f353-kube-api-access-rr59h\") pod \"3251a7ec-9729-4af2-9d5d-28812567f353\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.711423 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\") pod \"3251a7ec-9729-4af2-9d5d-28812567f353\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.711782 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/3251a7ec-9729-4af2-9d5d-28812567f353-ovn-data-cert\") pod \"3251a7ec-9729-4af2-9d5d-28812567f353\" (UID: \"3251a7ec-9729-4af2-9d5d-28812567f353\") " Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.716503 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3251a7ec-9729-4af2-9d5d-28812567f353-kube-api-access-rr59h" 
(OuterVolumeSpecName: "kube-api-access-rr59h") pod "3251a7ec-9729-4af2-9d5d-28812567f353" (UID: "3251a7ec-9729-4af2-9d5d-28812567f353"). InnerVolumeSpecName "kube-api-access-rr59h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.718575 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3251a7ec-9729-4af2-9d5d-28812567f353-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "3251a7ec-9729-4af2-9d5d-28812567f353" (UID: "3251a7ec-9729-4af2-9d5d-28812567f353"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.745043 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f" (OuterVolumeSpecName: "ovn-data") pod "3251a7ec-9729-4af2-9d5d-28812567f353" (UID: "3251a7ec-9729-4af2-9d5d-28812567f353"). InnerVolumeSpecName "pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.764207 4917 generic.go:334] "Generic (PLEG): container finished" podID="3251a7ec-9729-4af2-9d5d-28812567f353" containerID="72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c" exitCode=137 Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.764256 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.764281 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"3251a7ec-9729-4af2-9d5d-28812567f353","Type":"ContainerDied","Data":"72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c"} Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.764383 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"3251a7ec-9729-4af2-9d5d-28812567f353","Type":"ContainerDied","Data":"0b1471b1edc656e7bb37d9e430fed8c9356f9f8cdbf0c209826d218fedf1a8b6"} Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.764417 4917 scope.go:117] "RemoveContainer" containerID="72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.815941 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr59h\" (UniqueName: \"kubernetes.io/projected/3251a7ec-9729-4af2-9d5d-28812567f353-kube-api-access-rr59h\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.815990 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\") on node \"crc\" " Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.816006 4917 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/3251a7ec-9729-4af2-9d5d-28812567f353-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.839080 4917 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.839364 4917 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f") on node "crc" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.871735 4917 scope.go:117] "RemoveContainer" containerID="72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c" Mar 18 09:11:07 crc kubenswrapper[4917]: E0318 09:11:07.872366 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c\": container with ID starting with 72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c not found: ID does not exist" containerID="72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.872426 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c"} err="failed to get container status \"72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c\": rpc error: code = NotFound desc = could not find container \"72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c\": container with ID starting with 72c34ac0a99b17519cd13c3a5cd7fa6bb8a7850ab1e5a6155c94a93a80024e3c not found: ID does not exist" Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.883621 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.895613 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Mar 18 09:11:07 crc kubenswrapper[4917]: I0318 09:11:07.919192 4917 reconciler_common.go:293] "Volume detached for volume \"pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-662346c9-8cd6-4f8a-b86c-50a2a372cb9f\") on node \"crc\" DevicePath \"\"" Mar 18 09:11:09 crc kubenswrapper[4917]: I0318 09:11:09.795088 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3251a7ec-9729-4af2-9d5d-28812567f353" path="/var/lib/kubelet/pods/3251a7ec-9729-4af2-9d5d-28812567f353/volumes" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.353714 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 09:11:26 crc kubenswrapper[4917]: E0318 09:11:26.354640 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="extract-content" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.354655 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="extract-content" Mar 18 09:11:26 crc kubenswrapper[4917]: E0318 09:11:26.354680 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="extract-utilities" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.354687 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="extract-utilities" Mar 18 09:11:26 crc kubenswrapper[4917]: E0318 09:11:26.354704 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b77e881-8795-4449-bedf-625bdc184ff2" containerName="adoption" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.354711 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b77e881-8795-4449-bedf-625bdc184ff2" containerName="adoption" Mar 18 09:11:26 crc kubenswrapper[4917]: E0318 09:11:26.354720 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3251a7ec-9729-4af2-9d5d-28812567f353" containerName="adoption" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.354726 4917 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3251a7ec-9729-4af2-9d5d-28812567f353" containerName="adoption" Mar 18 09:11:26 crc kubenswrapper[4917]: E0318 09:11:26.354744 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="registry-server" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.354750 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="registry-server" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.354952 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="796f3eb8-4dde-46fb-b922-655cc9269e7a" containerName="registry-server" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.354967 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3251a7ec-9729-4af2-9d5d-28812567f353" containerName="adoption" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.354975 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b77e881-8795-4449-bedf-625bdc184ff2" containerName="adoption" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.355868 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.359263 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.359303 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-c568t" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.359308 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.362233 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.369426 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530127 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530447 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530526 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530560 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-config-data\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530603 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530636 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530714 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530795 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whpfg\" (UniqueName: 
\"kubernetes.io/projected/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-kube-api-access-whpfg\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.530866 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633153 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whpfg\" (UniqueName: \"kubernetes.io/projected/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-kube-api-access-whpfg\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633293 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633360 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633435 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633537 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633613 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-config-data\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633677 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633719 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.633757 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: 
\"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.634068 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.635096 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.635622 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.636185 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-config-data\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.636604 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " 
pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.648884 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.649875 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.655993 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.657293 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whpfg\" (UniqueName: \"kubernetes.io/projected/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-kube-api-access-whpfg\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.686834 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"tempest-tests-tempest\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " pod="openstack/tempest-tests-tempest" Mar 18 09:11:26 crc kubenswrapper[4917]: I0318 09:11:26.990372 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 09:11:27 crc kubenswrapper[4917]: I0318 09:11:27.535346 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 18 09:11:28 crc kubenswrapper[4917]: I0318 09:11:28.000201 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588","Type":"ContainerStarted","Data":"b45a3928562be138f9113f4f3226fc4c82f89042b89852071cb0893df0740988"} Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.148771 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563752-qrwbn"] Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.151128 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.154788 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.155706 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.157858 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.165884 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-qrwbn"] Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.301780 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk94d\" (UniqueName: \"kubernetes.io/projected/c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4-kube-api-access-rk94d\") pod \"auto-csr-approver-29563752-qrwbn\" (UID: \"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4\") " 
pod="openshift-infra/auto-csr-approver-29563752-qrwbn" Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.404441 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk94d\" (UniqueName: \"kubernetes.io/projected/c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4-kube-api-access-rk94d\") pod \"auto-csr-approver-29563752-qrwbn\" (UID: \"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4\") " pod="openshift-infra/auto-csr-approver-29563752-qrwbn" Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.425809 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk94d\" (UniqueName: \"kubernetes.io/projected/c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4-kube-api-access-rk94d\") pod \"auto-csr-approver-29563752-qrwbn\" (UID: \"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4\") " pod="openshift-infra/auto-csr-approver-29563752-qrwbn" Mar 18 09:12:00 crc kubenswrapper[4917]: I0318 09:12:00.477638 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" Mar 18 09:12:01 crc kubenswrapper[4917]: I0318 09:12:01.012153 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-qrwbn"] Mar 18 09:12:01 crc kubenswrapper[4917]: I0318 09:12:01.381112 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" event={"ID":"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4","Type":"ContainerStarted","Data":"2bc861c18440ddc7a916853b4b0df49278c2b4e90c0a9a8b3b6be6b9aa1d9b0b"} Mar 18 09:12:02 crc kubenswrapper[4917]: I0318 09:12:02.392915 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" event={"ID":"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4","Type":"ContainerStarted","Data":"1d1fbf0de98f051412c1396f8fdecd27f562184957f96bb074b576a499bdd6f8"} Mar 18 09:12:02 crc kubenswrapper[4917]: I0318 09:12:02.410421 4917 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" podStartSLOduration=1.4078244899999999 podStartE2EDuration="2.410401509s" podCreationTimestamp="2026-03-18 09:12:00 +0000 UTC" firstStartedPulling="2026-03-18 09:12:01.016659016 +0000 UTC m=+8705.957813740" lastFinishedPulling="2026-03-18 09:12:02.019236045 +0000 UTC m=+8706.960390759" observedRunningTime="2026-03-18 09:12:02.405963762 +0000 UTC m=+8707.347118476" watchObservedRunningTime="2026-03-18 09:12:02.410401509 +0000 UTC m=+8707.351556223" Mar 18 09:12:02 crc kubenswrapper[4917]: I0318 09:12:02.929554 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:12:02 crc kubenswrapper[4917]: I0318 09:12:02.929652 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:12:03 crc kubenswrapper[4917]: I0318 09:12:03.407017 4917 generic.go:334] "Generic (PLEG): container finished" podID="c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4" containerID="1d1fbf0de98f051412c1396f8fdecd27f562184957f96bb074b576a499bdd6f8" exitCode=0 Mar 18 09:12:03 crc kubenswrapper[4917]: I0318 09:12:03.407058 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" event={"ID":"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4","Type":"ContainerDied","Data":"1d1fbf0de98f051412c1396f8fdecd27f562184957f96bb074b576a499bdd6f8"} Mar 18 09:12:24 crc kubenswrapper[4917]: E0318 09:12:24.600367 4917 log.go:32] "PullImage from 
image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:059169826d1e668c44c01b5bb9959b22" Mar 18 09:12:24 crc kubenswrapper[4917]: E0318 09:12:24.600987 4917 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:059169826d1e668c44c01b5bb9959b22" Mar 18 09:12:24 crc kubenswrapper[4917]: E0318 09:12:24.601143 4917 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:059169826d1e668c44c01b5bb9959b22,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,M
ountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-whpfg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(aea4965f-d2fe-4941-8ff8-1cf5cf9cd588): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 18 09:12:24 crc kubenswrapper[4917]: E0318 09:12:24.603225 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" Mar 18 09:12:24 crc kubenswrapper[4917]: I0318 09:12:24.645371 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" Mar 18 09:12:24 crc kubenswrapper[4917]: I0318 09:12:24.654039 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" event={"ID":"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4","Type":"ContainerDied","Data":"2bc861c18440ddc7a916853b4b0df49278c2b4e90c0a9a8b3b6be6b9aa1d9b0b"} Mar 18 09:12:24 crc kubenswrapper[4917]: I0318 09:12:24.654135 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bc861c18440ddc7a916853b4b0df49278c2b4e90c0a9a8b3b6be6b9aa1d9b0b" Mar 18 09:12:24 crc kubenswrapper[4917]: I0318 09:12:24.654053 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563752-qrwbn" Mar 18 09:12:24 crc kubenswrapper[4917]: E0318 09:12:24.656325 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:059169826d1e668c44c01b5bb9959b22\\\"\"" pod="openstack/tempest-tests-tempest" podUID="aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" Mar 18 09:12:24 crc kubenswrapper[4917]: I0318 09:12:24.794106 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk94d\" (UniqueName: \"kubernetes.io/projected/c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4-kube-api-access-rk94d\") pod \"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4\" (UID: \"c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4\") " Mar 18 09:12:24 crc kubenswrapper[4917]: I0318 09:12:24.804436 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4-kube-api-access-rk94d" (OuterVolumeSpecName: "kube-api-access-rk94d") pod "c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4" (UID: "c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4"). InnerVolumeSpecName "kube-api-access-rk94d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:12:24 crc kubenswrapper[4917]: I0318 09:12:24.897888 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk94d\" (UniqueName: \"kubernetes.io/projected/c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4-kube-api-access-rk94d\") on node \"crc\" DevicePath \"\"" Mar 18 09:12:25 crc kubenswrapper[4917]: I0318 09:12:25.735710 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-fr5st"] Mar 18 09:12:25 crc kubenswrapper[4917]: I0318 09:12:25.747648 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563746-fr5st"] Mar 18 09:12:25 crc kubenswrapper[4917]: I0318 09:12:25.787000 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eeb8e32-2533-486b-91e4-d733eaafb70d" path="/var/lib/kubelet/pods/4eeb8e32-2533-486b-91e4-d733eaafb70d/volumes" Mar 18 09:12:32 crc kubenswrapper[4917]: I0318 09:12:32.929512 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:12:32 crc kubenswrapper[4917]: I0318 09:12:32.932257 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:12:36 crc kubenswrapper[4917]: I0318 09:12:36.992635 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 18 09:12:38 crc kubenswrapper[4917]: I0318 09:12:38.823107 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588","Type":"ContainerStarted","Data":"6ff4d6c9fdd3c7ec1595a74efadc6a323f453b56577634b997b205bc4e78fbc8"} Mar 18 09:12:38 crc kubenswrapper[4917]: I0318 09:12:38.860566 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.413324695 podStartE2EDuration="1m13.860544796s" podCreationTimestamp="2026-03-18 09:11:25 +0000 UTC" firstStartedPulling="2026-03-18 09:11:27.541756956 +0000 UTC m=+8672.482911660" lastFinishedPulling="2026-03-18 09:12:36.988977037 +0000 UTC m=+8741.930131761" observedRunningTime="2026-03-18 09:12:38.845077141 +0000 UTC m=+8743.786231925" watchObservedRunningTime="2026-03-18 09:12:38.860544796 +0000 UTC m=+8743.801699520" Mar 18 09:13:02 crc kubenswrapper[4917]: I0318 09:13:02.928840 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:13:02 crc kubenswrapper[4917]: I0318 09:13:02.929478 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:13:02 crc kubenswrapper[4917]: I0318 09:13:02.929533 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 09:13:02 crc kubenswrapper[4917]: I0318 09:13:02.930178 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"67765209a4f153d01c0e93247cdedd237f55738ba9f38d37cf02a4ba6359a70d"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:13:02 crc kubenswrapper[4917]: I0318 09:13:02.930237 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://67765209a4f153d01c0e93247cdedd237f55738ba9f38d37cf02a4ba6359a70d" gracePeriod=600 Mar 18 09:13:03 crc kubenswrapper[4917]: I0318 09:13:03.093344 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="67765209a4f153d01c0e93247cdedd237f55738ba9f38d37cf02a4ba6359a70d" exitCode=0 Mar 18 09:13:03 crc kubenswrapper[4917]: I0318 09:13:03.093421 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"67765209a4f153d01c0e93247cdedd237f55738ba9f38d37cf02a4ba6359a70d"} Mar 18 09:13:03 crc kubenswrapper[4917]: I0318 09:13:03.093677 4917 scope.go:117] "RemoveContainer" containerID="f1c4ce283f3fbc45714638f7adc4e0302942da51d224ba146a639a634eb94e73" Mar 18 09:13:04 crc kubenswrapper[4917]: I0318 09:13:04.106345 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af"} Mar 18 09:13:16 crc kubenswrapper[4917]: I0318 09:13:16.889201 4917 scope.go:117] "RemoveContainer" containerID="c763888a30b0ead848dcecfeefb574af94d79233adeda08bb15c47d5aa9ff3a4" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 
09:14:00.142227 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563754-t4xm2"] Mar 18 09:14:00 crc kubenswrapper[4917]: E0318 09:14:00.143308 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.143324 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.143603 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4" containerName="oc" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.144533 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-t4xm2" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.146943 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.147842 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.149700 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.179987 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-t4xm2"] Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.307779 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w28rc\" (UniqueName: \"kubernetes.io/projected/a9c31f30-5439-4941-9abc-9d944e2fadd8-kube-api-access-w28rc\") pod \"auto-csr-approver-29563754-t4xm2\" (UID: \"a9c31f30-5439-4941-9abc-9d944e2fadd8\") " 
pod="openshift-infra/auto-csr-approver-29563754-t4xm2" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.409155 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w28rc\" (UniqueName: \"kubernetes.io/projected/a9c31f30-5439-4941-9abc-9d944e2fadd8-kube-api-access-w28rc\") pod \"auto-csr-approver-29563754-t4xm2\" (UID: \"a9c31f30-5439-4941-9abc-9d944e2fadd8\") " pod="openshift-infra/auto-csr-approver-29563754-t4xm2" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.435392 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w28rc\" (UniqueName: \"kubernetes.io/projected/a9c31f30-5439-4941-9abc-9d944e2fadd8-kube-api-access-w28rc\") pod \"auto-csr-approver-29563754-t4xm2\" (UID: \"a9c31f30-5439-4941-9abc-9d944e2fadd8\") " pod="openshift-infra/auto-csr-approver-29563754-t4xm2" Mar 18 09:14:00 crc kubenswrapper[4917]: I0318 09:14:00.474626 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-t4xm2" Mar 18 09:14:01 crc kubenswrapper[4917]: I0318 09:14:01.003780 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-t4xm2"] Mar 18 09:14:01 crc kubenswrapper[4917]: I0318 09:14:01.733116 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-t4xm2" event={"ID":"a9c31f30-5439-4941-9abc-9d944e2fadd8","Type":"ContainerStarted","Data":"f7c328a2112002a928fe758e4f6c9fbf0f1b4d124e0b22b4bb6bfb3f0bc3c578"} Mar 18 09:14:03 crc kubenswrapper[4917]: I0318 09:14:03.753659 4917 generic.go:334] "Generic (PLEG): container finished" podID="a9c31f30-5439-4941-9abc-9d944e2fadd8" containerID="14e152afbc8557766eaf26b84669dd851e2d75ff95bf651be33f4f5ea90a6caf" exitCode=0 Mar 18 09:14:03 crc kubenswrapper[4917]: I0318 09:14:03.753706 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29563754-t4xm2" event={"ID":"a9c31f30-5439-4941-9abc-9d944e2fadd8","Type":"ContainerDied","Data":"14e152afbc8557766eaf26b84669dd851e2d75ff95bf651be33f4f5ea90a6caf"} Mar 18 09:14:05 crc kubenswrapper[4917]: I0318 09:14:05.325453 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-t4xm2" Mar 18 09:14:05 crc kubenswrapper[4917]: I0318 09:14:05.422002 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w28rc\" (UniqueName: \"kubernetes.io/projected/a9c31f30-5439-4941-9abc-9d944e2fadd8-kube-api-access-w28rc\") pod \"a9c31f30-5439-4941-9abc-9d944e2fadd8\" (UID: \"a9c31f30-5439-4941-9abc-9d944e2fadd8\") " Mar 18 09:14:05 crc kubenswrapper[4917]: I0318 09:14:05.430898 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9c31f30-5439-4941-9abc-9d944e2fadd8-kube-api-access-w28rc" (OuterVolumeSpecName: "kube-api-access-w28rc") pod "a9c31f30-5439-4941-9abc-9d944e2fadd8" (UID: "a9c31f30-5439-4941-9abc-9d944e2fadd8"). InnerVolumeSpecName "kube-api-access-w28rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:14:05 crc kubenswrapper[4917]: I0318 09:14:05.525045 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w28rc\" (UniqueName: \"kubernetes.io/projected/a9c31f30-5439-4941-9abc-9d944e2fadd8-kube-api-access-w28rc\") on node \"crc\" DevicePath \"\"" Mar 18 09:14:05 crc kubenswrapper[4917]: I0318 09:14:05.786927 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563754-t4xm2" Mar 18 09:14:05 crc kubenswrapper[4917]: I0318 09:14:05.787653 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563754-t4xm2" event={"ID":"a9c31f30-5439-4941-9abc-9d944e2fadd8","Type":"ContainerDied","Data":"f7c328a2112002a928fe758e4f6c9fbf0f1b4d124e0b22b4bb6bfb3f0bc3c578"} Mar 18 09:14:05 crc kubenswrapper[4917]: I0318 09:14:05.787686 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c328a2112002a928fe758e4f6c9fbf0f1b4d124e0b22b4bb6bfb3f0bc3c578" Mar 18 09:14:06 crc kubenswrapper[4917]: I0318 09:14:06.407640 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-t66f7"] Mar 18 09:14:06 crc kubenswrapper[4917]: I0318 09:14:06.418822 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563748-t66f7"] Mar 18 09:14:07 crc kubenswrapper[4917]: I0318 09:14:07.792353 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a739d2ec-8a23-4347-b084-b57bd50a6774" path="/var/lib/kubelet/pods/a739d2ec-8a23-4347-b084-b57bd50a6774/volumes" Mar 18 09:14:16 crc kubenswrapper[4917]: I0318 09:14:16.998193 4917 scope.go:117] "RemoveContainer" containerID="3fa8be2dfbdb8810d1b8408b3697f40a4ca30f403cf626a788294554a7360529" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.163943 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv"] Mar 18 09:15:00 crc kubenswrapper[4917]: E0318 09:15:00.165429 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9c31f30-5439-4941-9abc-9d944e2fadd8" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.165451 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9c31f30-5439-4941-9abc-9d944e2fadd8" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4917]: 
I0318 09:15:00.165882 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9c31f30-5439-4941-9abc-9d944e2fadd8" containerName="oc" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.167064 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.168917 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.170125 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.181103 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv"] Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.280611 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bd965ae-442d-4f7e-8c32-295d25cc0c06-secret-volume\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.280658 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bd965ae-442d-4f7e-8c32-295d25cc0c06-config-volume\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.280811 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-t5fn7\" (UniqueName: \"kubernetes.io/projected/1bd965ae-442d-4f7e-8c32-295d25cc0c06-kube-api-access-t5fn7\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.383604 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bd965ae-442d-4f7e-8c32-295d25cc0c06-secret-volume\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.383673 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bd965ae-442d-4f7e-8c32-295d25cc0c06-config-volume\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.383740 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fn7\" (UniqueName: \"kubernetes.io/projected/1bd965ae-442d-4f7e-8c32-295d25cc0c06-kube-api-access-t5fn7\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.386106 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bd965ae-442d-4f7e-8c32-295d25cc0c06-config-volume\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc 
kubenswrapper[4917]: I0318 09:15:00.393784 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bd965ae-442d-4f7e-8c32-295d25cc0c06-secret-volume\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.408780 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fn7\" (UniqueName: \"kubernetes.io/projected/1bd965ae-442d-4f7e-8c32-295d25cc0c06-kube-api-access-t5fn7\") pod \"collect-profiles-29563755-cdcfv\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.492448 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:00 crc kubenswrapper[4917]: I0318 09:15:00.965255 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv"] Mar 18 09:15:00 crc kubenswrapper[4917]: W0318 09:15:00.999802 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bd965ae_442d_4f7e_8c32_295d25cc0c06.slice/crio-de48d0ab8c189e09f4d6168e538d1882a9af453b8df35d27d0b35af39936641c WatchSource:0}: Error finding container de48d0ab8c189e09f4d6168e538d1882a9af453b8df35d27d0b35af39936641c: Status 404 returned error can't find the container with id de48d0ab8c189e09f4d6168e538d1882a9af453b8df35d27d0b35af39936641c Mar 18 09:15:01 crc kubenswrapper[4917]: I0318 09:15:01.286470 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" 
event={"ID":"1bd965ae-442d-4f7e-8c32-295d25cc0c06","Type":"ContainerStarted","Data":"f3e44217b3a630d4547de57d7b1a9832a17be9f549298d8a912b03c7535c18f4"} Mar 18 09:15:01 crc kubenswrapper[4917]: I0318 09:15:01.286834 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" event={"ID":"1bd965ae-442d-4f7e-8c32-295d25cc0c06","Type":"ContainerStarted","Data":"de48d0ab8c189e09f4d6168e538d1882a9af453b8df35d27d0b35af39936641c"} Mar 18 09:15:01 crc kubenswrapper[4917]: I0318 09:15:01.307165 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" podStartSLOduration=1.307141582 podStartE2EDuration="1.307141582s" podCreationTimestamp="2026-03-18 09:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:15:01.300147892 +0000 UTC m=+8886.241302606" watchObservedRunningTime="2026-03-18 09:15:01.307141582 +0000 UTC m=+8886.248296316" Mar 18 09:15:02 crc kubenswrapper[4917]: I0318 09:15:02.296491 4917 generic.go:334] "Generic (PLEG): container finished" podID="1bd965ae-442d-4f7e-8c32-295d25cc0c06" containerID="f3e44217b3a630d4547de57d7b1a9832a17be9f549298d8a912b03c7535c18f4" exitCode=0 Mar 18 09:15:02 crc kubenswrapper[4917]: I0318 09:15:02.296642 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" event={"ID":"1bd965ae-442d-4f7e-8c32-295d25cc0c06","Type":"ContainerDied","Data":"f3e44217b3a630d4547de57d7b1a9832a17be9f549298d8a912b03c7535c18f4"} Mar 18 09:15:03 crc kubenswrapper[4917]: I0318 09:15:03.892760 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.056784 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5fn7\" (UniqueName: \"kubernetes.io/projected/1bd965ae-442d-4f7e-8c32-295d25cc0c06-kube-api-access-t5fn7\") pod \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.056858 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bd965ae-442d-4f7e-8c32-295d25cc0c06-config-volume\") pod \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.056980 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bd965ae-442d-4f7e-8c32-295d25cc0c06-secret-volume\") pod \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\" (UID: \"1bd965ae-442d-4f7e-8c32-295d25cc0c06\") " Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.057890 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bd965ae-442d-4f7e-8c32-295d25cc0c06-config-volume" (OuterVolumeSpecName: "config-volume") pod "1bd965ae-442d-4f7e-8c32-295d25cc0c06" (UID: "1bd965ae-442d-4f7e-8c32-295d25cc0c06"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.062673 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bd965ae-442d-4f7e-8c32-295d25cc0c06-kube-api-access-t5fn7" (OuterVolumeSpecName: "kube-api-access-t5fn7") pod "1bd965ae-442d-4f7e-8c32-295d25cc0c06" (UID: "1bd965ae-442d-4f7e-8c32-295d25cc0c06"). 
InnerVolumeSpecName "kube-api-access-t5fn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.066838 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bd965ae-442d-4f7e-8c32-295d25cc0c06-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1bd965ae-442d-4f7e-8c32-295d25cc0c06" (UID: "1bd965ae-442d-4f7e-8c32-295d25cc0c06"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.159730 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1bd965ae-442d-4f7e-8c32-295d25cc0c06-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.159771 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1bd965ae-442d-4f7e-8c32-295d25cc0c06-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.159786 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5fn7\" (UniqueName: \"kubernetes.io/projected/1bd965ae-442d-4f7e-8c32-295d25cc0c06-kube-api-access-t5fn7\") on node \"crc\" DevicePath \"\"" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.314863 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" event={"ID":"1bd965ae-442d-4f7e-8c32-295d25cc0c06","Type":"ContainerDied","Data":"de48d0ab8c189e09f4d6168e538d1882a9af453b8df35d27d0b35af39936641c"} Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.314900 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de48d0ab8c189e09f4d6168e538d1882a9af453b8df35d27d0b35af39936641c" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.314934 4917 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563755-cdcfv" Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.390539 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp"] Mar 18 09:15:04 crc kubenswrapper[4917]: I0318 09:15:04.404748 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563710-lfvrp"] Mar 18 09:15:05 crc kubenswrapper[4917]: I0318 09:15:05.786266 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b9f4ce5-4103-41a4-8874-39c1d4697c45" path="/var/lib/kubelet/pods/4b9f4ce5-4103-41a4-8874-39c1d4697c45/volumes" Mar 18 09:15:17 crc kubenswrapper[4917]: I0318 09:15:17.087436 4917 scope.go:117] "RemoveContainer" containerID="436adbffa75c926bea21413d97049e7a68a9cc3d5ebec46fd13a9b904ad67375" Mar 18 09:15:32 crc kubenswrapper[4917]: I0318 09:15:32.928601 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:15:32 crc kubenswrapper[4917]: I0318 09:15:32.929094 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.841201 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-klbhj"] Mar 18 09:15:51 crc kubenswrapper[4917]: E0318 09:15:51.844929 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1bd965ae-442d-4f7e-8c32-295d25cc0c06" containerName="collect-profiles" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.844963 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bd965ae-442d-4f7e-8c32-295d25cc0c06" containerName="collect-profiles" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.845334 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bd965ae-442d-4f7e-8c32-295d25cc0c06" containerName="collect-profiles" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.848652 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.854413 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-klbhj"] Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.870484 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4dzd\" (UniqueName: \"kubernetes.io/projected/279506a5-dd50-4b7a-9f4a-26ab98154942-kube-api-access-x4dzd\") pod \"certified-operators-klbhj\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.871379 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-utilities\") pod \"certified-operators-klbhj\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.871421 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-catalog-content\") pod \"certified-operators-klbhj\" (UID: 
\"279506a5-dd50-4b7a-9f4a-26ab98154942\") " pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.973422 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4dzd\" (UniqueName: \"kubernetes.io/projected/279506a5-dd50-4b7a-9f4a-26ab98154942-kube-api-access-x4dzd\") pod \"certified-operators-klbhj\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.973990 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-utilities\") pod \"certified-operators-klbhj\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.974036 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-catalog-content\") pod \"certified-operators-klbhj\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.974546 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-catalog-content\") pod \"certified-operators-klbhj\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.974700 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-utilities\") pod \"certified-operators-klbhj\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") 
" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:51 crc kubenswrapper[4917]: I0318 09:15:51.998817 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4dzd\" (UniqueName: \"kubernetes.io/projected/279506a5-dd50-4b7a-9f4a-26ab98154942-kube-api-access-x4dzd\") pod \"certified-operators-klbhj\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:52 crc kubenswrapper[4917]: I0318 09:15:52.189699 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:15:52 crc kubenswrapper[4917]: I0318 09:15:52.828149 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-klbhj"] Mar 18 09:15:53 crc kubenswrapper[4917]: I0318 09:15:53.817990 4917 generic.go:334] "Generic (PLEG): container finished" podID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerID="230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f" exitCode=0 Mar 18 09:15:53 crc kubenswrapper[4917]: I0318 09:15:53.818099 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klbhj" event={"ID":"279506a5-dd50-4b7a-9f4a-26ab98154942","Type":"ContainerDied","Data":"230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f"} Mar 18 09:15:53 crc kubenswrapper[4917]: I0318 09:15:53.818374 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klbhj" event={"ID":"279506a5-dd50-4b7a-9f4a-26ab98154942","Type":"ContainerStarted","Data":"c15ddece6f33e62dbe2a658cb7f6f828d48320fbf30628acd63f1bc5bd3a6157"} Mar 18 09:15:53 crc kubenswrapper[4917]: I0318 09:15:53.820082 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:15:55 crc kubenswrapper[4917]: I0318 09:15:55.847856 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-klbhj" event={"ID":"279506a5-dd50-4b7a-9f4a-26ab98154942","Type":"ContainerStarted","Data":"5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6"} Mar 18 09:15:56 crc kubenswrapper[4917]: I0318 09:15:56.860815 4917 generic.go:334] "Generic (PLEG): container finished" podID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerID="5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6" exitCode=0 Mar 18 09:15:56 crc kubenswrapper[4917]: I0318 09:15:56.860910 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klbhj" event={"ID":"279506a5-dd50-4b7a-9f4a-26ab98154942","Type":"ContainerDied","Data":"5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6"} Mar 18 09:15:57 crc kubenswrapper[4917]: I0318 09:15:57.871747 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klbhj" event={"ID":"279506a5-dd50-4b7a-9f4a-26ab98154942","Type":"ContainerStarted","Data":"51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a"} Mar 18 09:15:57 crc kubenswrapper[4917]: I0318 09:15:57.897645 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-klbhj" podStartSLOduration=3.442464194 podStartE2EDuration="6.89762997s" podCreationTimestamp="2026-03-18 09:15:51 +0000 UTC" firstStartedPulling="2026-03-18 09:15:53.819781587 +0000 UTC m=+8938.760936311" lastFinishedPulling="2026-03-18 09:15:57.274947373 +0000 UTC m=+8942.216102087" observedRunningTime="2026-03-18 09:15:57.892433955 +0000 UTC m=+8942.833588669" watchObservedRunningTime="2026-03-18 09:15:57.89762997 +0000 UTC m=+8942.838784684" Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.207047 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563756-9l56h"] Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.208858 4917 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-9l56h" Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.212559 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.212878 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.212884 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.240236 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-9l56h"] Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.355685 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gbc4\" (UniqueName: \"kubernetes.io/projected/224ca311-56b5-49d6-9363-83b4671e465a-kube-api-access-7gbc4\") pod \"auto-csr-approver-29563756-9l56h\" (UID: \"224ca311-56b5-49d6-9363-83b4671e465a\") " pod="openshift-infra/auto-csr-approver-29563756-9l56h" Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.457245 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gbc4\" (UniqueName: \"kubernetes.io/projected/224ca311-56b5-49d6-9363-83b4671e465a-kube-api-access-7gbc4\") pod \"auto-csr-approver-29563756-9l56h\" (UID: \"224ca311-56b5-49d6-9363-83b4671e465a\") " pod="openshift-infra/auto-csr-approver-29563756-9l56h" Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.477304 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gbc4\" (UniqueName: \"kubernetes.io/projected/224ca311-56b5-49d6-9363-83b4671e465a-kube-api-access-7gbc4\") pod \"auto-csr-approver-29563756-9l56h\" (UID: 
\"224ca311-56b5-49d6-9363-83b4671e465a\") " pod="openshift-infra/auto-csr-approver-29563756-9l56h" Mar 18 09:16:00 crc kubenswrapper[4917]: I0318 09:16:00.534456 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-9l56h" Mar 18 09:16:01 crc kubenswrapper[4917]: I0318 09:16:01.222951 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-9l56h"] Mar 18 09:16:01 crc kubenswrapper[4917]: W0318 09:16:01.230492 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod224ca311_56b5_49d6_9363_83b4671e465a.slice/crio-80dc30dbc12811967eadbed35e634fe1f3d117efa2b51203fde0c644fb5aa255 WatchSource:0}: Error finding container 80dc30dbc12811967eadbed35e634fe1f3d117efa2b51203fde0c644fb5aa255: Status 404 returned error can't find the container with id 80dc30dbc12811967eadbed35e634fe1f3d117efa2b51203fde0c644fb5aa255 Mar 18 09:16:01 crc kubenswrapper[4917]: I0318 09:16:01.910193 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-9l56h" event={"ID":"224ca311-56b5-49d6-9363-83b4671e465a","Type":"ContainerStarted","Data":"80dc30dbc12811967eadbed35e634fe1f3d117efa2b51203fde0c644fb5aa255"} Mar 18 09:16:02 crc kubenswrapper[4917]: I0318 09:16:02.190354 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:16:02 crc kubenswrapper[4917]: I0318 09:16:02.190406 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:16:02 crc kubenswrapper[4917]: I0318 09:16:02.249136 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:16:02 crc kubenswrapper[4917]: I0318 09:16:02.922990 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-9l56h" event={"ID":"224ca311-56b5-49d6-9363-83b4671e465a","Type":"ContainerStarted","Data":"6deceed5dc238cfac13b076513f9a9d1df47d2f87fd6dbc4c67a22144eef1e7f"} Mar 18 09:16:02 crc kubenswrapper[4917]: I0318 09:16:02.928909 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:16:02 crc kubenswrapper[4917]: I0318 09:16:02.928998 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:16:02 crc kubenswrapper[4917]: I0318 09:16:02.940488 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563756-9l56h" podStartSLOduration=1.681892736 podStartE2EDuration="2.940466062s" podCreationTimestamp="2026-03-18 09:16:00 +0000 UTC" firstStartedPulling="2026-03-18 09:16:01.233283159 +0000 UTC m=+8946.174437883" lastFinishedPulling="2026-03-18 09:16:02.491856465 +0000 UTC m=+8947.433011209" observedRunningTime="2026-03-18 09:16:02.937619623 +0000 UTC m=+8947.878774357" watchObservedRunningTime="2026-03-18 09:16:02.940466062 +0000 UTC m=+8947.881620786" Mar 18 09:16:02 crc kubenswrapper[4917]: I0318 09:16:02.976776 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:16:03 crc kubenswrapper[4917]: I0318 09:16:03.033370 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-klbhj"] Mar 18 09:16:03 crc 
kubenswrapper[4917]: I0318 09:16:03.937543 4917 generic.go:334] "Generic (PLEG): container finished" podID="224ca311-56b5-49d6-9363-83b4671e465a" containerID="6deceed5dc238cfac13b076513f9a9d1df47d2f87fd6dbc4c67a22144eef1e7f" exitCode=0 Mar 18 09:16:03 crc kubenswrapper[4917]: I0318 09:16:03.937738 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-9l56h" event={"ID":"224ca311-56b5-49d6-9363-83b4671e465a","Type":"ContainerDied","Data":"6deceed5dc238cfac13b076513f9a9d1df47d2f87fd6dbc4c67a22144eef1e7f"} Mar 18 09:16:04 crc kubenswrapper[4917]: I0318 09:16:04.947928 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-klbhj" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerName="registry-server" containerID="cri-o://51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a" gracePeriod=2 Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.336476 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-9l56h" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.497800 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gbc4\" (UniqueName: \"kubernetes.io/projected/224ca311-56b5-49d6-9363-83b4671e465a-kube-api-access-7gbc4\") pod \"224ca311-56b5-49d6-9363-83b4671e465a\" (UID: \"224ca311-56b5-49d6-9363-83b4671e465a\") " Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.502795 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/224ca311-56b5-49d6-9363-83b4671e465a-kube-api-access-7gbc4" (OuterVolumeSpecName: "kube-api-access-7gbc4") pod "224ca311-56b5-49d6-9363-83b4671e465a" (UID: "224ca311-56b5-49d6-9363-83b4671e465a"). InnerVolumeSpecName "kube-api-access-7gbc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.600755 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gbc4\" (UniqueName: \"kubernetes.io/projected/224ca311-56b5-49d6-9363-83b4671e465a-kube-api-access-7gbc4\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.634714 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.804535 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4dzd\" (UniqueName: \"kubernetes.io/projected/279506a5-dd50-4b7a-9f4a-26ab98154942-kube-api-access-x4dzd\") pod \"279506a5-dd50-4b7a-9f4a-26ab98154942\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.804706 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-utilities\") pod \"279506a5-dd50-4b7a-9f4a-26ab98154942\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.804892 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-catalog-content\") pod \"279506a5-dd50-4b7a-9f4a-26ab98154942\" (UID: \"279506a5-dd50-4b7a-9f4a-26ab98154942\") " Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.805752 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-utilities" (OuterVolumeSpecName: "utilities") pod "279506a5-dd50-4b7a-9f4a-26ab98154942" (UID: "279506a5-dd50-4b7a-9f4a-26ab98154942"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.807460 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.809415 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279506a5-dd50-4b7a-9f4a-26ab98154942-kube-api-access-x4dzd" (OuterVolumeSpecName: "kube-api-access-x4dzd") pod "279506a5-dd50-4b7a-9f4a-26ab98154942" (UID: "279506a5-dd50-4b7a-9f4a-26ab98154942"). InnerVolumeSpecName "kube-api-access-x4dzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.857212 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "279506a5-dd50-4b7a-9f4a-26ab98154942" (UID: "279506a5-dd50-4b7a-9f4a-26ab98154942"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.909735 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/279506a5-dd50-4b7a-9f4a-26ab98154942-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.909794 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4dzd\" (UniqueName: \"kubernetes.io/projected/279506a5-dd50-4b7a-9f4a-26ab98154942-kube-api-access-x4dzd\") on node \"crc\" DevicePath \"\"" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.958861 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563756-9l56h" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.958894 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563756-9l56h" event={"ID":"224ca311-56b5-49d6-9363-83b4671e465a","Type":"ContainerDied","Data":"80dc30dbc12811967eadbed35e634fe1f3d117efa2b51203fde0c644fb5aa255"} Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.958938 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80dc30dbc12811967eadbed35e634fe1f3d117efa2b51203fde0c644fb5aa255" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.963232 4917 generic.go:334] "Generic (PLEG): container finished" podID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerID="51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a" exitCode=0 Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.963512 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klbhj" event={"ID":"279506a5-dd50-4b7a-9f4a-26ab98154942","Type":"ContainerDied","Data":"51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a"} Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.963539 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-klbhj" event={"ID":"279506a5-dd50-4b7a-9f4a-26ab98154942","Type":"ContainerDied","Data":"c15ddece6f33e62dbe2a658cb7f6f828d48320fbf30628acd63f1bc5bd3a6157"} Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.963558 4917 scope.go:117] "RemoveContainer" containerID="51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a" Mar 18 09:16:05 crc kubenswrapper[4917]: I0318 09:16:05.963715 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-klbhj" Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.015576 4917 scope.go:117] "RemoveContainer" containerID="5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6" Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.033114 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-klbhj"] Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.044983 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-klbhj"] Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.052629 4917 scope.go:117] "RemoveContainer" containerID="230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f" Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.056756 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-6sqbc"] Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.070045 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563750-6sqbc"] Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.080190 4917 scope.go:117] "RemoveContainer" containerID="51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a" Mar 18 09:16:06 crc kubenswrapper[4917]: E0318 09:16:06.080874 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a\": container with ID starting with 51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a not found: ID does not exist" containerID="51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a" Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.080905 4917 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a"} err="failed to get container status \"51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a\": rpc error: code = NotFound desc = could not find container \"51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a\": container with ID starting with 51260a925e7da3334025a927e12d2dcc3b5a20f7793c8fef374fe578beb1a79a not found: ID does not exist" Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.080925 4917 scope.go:117] "RemoveContainer" containerID="5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6" Mar 18 09:16:06 crc kubenswrapper[4917]: E0318 09:16:06.081175 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6\": container with ID starting with 5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6 not found: ID does not exist" containerID="5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6" Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.081321 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6"} err="failed to get container status \"5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6\": rpc error: code = NotFound desc = could not find container \"5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6\": container with ID starting with 5db54ba06023150d6d60c041113f3f933a33eee6c789bb1bad75d329ba7043d6 not found: ID does not exist" Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.081341 4917 scope.go:117] "RemoveContainer" containerID="230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f" Mar 18 09:16:06 crc kubenswrapper[4917]: E0318 09:16:06.081561 4917 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f\": container with ID starting with 230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f not found: ID does not exist" containerID="230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f" Mar 18 09:16:06 crc kubenswrapper[4917]: I0318 09:16:06.081623 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f"} err="failed to get container status \"230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f\": rpc error: code = NotFound desc = could not find container \"230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f\": container with ID starting with 230bef1a5de854744eedfa2388cdd4c856cdeba18f8a6e0211c1cbdc12dcc84f not found: ID does not exist" Mar 18 09:16:07 crc kubenswrapper[4917]: I0318 09:16:07.803294 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" path="/var/lib/kubelet/pods/279506a5-dd50-4b7a-9f4a-26ab98154942/volumes" Mar 18 09:16:07 crc kubenswrapper[4917]: I0318 09:16:07.806513 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d621211b-6956-4999-b866-77cfc9b5daf7" path="/var/lib/kubelet/pods/d621211b-6956-4999-b866-77cfc9b5daf7/volumes" Mar 18 09:16:17 crc kubenswrapper[4917]: I0318 09:16:17.321566 4917 scope.go:117] "RemoveContainer" containerID="7cd285717773f60f5b9f90204f0d37c67408b622a06305ab5e34c0b3606b1b31" Mar 18 09:16:32 crc kubenswrapper[4917]: I0318 09:16:32.929838 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:16:32 crc 
kubenswrapper[4917]: I0318 09:16:32.930748 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:16:32 crc kubenswrapper[4917]: I0318 09:16:32.931509 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 09:16:32 crc kubenswrapper[4917]: I0318 09:16:32.932842 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:16:32 crc kubenswrapper[4917]: I0318 09:16:32.932980 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" gracePeriod=600 Mar 18 09:16:33 crc kubenswrapper[4917]: E0318 09:16:33.060454 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:16:33 crc kubenswrapper[4917]: I0318 09:16:33.213536 4917 generic.go:334] "Generic 
(PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" exitCode=0 Mar 18 09:16:33 crc kubenswrapper[4917]: I0318 09:16:33.213627 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af"} Mar 18 09:16:33 crc kubenswrapper[4917]: I0318 09:16:33.214752 4917 scope.go:117] "RemoveContainer" containerID="67765209a4f153d01c0e93247cdedd237f55738ba9f38d37cf02a4ba6359a70d" Mar 18 09:16:33 crc kubenswrapper[4917]: I0318 09:16:33.215907 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:16:33 crc kubenswrapper[4917]: E0318 09:16:33.216479 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:16:46 crc kubenswrapper[4917]: I0318 09:16:46.772972 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:16:46 crc kubenswrapper[4917]: E0318 09:16:46.773632 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:17:00 crc kubenswrapper[4917]: I0318 09:17:00.772536 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:17:00 crc kubenswrapper[4917]: E0318 09:17:00.773206 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:17:11 crc kubenswrapper[4917]: I0318 09:17:11.773716 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:17:11 crc kubenswrapper[4917]: E0318 09:17:11.774754 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:17:22 crc kubenswrapper[4917]: I0318 09:17:22.772830 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:17:22 crc kubenswrapper[4917]: E0318 09:17:22.773747 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:17:33 crc kubenswrapper[4917]: I0318 09:17:33.772356 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:17:33 crc kubenswrapper[4917]: E0318 09:17:33.773117 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:17:44 crc kubenswrapper[4917]: I0318 09:17:44.772377 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:17:44 crc kubenswrapper[4917]: E0318 09:17:44.773014 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:17:56 crc kubenswrapper[4917]: I0318 09:17:56.773195 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:17:56 crc kubenswrapper[4917]: E0318 09:17:56.773889 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.149640 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563758-q56hs"] Mar 18 09:18:00 crc kubenswrapper[4917]: E0318 09:18:00.150577 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="224ca311-56b5-49d6-9363-83b4671e465a" containerName="oc" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.150605 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="224ca311-56b5-49d6-9363-83b4671e465a" containerName="oc" Mar 18 09:18:00 crc kubenswrapper[4917]: E0318 09:18:00.150620 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.150625 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4917]: E0318 09:18:00.150648 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerName="extract-content" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.150654 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerName="extract-content" Mar 18 09:18:00 crc kubenswrapper[4917]: E0318 09:18:00.150671 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.150678 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerName="extract-utilities" Mar 18 09:18:00 crc kubenswrapper[4917]: 
I0318 09:18:00.150874 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="224ca311-56b5-49d6-9363-83b4671e465a" containerName="oc" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.150888 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="279506a5-dd50-4b7a-9f4a-26ab98154942" containerName="registry-server" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.151610 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-q56hs" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.154754 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.154838 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.154766 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.165408 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-q56hs"] Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.331891 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzw9r\" (UniqueName: \"kubernetes.io/projected/d1a4852f-8b12-447d-93b6-c325d1c5109e-kube-api-access-rzw9r\") pod \"auto-csr-approver-29563758-q56hs\" (UID: \"d1a4852f-8b12-447d-93b6-c325d1c5109e\") " pod="openshift-infra/auto-csr-approver-29563758-q56hs" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.433991 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzw9r\" (UniqueName: \"kubernetes.io/projected/d1a4852f-8b12-447d-93b6-c325d1c5109e-kube-api-access-rzw9r\") pod \"auto-csr-approver-29563758-q56hs\" 
(UID: \"d1a4852f-8b12-447d-93b6-c325d1c5109e\") " pod="openshift-infra/auto-csr-approver-29563758-q56hs" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.467043 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzw9r\" (UniqueName: \"kubernetes.io/projected/d1a4852f-8b12-447d-93b6-c325d1c5109e-kube-api-access-rzw9r\") pod \"auto-csr-approver-29563758-q56hs\" (UID: \"d1a4852f-8b12-447d-93b6-c325d1c5109e\") " pod="openshift-infra/auto-csr-approver-29563758-q56hs" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.475703 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-q56hs" Mar 18 09:18:00 crc kubenswrapper[4917]: I0318 09:18:00.953620 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-q56hs"] Mar 18 09:18:01 crc kubenswrapper[4917]: I0318 09:18:01.072023 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-q56hs" event={"ID":"d1a4852f-8b12-447d-93b6-c325d1c5109e","Type":"ContainerStarted","Data":"fccd90cd625e936429546999b5b80353dba09c115643a4412623af35e77d224f"} Mar 18 09:18:03 crc kubenswrapper[4917]: I0318 09:18:03.088774 4917 generic.go:334] "Generic (PLEG): container finished" podID="d1a4852f-8b12-447d-93b6-c325d1c5109e" containerID="9eedb725cae0f961441ea3fb8e05828474a44b87e67341edb50400b8b0449055" exitCode=0 Mar 18 09:18:03 crc kubenswrapper[4917]: I0318 09:18:03.089256 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-q56hs" event={"ID":"d1a4852f-8b12-447d-93b6-c325d1c5109e","Type":"ContainerDied","Data":"9eedb725cae0f961441ea3fb8e05828474a44b87e67341edb50400b8b0449055"} Mar 18 09:18:04 crc kubenswrapper[4917]: I0318 09:18:04.583950 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-q56hs" Mar 18 09:18:04 crc kubenswrapper[4917]: I0318 09:18:04.729707 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzw9r\" (UniqueName: \"kubernetes.io/projected/d1a4852f-8b12-447d-93b6-c325d1c5109e-kube-api-access-rzw9r\") pod \"d1a4852f-8b12-447d-93b6-c325d1c5109e\" (UID: \"d1a4852f-8b12-447d-93b6-c325d1c5109e\") " Mar 18 09:18:04 crc kubenswrapper[4917]: I0318 09:18:04.735535 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a4852f-8b12-447d-93b6-c325d1c5109e-kube-api-access-rzw9r" (OuterVolumeSpecName: "kube-api-access-rzw9r") pod "d1a4852f-8b12-447d-93b6-c325d1c5109e" (UID: "d1a4852f-8b12-447d-93b6-c325d1c5109e"). InnerVolumeSpecName "kube-api-access-rzw9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:18:04 crc kubenswrapper[4917]: I0318 09:18:04.832126 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzw9r\" (UniqueName: \"kubernetes.io/projected/d1a4852f-8b12-447d-93b6-c325d1c5109e-kube-api-access-rzw9r\") on node \"crc\" DevicePath \"\"" Mar 18 09:18:05 crc kubenswrapper[4917]: I0318 09:18:05.109114 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563758-q56hs" event={"ID":"d1a4852f-8b12-447d-93b6-c325d1c5109e","Type":"ContainerDied","Data":"fccd90cd625e936429546999b5b80353dba09c115643a4412623af35e77d224f"} Mar 18 09:18:05 crc kubenswrapper[4917]: I0318 09:18:05.109150 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fccd90cd625e936429546999b5b80353dba09c115643a4412623af35e77d224f" Mar 18 09:18:05 crc kubenswrapper[4917]: I0318 09:18:05.109160 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563758-q56hs" Mar 18 09:18:05 crc kubenswrapper[4917]: I0318 09:18:05.662267 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-qrwbn"] Mar 18 09:18:05 crc kubenswrapper[4917]: I0318 09:18:05.675224 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563752-qrwbn"] Mar 18 09:18:05 crc kubenswrapper[4917]: I0318 09:18:05.786979 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4" path="/var/lib/kubelet/pods/c0d1525b-fa9d-4e23-a27b-bd90f3a9ebf4/volumes" Mar 18 09:18:10 crc kubenswrapper[4917]: I0318 09:18:10.772863 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:18:10 crc kubenswrapper[4917]: E0318 09:18:10.773683 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:18:17 crc kubenswrapper[4917]: I0318 09:18:17.438479 4917 scope.go:117] "RemoveContainer" containerID="1d1fbf0de98f051412c1396f8fdecd27f562184957f96bb074b576a499bdd6f8" Mar 18 09:18:22 crc kubenswrapper[4917]: I0318 09:18:22.772928 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:18:22 crc kubenswrapper[4917]: E0318 09:18:22.774196 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:18:36 crc kubenswrapper[4917]: I0318 09:18:36.773177 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:18:36 crc kubenswrapper[4917]: E0318 09:18:36.774092 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:18:47 crc kubenswrapper[4917]: I0318 09:18:47.773131 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:18:47 crc kubenswrapper[4917]: E0318 09:18:47.773929 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:19:00 crc kubenswrapper[4917]: I0318 09:19:00.773910 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:19:00 crc kubenswrapper[4917]: E0318 09:19:00.775246 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.619894 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nzrf7"] Mar 18 09:19:03 crc kubenswrapper[4917]: E0318 09:19:03.620560 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a4852f-8b12-447d-93b6-c325d1c5109e" containerName="oc" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.620573 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a4852f-8b12-447d-93b6-c325d1c5109e" containerName="oc" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.620792 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a4852f-8b12-447d-93b6-c325d1c5109e" containerName="oc" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.622282 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.651029 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nzrf7"] Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.715662 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-utilities\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.715720 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-649fq\" (UniqueName: \"kubernetes.io/projected/51a3ea95-f7a2-41ec-857f-440f93273b2e-kube-api-access-649fq\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.716138 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-catalog-content\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.817964 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-catalog-content\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.818115 4917 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-utilities\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.818149 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-649fq\" (UniqueName: \"kubernetes.io/projected/51a3ea95-f7a2-41ec-857f-440f93273b2e-kube-api-access-649fq\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.818804 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-catalog-content\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.819054 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-utilities\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.838455 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-649fq\" (UniqueName: \"kubernetes.io/projected/51a3ea95-f7a2-41ec-857f-440f93273b2e-kube-api-access-649fq\") pod \"community-operators-nzrf7\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:03 crc kubenswrapper[4917]: I0318 09:19:03.944918 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:04 crc kubenswrapper[4917]: I0318 09:19:04.525138 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nzrf7"] Mar 18 09:19:04 crc kubenswrapper[4917]: I0318 09:19:04.742833 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzrf7" event={"ID":"51a3ea95-f7a2-41ec-857f-440f93273b2e","Type":"ContainerStarted","Data":"877bb26069ce5a47670ffddf59f0905c232cb4f606fb03b22dbbe6c8129cafab"} Mar 18 09:19:05 crc kubenswrapper[4917]: I0318 09:19:05.759073 4917 generic.go:334] "Generic (PLEG): container finished" podID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerID="e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3" exitCode=0 Mar 18 09:19:05 crc kubenswrapper[4917]: I0318 09:19:05.759420 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzrf7" event={"ID":"51a3ea95-f7a2-41ec-857f-440f93273b2e","Type":"ContainerDied","Data":"e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3"} Mar 18 09:19:07 crc kubenswrapper[4917]: I0318 09:19:07.789235 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzrf7" event={"ID":"51a3ea95-f7a2-41ec-857f-440f93273b2e","Type":"ContainerStarted","Data":"d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb"} Mar 18 09:19:09 crc kubenswrapper[4917]: I0318 09:19:09.808778 4917 generic.go:334] "Generic (PLEG): container finished" podID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerID="d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb" exitCode=0 Mar 18 09:19:09 crc kubenswrapper[4917]: I0318 09:19:09.808885 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzrf7" 
event={"ID":"51a3ea95-f7a2-41ec-857f-440f93273b2e","Type":"ContainerDied","Data":"d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb"} Mar 18 09:19:10 crc kubenswrapper[4917]: I0318 09:19:10.820158 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzrf7" event={"ID":"51a3ea95-f7a2-41ec-857f-440f93273b2e","Type":"ContainerStarted","Data":"fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb"} Mar 18 09:19:10 crc kubenswrapper[4917]: I0318 09:19:10.908744 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nzrf7" podStartSLOduration=3.433890242 podStartE2EDuration="7.908710251s" podCreationTimestamp="2026-03-18 09:19:03 +0000 UTC" firstStartedPulling="2026-03-18 09:19:05.761792456 +0000 UTC m=+9130.702947180" lastFinishedPulling="2026-03-18 09:19:10.236612475 +0000 UTC m=+9135.177767189" observedRunningTime="2026-03-18 09:19:10.901880435 +0000 UTC m=+9135.843035169" watchObservedRunningTime="2026-03-18 09:19:10.908710251 +0000 UTC m=+9135.849864965" Mar 18 09:19:13 crc kubenswrapper[4917]: I0318 09:19:13.945391 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:13 crc kubenswrapper[4917]: I0318 09:19:13.945698 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:13 crc kubenswrapper[4917]: I0318 09:19:13.999422 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:15 crc kubenswrapper[4917]: I0318 09:19:15.781439 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:19:15 crc kubenswrapper[4917]: E0318 09:19:15.782342 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:19:24 crc kubenswrapper[4917]: I0318 09:19:24.003526 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:24 crc kubenswrapper[4917]: I0318 09:19:24.059668 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nzrf7"] Mar 18 09:19:24 crc kubenswrapper[4917]: I0318 09:19:24.954224 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nzrf7" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerName="registry-server" containerID="cri-o://fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb" gracePeriod=2 Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.659234 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.799661 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-catalog-content\") pod \"51a3ea95-f7a2-41ec-857f-440f93273b2e\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.799743 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-utilities\") pod \"51a3ea95-f7a2-41ec-857f-440f93273b2e\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.799789 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-649fq\" (UniqueName: \"kubernetes.io/projected/51a3ea95-f7a2-41ec-857f-440f93273b2e-kube-api-access-649fq\") pod \"51a3ea95-f7a2-41ec-857f-440f93273b2e\" (UID: \"51a3ea95-f7a2-41ec-857f-440f93273b2e\") " Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.800661 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-utilities" (OuterVolumeSpecName: "utilities") pod "51a3ea95-f7a2-41ec-857f-440f93273b2e" (UID: "51a3ea95-f7a2-41ec-857f-440f93273b2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.805619 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51a3ea95-f7a2-41ec-857f-440f93273b2e-kube-api-access-649fq" (OuterVolumeSpecName: "kube-api-access-649fq") pod "51a3ea95-f7a2-41ec-857f-440f93273b2e" (UID: "51a3ea95-f7a2-41ec-857f-440f93273b2e"). InnerVolumeSpecName "kube-api-access-649fq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.863856 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51a3ea95-f7a2-41ec-857f-440f93273b2e" (UID: "51a3ea95-f7a2-41ec-857f-440f93273b2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.902712 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.902741 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51a3ea95-f7a2-41ec-857f-440f93273b2e-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.902751 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-649fq\" (UniqueName: \"kubernetes.io/projected/51a3ea95-f7a2-41ec-857f-440f93273b2e-kube-api-access-649fq\") on node \"crc\" DevicePath \"\"" Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.975121 4917 generic.go:334] "Generic (PLEG): container finished" podID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerID="fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb" exitCode=0 Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.975169 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nzrf7" event={"ID":"51a3ea95-f7a2-41ec-857f-440f93273b2e","Type":"ContainerDied","Data":"fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb"} Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.975196 4917 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-nzrf7" event={"ID":"51a3ea95-f7a2-41ec-857f-440f93273b2e","Type":"ContainerDied","Data":"877bb26069ce5a47670ffddf59f0905c232cb4f606fb03b22dbbe6c8129cafab"} Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.975212 4917 scope.go:117] "RemoveContainer" containerID="fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb" Mar 18 09:19:25 crc kubenswrapper[4917]: I0318 09:19:25.975643 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nzrf7" Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.001200 4917 scope.go:117] "RemoveContainer" containerID="d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb" Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.021812 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nzrf7"] Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.034204 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nzrf7"] Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.452772 4917 scope.go:117] "RemoveContainer" containerID="e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3" Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.514904 4917 scope.go:117] "RemoveContainer" containerID="fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb" Mar 18 09:19:26 crc kubenswrapper[4917]: E0318 09:19:26.515296 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb\": container with ID starting with fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb not found: ID does not exist" containerID="fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb" Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 
09:19:26.515329 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb"} err="failed to get container status \"fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb\": rpc error: code = NotFound desc = could not find container \"fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb\": container with ID starting with fed554eca387bdb44bf43e5c91a68fca645daf2543bd93c92a9cc81826330fdb not found: ID does not exist" Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.515350 4917 scope.go:117] "RemoveContainer" containerID="d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb" Mar 18 09:19:26 crc kubenswrapper[4917]: E0318 09:19:26.515748 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb\": container with ID starting with d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb not found: ID does not exist" containerID="d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb" Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.515797 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb"} err="failed to get container status \"d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb\": rpc error: code = NotFound desc = could not find container \"d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb\": container with ID starting with d7468f4d56c6a2f47c742e7d7cf1c0e25dd6e4ebc27382511cf6904ca6e64cfb not found: ID does not exist" Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.515831 4917 scope.go:117] "RemoveContainer" containerID="e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3" Mar 18 09:19:26 crc 
kubenswrapper[4917]: E0318 09:19:26.516125 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3\": container with ID starting with e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3 not found: ID does not exist" containerID="e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3" Mar 18 09:19:26 crc kubenswrapper[4917]: I0318 09:19:26.516152 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3"} err="failed to get container status \"e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3\": rpc error: code = NotFound desc = could not find container \"e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3\": container with ID starting with e41fb1282c30f34f32382c91e0c9318b67a618124fc23c95adf1ba75fb913cd3 not found: ID does not exist" Mar 18 09:19:27 crc kubenswrapper[4917]: I0318 09:19:27.787462 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" path="/var/lib/kubelet/pods/51a3ea95-f7a2-41ec-857f-440f93273b2e/volumes" Mar 18 09:19:30 crc kubenswrapper[4917]: I0318 09:19:30.775271 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:19:30 crc kubenswrapper[4917]: E0318 09:19:30.777664 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:19:41 crc 
kubenswrapper[4917]: I0318 09:19:41.772871 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:19:41 crc kubenswrapper[4917]: E0318 09:19:41.773505 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.434030 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbqc"] Mar 18 09:19:48 crc kubenswrapper[4917]: E0318 09:19:48.435361 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerName="registry-server" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.435384 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerName="registry-server" Mar 18 09:19:48 crc kubenswrapper[4917]: E0318 09:19:48.435428 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerName="extract-content" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.435440 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerName="extract-content" Mar 18 09:19:48 crc kubenswrapper[4917]: E0318 09:19:48.435485 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerName="extract-utilities" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.435498 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" 
containerName="extract-utilities" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.435878 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="51a3ea95-f7a2-41ec-857f-440f93273b2e" containerName="registry-server" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.438391 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.449001 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbqc"] Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.569682 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-utilities\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.569777 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntmj\" (UniqueName: \"kubernetes.io/projected/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-kube-api-access-fntmj\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.569801 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-catalog-content\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.672631 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-utilities\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.673130 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fntmj\" (UniqueName: \"kubernetes.io/projected/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-kube-api-access-fntmj\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.673171 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-catalog-content\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.673315 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-utilities\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.673856 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-catalog-content\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.698351 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fntmj\" (UniqueName: 
\"kubernetes.io/projected/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-kube-api-access-fntmj\") pod \"redhat-marketplace-bkbqc\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:48 crc kubenswrapper[4917]: I0318 09:19:48.766826 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:49 crc kubenswrapper[4917]: I0318 09:19:49.319163 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbqc"] Mar 18 09:19:49 crc kubenswrapper[4917]: W0318 09:19:49.321258 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa3ecceb_632a_46b6_a1b0_9e20dce81ed0.slice/crio-db4cfbb0abe887183872014b4337a70ff9b020867e2b0550ee76a8e5d692f0b4 WatchSource:0}: Error finding container db4cfbb0abe887183872014b4337a70ff9b020867e2b0550ee76a8e5d692f0b4: Status 404 returned error can't find the container with id db4cfbb0abe887183872014b4337a70ff9b020867e2b0550ee76a8e5d692f0b4 Mar 18 09:19:50 crc kubenswrapper[4917]: I0318 09:19:50.292262 4917 generic.go:334] "Generic (PLEG): container finished" podID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerID="3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8" exitCode=0 Mar 18 09:19:50 crc kubenswrapper[4917]: I0318 09:19:50.292475 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbqc" event={"ID":"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0","Type":"ContainerDied","Data":"3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8"} Mar 18 09:19:50 crc kubenswrapper[4917]: I0318 09:19:50.293910 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbqc" 
event={"ID":"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0","Type":"ContainerStarted","Data":"db4cfbb0abe887183872014b4337a70ff9b020867e2b0550ee76a8e5d692f0b4"} Mar 18 09:19:52 crc kubenswrapper[4917]: I0318 09:19:52.311462 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbqc" event={"ID":"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0","Type":"ContainerStarted","Data":"ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2"} Mar 18 09:19:53 crc kubenswrapper[4917]: I0318 09:19:53.321847 4917 generic.go:334] "Generic (PLEG): container finished" podID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerID="ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2" exitCode=0 Mar 18 09:19:53 crc kubenswrapper[4917]: I0318 09:19:53.321897 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbqc" event={"ID":"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0","Type":"ContainerDied","Data":"ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2"} Mar 18 09:19:54 crc kubenswrapper[4917]: I0318 09:19:54.335915 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbqc" event={"ID":"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0","Type":"ContainerStarted","Data":"985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26"} Mar 18 09:19:54 crc kubenswrapper[4917]: I0318 09:19:54.362589 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bkbqc" podStartSLOduration=2.813942021 podStartE2EDuration="6.362568102s" podCreationTimestamp="2026-03-18 09:19:48 +0000 UTC" firstStartedPulling="2026-03-18 09:19:50.294682501 +0000 UTC m=+9175.235837215" lastFinishedPulling="2026-03-18 09:19:53.843308572 +0000 UTC m=+9178.784463296" observedRunningTime="2026-03-18 09:19:54.361928936 +0000 UTC m=+9179.303083650" watchObservedRunningTime="2026-03-18 09:19:54.362568102 +0000 UTC 
m=+9179.303722816" Mar 18 09:19:54 crc kubenswrapper[4917]: I0318 09:19:54.773398 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:19:54 crc kubenswrapper[4917]: E0318 09:19:54.773837 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:19:58 crc kubenswrapper[4917]: I0318 09:19:58.767087 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:58 crc kubenswrapper[4917]: I0318 09:19:58.767525 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:58 crc kubenswrapper[4917]: I0318 09:19:58.828367 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:59 crc kubenswrapper[4917]: I0318 09:19:59.456294 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:19:59 crc kubenswrapper[4917]: I0318 09:19:59.516357 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbqc"] Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.155350 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563760-pv28h"] Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.156987 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-pv28h" Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.160968 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.161023 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.161132 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.170325 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-pv28h"] Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.256724 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7fpz\" (UniqueName: \"kubernetes.io/projected/7c05360f-a8d4-471a-9b9b-b3a57e4c4d12-kube-api-access-d7fpz\") pod \"auto-csr-approver-29563760-pv28h\" (UID: \"7c05360f-a8d4-471a-9b9b-b3a57e4c4d12\") " pod="openshift-infra/auto-csr-approver-29563760-pv28h" Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.359038 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7fpz\" (UniqueName: \"kubernetes.io/projected/7c05360f-a8d4-471a-9b9b-b3a57e4c4d12-kube-api-access-d7fpz\") pod \"auto-csr-approver-29563760-pv28h\" (UID: \"7c05360f-a8d4-471a-9b9b-b3a57e4c4d12\") " pod="openshift-infra/auto-csr-approver-29563760-pv28h" Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.391409 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7fpz\" (UniqueName: \"kubernetes.io/projected/7c05360f-a8d4-471a-9b9b-b3a57e4c4d12-kube-api-access-d7fpz\") pod \"auto-csr-approver-29563760-pv28h\" (UID: \"7c05360f-a8d4-471a-9b9b-b3a57e4c4d12\") " 
pod="openshift-infra/auto-csr-approver-29563760-pv28h" Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.484155 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-pv28h" Mar 18 09:20:00 crc kubenswrapper[4917]: I0318 09:20:00.977866 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-pv28h"] Mar 18 09:20:01 crc kubenswrapper[4917]: I0318 09:20:01.406615 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-pv28h" event={"ID":"7c05360f-a8d4-471a-9b9b-b3a57e4c4d12","Type":"ContainerStarted","Data":"8ade70c79650f6a8e2863d6c4402b17338cef746da542d29d5c036f9126c3e5a"} Mar 18 09:20:01 crc kubenswrapper[4917]: I0318 09:20:01.406827 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bkbqc" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerName="registry-server" containerID="cri-o://985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26" gracePeriod=2 Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.413218 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.417427 4917 generic.go:334] "Generic (PLEG): container finished" podID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerID="985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26" exitCode=0 Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.417469 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bkbqc" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.417469 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbqc" event={"ID":"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0","Type":"ContainerDied","Data":"985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26"} Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.417632 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bkbqc" event={"ID":"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0","Type":"ContainerDied","Data":"db4cfbb0abe887183872014b4337a70ff9b020867e2b0550ee76a8e5d692f0b4"} Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.417650 4917 scope.go:117] "RemoveContainer" containerID="985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.461409 4917 scope.go:117] "RemoveContainer" containerID="ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.488044 4917 scope.go:117] "RemoveContainer" containerID="3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.505898 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fntmj\" (UniqueName: \"kubernetes.io/projected/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-kube-api-access-fntmj\") pod \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.505947 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-catalog-content\") pod \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " Mar 18 09:20:02 crc 
kubenswrapper[4917]: I0318 09:20:02.506058 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-utilities\") pod \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\" (UID: \"aa3ecceb-632a-46b6-a1b0-9e20dce81ed0\") " Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.507042 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-utilities" (OuterVolumeSpecName: "utilities") pod "aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" (UID: "aa3ecceb-632a-46b6-a1b0-9e20dce81ed0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.512366 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-kube-api-access-fntmj" (OuterVolumeSpecName: "kube-api-access-fntmj") pod "aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" (UID: "aa3ecceb-632a-46b6-a1b0-9e20dce81ed0"). InnerVolumeSpecName "kube-api-access-fntmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.542788 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" (UID: "aa3ecceb-632a-46b6-a1b0-9e20dce81ed0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.583229 4917 scope.go:117] "RemoveContainer" containerID="985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26" Mar 18 09:20:02 crc kubenswrapper[4917]: E0318 09:20:02.583788 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26\": container with ID starting with 985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26 not found: ID does not exist" containerID="985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.583828 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26"} err="failed to get container status \"985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26\": rpc error: code = NotFound desc = could not find container \"985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26\": container with ID starting with 985cb05f47f654206e5cc5c1d9027303166d5475e096602aba92c2294b110a26 not found: ID does not exist" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.583853 4917 scope.go:117] "RemoveContainer" containerID="ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2" Mar 18 09:20:02 crc kubenswrapper[4917]: E0318 09:20:02.584220 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2\": container with ID starting with ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2 not found: ID does not exist" containerID="ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.584268 
4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2"} err="failed to get container status \"ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2\": rpc error: code = NotFound desc = could not find container \"ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2\": container with ID starting with ea963c8673f72b41fae29040f0f0b7ca932bb048b4bed6b9b11318603370e1d2 not found: ID does not exist" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.584298 4917 scope.go:117] "RemoveContainer" containerID="3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8" Mar 18 09:20:02 crc kubenswrapper[4917]: E0318 09:20:02.584899 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8\": container with ID starting with 3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8 not found: ID does not exist" containerID="3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.584940 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8"} err="failed to get container status \"3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8\": rpc error: code = NotFound desc = could not find container \"3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8\": container with ID starting with 3ce4544137afd4bc0328005d3a7580783c7cb0b1875646db2983ceddcf08bfa8 not found: ID does not exist" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.608076 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.608108 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fntmj\" (UniqueName: \"kubernetes.io/projected/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-kube-api-access-fntmj\") on node \"crc\" DevicePath \"\"" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.608118 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.772316 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbqc"] Mar 18 09:20:02 crc kubenswrapper[4917]: I0318 09:20:02.781962 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bkbqc"] Mar 18 09:20:03 crc kubenswrapper[4917]: I0318 09:20:03.449159 4917 generic.go:334] "Generic (PLEG): container finished" podID="7c05360f-a8d4-471a-9b9b-b3a57e4c4d12" containerID="9d8529bcc6697844ee778295a7be014d6fdd18c5d243bbe6d3f4b044b29191fb" exitCode=0 Mar 18 09:20:03 crc kubenswrapper[4917]: I0318 09:20:03.449290 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-pv28h" event={"ID":"7c05360f-a8d4-471a-9b9b-b3a57e4c4d12","Type":"ContainerDied","Data":"9d8529bcc6697844ee778295a7be014d6fdd18c5d243bbe6d3f4b044b29191fb"} Mar 18 09:20:03 crc kubenswrapper[4917]: I0318 09:20:03.783788 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" path="/var/lib/kubelet/pods/aa3ecceb-632a-46b6-a1b0-9e20dce81ed0/volumes" Mar 18 09:20:04 crc kubenswrapper[4917]: I0318 09:20:04.924294 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-pv28h" Mar 18 09:20:05 crc kubenswrapper[4917]: I0318 09:20:05.055338 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7fpz\" (UniqueName: \"kubernetes.io/projected/7c05360f-a8d4-471a-9b9b-b3a57e4c4d12-kube-api-access-d7fpz\") pod \"7c05360f-a8d4-471a-9b9b-b3a57e4c4d12\" (UID: \"7c05360f-a8d4-471a-9b9b-b3a57e4c4d12\") " Mar 18 09:20:05 crc kubenswrapper[4917]: I0318 09:20:05.065349 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c05360f-a8d4-471a-9b9b-b3a57e4c4d12-kube-api-access-d7fpz" (OuterVolumeSpecName: "kube-api-access-d7fpz") pod "7c05360f-a8d4-471a-9b9b-b3a57e4c4d12" (UID: "7c05360f-a8d4-471a-9b9b-b3a57e4c4d12"). InnerVolumeSpecName "kube-api-access-d7fpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:20:05 crc kubenswrapper[4917]: I0318 09:20:05.158361 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7fpz\" (UniqueName: \"kubernetes.io/projected/7c05360f-a8d4-471a-9b9b-b3a57e4c4d12-kube-api-access-d7fpz\") on node \"crc\" DevicePath \"\"" Mar 18 09:20:05 crc kubenswrapper[4917]: I0318 09:20:05.468706 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563760-pv28h" event={"ID":"7c05360f-a8d4-471a-9b9b-b3a57e4c4d12","Type":"ContainerDied","Data":"8ade70c79650f6a8e2863d6c4402b17338cef746da542d29d5c036f9126c3e5a"} Mar 18 09:20:05 crc kubenswrapper[4917]: I0318 09:20:05.468749 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ade70c79650f6a8e2863d6c4402b17338cef746da542d29d5c036f9126c3e5a" Mar 18 09:20:05 crc kubenswrapper[4917]: I0318 09:20:05.468761 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563760-pv28h" Mar 18 09:20:05 crc kubenswrapper[4917]: I0318 09:20:05.992498 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-t4xm2"] Mar 18 09:20:06 crc kubenswrapper[4917]: I0318 09:20:06.001493 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563754-t4xm2"] Mar 18 09:20:07 crc kubenswrapper[4917]: I0318 09:20:07.772813 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:20:07 crc kubenswrapper[4917]: E0318 09:20:07.774452 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:20:07 crc kubenswrapper[4917]: I0318 09:20:07.786483 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9c31f30-5439-4941-9abc-9d944e2fadd8" path="/var/lib/kubelet/pods/a9c31f30-5439-4941-9abc-9d944e2fadd8/volumes" Mar 18 09:20:17 crc kubenswrapper[4917]: I0318 09:20:17.550166 4917 scope.go:117] "RemoveContainer" containerID="14e152afbc8557766eaf26b84669dd851e2d75ff95bf651be33f4f5ea90a6caf" Mar 18 09:20:22 crc kubenswrapper[4917]: I0318 09:20:22.772568 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:20:22 crc kubenswrapper[4917]: E0318 09:20:22.773456 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:20:35 crc kubenswrapper[4917]: I0318 09:20:35.779372 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:20:35 crc kubenswrapper[4917]: E0318 09:20:35.780109 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:20:47 crc kubenswrapper[4917]: I0318 09:20:47.775628 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:20:47 crc kubenswrapper[4917]: E0318 09:20:47.776274 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:20:59 crc kubenswrapper[4917]: I0318 09:20:59.772656 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:20:59 crc kubenswrapper[4917]: E0318 09:20:59.773483 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.643635 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qxzm4"] Mar 18 09:21:02 crc kubenswrapper[4917]: E0318 09:21:02.644595 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c05360f-a8d4-471a-9b9b-b3a57e4c4d12" containerName="oc" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.644606 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c05360f-a8d4-471a-9b9b-b3a57e4c4d12" containerName="oc" Mar 18 09:21:02 crc kubenswrapper[4917]: E0318 09:21:02.644620 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerName="registry-server" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.644626 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerName="registry-server" Mar 18 09:21:02 crc kubenswrapper[4917]: E0318 09:21:02.644650 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerName="extract-utilities" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.644656 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerName="extract-utilities" Mar 18 09:21:02 crc kubenswrapper[4917]: E0318 09:21:02.644676 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerName="extract-content" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.644681 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerName="extract-content" Mar 18 
09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.644874 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa3ecceb-632a-46b6-a1b0-9e20dce81ed0" containerName="registry-server" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.644898 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c05360f-a8d4-471a-9b9b-b3a57e4c4d12" containerName="oc" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.646227 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.666744 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxzm4"] Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.719199 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-catalog-content\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.719299 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phkqs\" (UniqueName: \"kubernetes.io/projected/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-kube-api-access-phkqs\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.719338 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-utilities\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc 
kubenswrapper[4917]: I0318 09:21:02.821722 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-catalog-content\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.821796 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phkqs\" (UniqueName: \"kubernetes.io/projected/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-kube-api-access-phkqs\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.821844 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-utilities\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.822254 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-catalog-content\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.822471 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-utilities\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.843408 4917 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phkqs\" (UniqueName: \"kubernetes.io/projected/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-kube-api-access-phkqs\") pod \"redhat-operators-qxzm4\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:02 crc kubenswrapper[4917]: I0318 09:21:02.983471 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:03 crc kubenswrapper[4917]: I0318 09:21:03.469154 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qxzm4"] Mar 18 09:21:03 crc kubenswrapper[4917]: I0318 09:21:03.760054 4917 generic.go:334] "Generic (PLEG): container finished" podID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerID="76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7" exitCode=0 Mar 18 09:21:03 crc kubenswrapper[4917]: I0318 09:21:03.760115 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzm4" event={"ID":"2eeb4a3a-bb85-45b0-a02d-2b48328307a2","Type":"ContainerDied","Data":"76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7"} Mar 18 09:21:03 crc kubenswrapper[4917]: I0318 09:21:03.760162 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzm4" event={"ID":"2eeb4a3a-bb85-45b0-a02d-2b48328307a2","Type":"ContainerStarted","Data":"f8ec1180dd097644db91eab48d33271ed065efce904b2aa8ea1338161a764724"} Mar 18 09:21:03 crc kubenswrapper[4917]: I0318 09:21:03.763799 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:21:05 crc kubenswrapper[4917]: I0318 09:21:05.784326 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzm4" 
event={"ID":"2eeb4a3a-bb85-45b0-a02d-2b48328307a2","Type":"ContainerStarted","Data":"372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a"} Mar 18 09:21:09 crc kubenswrapper[4917]: I0318 09:21:09.833735 4917 generic.go:334] "Generic (PLEG): container finished" podID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerID="372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a" exitCode=0 Mar 18 09:21:09 crc kubenswrapper[4917]: I0318 09:21:09.833821 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzm4" event={"ID":"2eeb4a3a-bb85-45b0-a02d-2b48328307a2","Type":"ContainerDied","Data":"372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a"} Mar 18 09:21:11 crc kubenswrapper[4917]: I0318 09:21:11.773098 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:21:11 crc kubenswrapper[4917]: E0318 09:21:11.773961 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:21:11 crc kubenswrapper[4917]: I0318 09:21:11.861339 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzm4" event={"ID":"2eeb4a3a-bb85-45b0-a02d-2b48328307a2","Type":"ContainerStarted","Data":"dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528"} Mar 18 09:21:11 crc kubenswrapper[4917]: I0318 09:21:11.892785 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qxzm4" podStartSLOduration=3.126817155 podStartE2EDuration="9.892768906s" 
podCreationTimestamp="2026-03-18 09:21:02 +0000 UTC" firstStartedPulling="2026-03-18 09:21:03.763238963 +0000 UTC m=+9248.704393687" lastFinishedPulling="2026-03-18 09:21:10.529190694 +0000 UTC m=+9255.470345438" observedRunningTime="2026-03-18 09:21:11.887313733 +0000 UTC m=+9256.828468477" watchObservedRunningTime="2026-03-18 09:21:11.892768906 +0000 UTC m=+9256.833923620" Mar 18 09:21:12 crc kubenswrapper[4917]: I0318 09:21:12.990984 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:12 crc kubenswrapper[4917]: I0318 09:21:12.991447 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:14 crc kubenswrapper[4917]: I0318 09:21:14.485813 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qxzm4" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="registry-server" probeResult="failure" output=< Mar 18 09:21:14 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 09:21:14 crc kubenswrapper[4917]: > Mar 18 09:21:23 crc kubenswrapper[4917]: I0318 09:21:23.037004 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:23 crc kubenswrapper[4917]: I0318 09:21:23.115912 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:23 crc kubenswrapper[4917]: I0318 09:21:23.291575 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxzm4"] Mar 18 09:21:23 crc kubenswrapper[4917]: I0318 09:21:23.778061 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:21:23 crc kubenswrapper[4917]: E0318 09:21:23.778269 4917 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.009052 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qxzm4" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="registry-server" containerID="cri-o://dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528" gracePeriod=2 Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.637712 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.810149 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phkqs\" (UniqueName: \"kubernetes.io/projected/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-kube-api-access-phkqs\") pod \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.810269 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-utilities\") pod \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\" (UID: \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.810386 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-catalog-content\") pod \"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\" (UID: 
\"2eeb4a3a-bb85-45b0-a02d-2b48328307a2\") " Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.811848 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-utilities" (OuterVolumeSpecName: "utilities") pod "2eeb4a3a-bb85-45b0-a02d-2b48328307a2" (UID: "2eeb4a3a-bb85-45b0-a02d-2b48328307a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.815231 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-kube-api-access-phkqs" (OuterVolumeSpecName: "kube-api-access-phkqs") pod "2eeb4a3a-bb85-45b0-a02d-2b48328307a2" (UID: "2eeb4a3a-bb85-45b0-a02d-2b48328307a2"). InnerVolumeSpecName "kube-api-access-phkqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.913035 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phkqs\" (UniqueName: \"kubernetes.io/projected/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-kube-api-access-phkqs\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.913071 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:25 crc kubenswrapper[4917]: I0318 09:21:25.961263 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2eeb4a3a-bb85-45b0-a02d-2b48328307a2" (UID: "2eeb4a3a-bb85-45b0-a02d-2b48328307a2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.014571 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eeb4a3a-bb85-45b0-a02d-2b48328307a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.021331 4917 generic.go:334] "Generic (PLEG): container finished" podID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerID="dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528" exitCode=0 Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.021373 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzm4" event={"ID":"2eeb4a3a-bb85-45b0-a02d-2b48328307a2","Type":"ContainerDied","Data":"dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528"} Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.021400 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qxzm4" event={"ID":"2eeb4a3a-bb85-45b0-a02d-2b48328307a2","Type":"ContainerDied","Data":"f8ec1180dd097644db91eab48d33271ed065efce904b2aa8ea1338161a764724"} Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.021418 4917 scope.go:117] "RemoveContainer" containerID="dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.021458 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qxzm4" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.053798 4917 scope.go:117] "RemoveContainer" containerID="372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.068176 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qxzm4"] Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.077763 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qxzm4"] Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.093358 4917 scope.go:117] "RemoveContainer" containerID="76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.150710 4917 scope.go:117] "RemoveContainer" containerID="dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528" Mar 18 09:21:26 crc kubenswrapper[4917]: E0318 09:21:26.158747 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528\": container with ID starting with dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528 not found: ID does not exist" containerID="dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.158801 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528"} err="failed to get container status \"dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528\": rpc error: code = NotFound desc = could not find container \"dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528\": container with ID starting with dcbfdd8e900b5ecef8b33f07f74475802371edcf2660949d6d7bf1eb506f9528 not found: ID does 
not exist" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.158836 4917 scope.go:117] "RemoveContainer" containerID="372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a" Mar 18 09:21:26 crc kubenswrapper[4917]: E0318 09:21:26.159428 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a\": container with ID starting with 372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a not found: ID does not exist" containerID="372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.159459 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a"} err="failed to get container status \"372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a\": rpc error: code = NotFound desc = could not find container \"372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a\": container with ID starting with 372c004681fb5d0ca5b53b86af12374a67778b79d2a4ff95e7c37b756b0f4f5a not found: ID does not exist" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.159478 4917 scope.go:117] "RemoveContainer" containerID="76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7" Mar 18 09:21:26 crc kubenswrapper[4917]: E0318 09:21:26.161131 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7\": container with ID starting with 76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7 not found: ID does not exist" containerID="76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7" Mar 18 09:21:26 crc kubenswrapper[4917]: I0318 09:21:26.161161 4917 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7"} err="failed to get container status \"76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7\": rpc error: code = NotFound desc = could not find container \"76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7\": container with ID starting with 76f94b27992a866132b9115d1110b06c9f79acfb9ab0849e6a9ddc4cb78393c7 not found: ID does not exist" Mar 18 09:21:27 crc kubenswrapper[4917]: I0318 09:21:27.786367 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" path="/var/lib/kubelet/pods/2eeb4a3a-bb85-45b0-a02d-2b48328307a2/volumes" Mar 18 09:21:36 crc kubenswrapper[4917]: I0318 09:21:36.773014 4917 scope.go:117] "RemoveContainer" containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:21:37 crc kubenswrapper[4917]: I0318 09:21:37.144068 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"631374ea93931dfc55d694e6dd1da33a32d0d39afef7672b4338c0a7786c053e"} Mar 18 09:21:44 crc kubenswrapper[4917]: I0318 09:21:44.245072 4917 generic.go:334] "Generic (PLEG): container finished" podID="aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" containerID="6ff4d6c9fdd3c7ec1595a74efadc6a323f453b56577634b997b205bc4e78fbc8" exitCode=0 Mar 18 09:21:44 crc kubenswrapper[4917]: I0318 09:21:44.245190 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588","Type":"ContainerDied","Data":"6ff4d6c9fdd3c7ec1595a74efadc6a323f453b56577634b997b205bc4e78fbc8"} Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.678762 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.809683 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ssh-key\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.809885 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-workdir\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.809948 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.810050 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config-secret\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.810198 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ca-certs\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.810245 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-temporary\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.810279 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whpfg\" (UniqueName: \"kubernetes.io/projected/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-kube-api-access-whpfg\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.810374 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-config-data\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.810505 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config\") pod \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\" (UID: \"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588\") " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.814309 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-config-data" (OuterVolumeSpecName: "config-data") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.814374 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.817751 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.832911 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "test-operator-logs") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.833232 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-kube-api-access-whpfg" (OuterVolumeSpecName: "kube-api-access-whpfg") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "kube-api-access-whpfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.849110 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.850004 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.858139 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.885142 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" (UID: "aea4965f-d2fe-4941-8ff8-1cf5cf9cd588"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.914021 4917 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.914070 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.914085 4917 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.914098 4917 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.914111 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whpfg\" (UniqueName: \"kubernetes.io/projected/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-kube-api-access-whpfg\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.914124 4917 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-config-data\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.914135 4917 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:45 crc 
kubenswrapper[4917]: I0318 09:21:45.914146 4917 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:45 crc kubenswrapper[4917]: I0318 09:21:45.914156 4917 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/aea4965f-d2fe-4941-8ff8-1cf5cf9cd588-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:46 crc kubenswrapper[4917]: I0318 09:21:46.273929 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"aea4965f-d2fe-4941-8ff8-1cf5cf9cd588","Type":"ContainerDied","Data":"b45a3928562be138f9113f4f3226fc4c82f89042b89852071cb0893df0740988"} Mar 18 09:21:46 crc kubenswrapper[4917]: I0318 09:21:46.273997 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b45a3928562be138f9113f4f3226fc4c82f89042b89852071cb0893df0740988" Mar 18 09:21:46 crc kubenswrapper[4917]: I0318 09:21:46.274017 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 18 09:21:46 crc kubenswrapper[4917]: I0318 09:21:46.663304 4917 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 18 09:21:46 crc kubenswrapper[4917]: I0318 09:21:46.731917 4917 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:48.999859 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 09:21:49 crc kubenswrapper[4917]: E0318 09:21:49.001412 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="registry-server" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.001441 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="registry-server" Mar 18 09:21:49 crc kubenswrapper[4917]: E0318 09:21:49.001482 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="extract-content" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.001495 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="extract-content" Mar 18 09:21:49 crc kubenswrapper[4917]: E0318 09:21:49.001529 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="extract-utilities" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.001545 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="extract-utilities" Mar 18 09:21:49 crc kubenswrapper[4917]: E0318 09:21:49.001624 4917 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" containerName="tempest-tests-tempest-tests-runner" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.001649 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" containerName="tempest-tests-tempest-tests-runner" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.002087 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eeb4a3a-bb85-45b0-a02d-2b48328307a2" containerName="registry-server" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.002128 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea4965f-d2fe-4941-8ff8-1cf5cf9cd588" containerName="tempest-tests-tempest-tests-runner" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.003512 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.008550 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-c568t" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.021258 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.084372 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxrkd\" (UniqueName: \"kubernetes.io/projected/3c577fff-d7f1-43e9-a6c9-1dd565be5f74-kube-api-access-jxrkd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c577fff-d7f1-43e9-a6c9-1dd565be5f74\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.084685 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c577fff-d7f1-43e9-a6c9-1dd565be5f74\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.186550 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c577fff-d7f1-43e9-a6c9-1dd565be5f74\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.187015 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxrkd\" (UniqueName: \"kubernetes.io/projected/3c577fff-d7f1-43e9-a6c9-1dd565be5f74-kube-api-access-jxrkd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c577fff-d7f1-43e9-a6c9-1dd565be5f74\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.187598 4917 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c577fff-d7f1-43e9-a6c9-1dd565be5f74\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.215415 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxrkd\" (UniqueName: \"kubernetes.io/projected/3c577fff-d7f1-43e9-a6c9-1dd565be5f74-kube-api-access-jxrkd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c577fff-d7f1-43e9-a6c9-1dd565be5f74\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.226420 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"3c577fff-d7f1-43e9-a6c9-1dd565be5f74\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.349986 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 18 09:21:49 crc kubenswrapper[4917]: I0318 09:21:49.846817 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 18 09:21:49 crc kubenswrapper[4917]: W0318 09:21:49.855127 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c577fff_d7f1_43e9_a6c9_1dd565be5f74.slice/crio-69f68f4d5720b68826dd0d85b5faa793bdbb09e2fdef9f002930e7ec7f1a9ab7 WatchSource:0}: Error finding container 69f68f4d5720b68826dd0d85b5faa793bdbb09e2fdef9f002930e7ec7f1a9ab7: Status 404 returned error can't find the container with id 69f68f4d5720b68826dd0d85b5faa793bdbb09e2fdef9f002930e7ec7f1a9ab7 Mar 18 09:21:50 crc kubenswrapper[4917]: I0318 09:21:50.315940 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"3c577fff-d7f1-43e9-a6c9-1dd565be5f74","Type":"ContainerStarted","Data":"69f68f4d5720b68826dd0d85b5faa793bdbb09e2fdef9f002930e7ec7f1a9ab7"} Mar 18 09:21:51 crc kubenswrapper[4917]: I0318 09:21:51.328189 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" 
event={"ID":"3c577fff-d7f1-43e9-a6c9-1dd565be5f74","Type":"ContainerStarted","Data":"44e43a1bba5260927d1833dbf0b08dca8034c0fcc4639324ae3d44eedfd1d718"} Mar 18 09:21:51 crc kubenswrapper[4917]: I0318 09:21:51.355150 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.21198271 podStartE2EDuration="3.355128578s" podCreationTimestamp="2026-03-18 09:21:48 +0000 UTC" firstStartedPulling="2026-03-18 09:21:49.859236768 +0000 UTC m=+9294.800391492" lastFinishedPulling="2026-03-18 09:21:51.002382646 +0000 UTC m=+9295.943537360" observedRunningTime="2026-03-18 09:21:51.3420206 +0000 UTC m=+9296.283175324" watchObservedRunningTime="2026-03-18 09:21:51.355128578 +0000 UTC m=+9296.296283302" Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.160066 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563762-vv2kl"] Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.162366 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.164863 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.164994 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.165124 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.170983 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-vv2kl"] Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.326885 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfnjp\" (UniqueName: \"kubernetes.io/projected/cf12c2b3-f4a3-4a79-8859-33391ef42687-kube-api-access-nfnjp\") pod \"auto-csr-approver-29563762-vv2kl\" (UID: \"cf12c2b3-f4a3-4a79-8859-33391ef42687\") " pod="openshift-infra/auto-csr-approver-29563762-vv2kl" Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.429433 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfnjp\" (UniqueName: \"kubernetes.io/projected/cf12c2b3-f4a3-4a79-8859-33391ef42687-kube-api-access-nfnjp\") pod \"auto-csr-approver-29563762-vv2kl\" (UID: \"cf12c2b3-f4a3-4a79-8859-33391ef42687\") " pod="openshift-infra/auto-csr-approver-29563762-vv2kl" Mar 18 09:22:00 crc kubenswrapper[4917]: I0318 09:22:00.937176 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfnjp\" (UniqueName: \"kubernetes.io/projected/cf12c2b3-f4a3-4a79-8859-33391ef42687-kube-api-access-nfnjp\") pod \"auto-csr-approver-29563762-vv2kl\" (UID: \"cf12c2b3-f4a3-4a79-8859-33391ef42687\") " 
pod="openshift-infra/auto-csr-approver-29563762-vv2kl" Mar 18 09:22:01 crc kubenswrapper[4917]: I0318 09:22:01.081060 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" Mar 18 09:22:01 crc kubenswrapper[4917]: I0318 09:22:01.548719 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-vv2kl"] Mar 18 09:22:01 crc kubenswrapper[4917]: W0318 09:22:01.548998 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf12c2b3_f4a3_4a79_8859_33391ef42687.slice/crio-cb80a445d80451d0c4da2da9a418742619e111ba647d384e78533ea29d46a566 WatchSource:0}: Error finding container cb80a445d80451d0c4da2da9a418742619e111ba647d384e78533ea29d46a566: Status 404 returned error can't find the container with id cb80a445d80451d0c4da2da9a418742619e111ba647d384e78533ea29d46a566 Mar 18 09:22:02 crc kubenswrapper[4917]: I0318 09:22:02.515272 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" event={"ID":"cf12c2b3-f4a3-4a79-8859-33391ef42687","Type":"ContainerStarted","Data":"cb80a445d80451d0c4da2da9a418742619e111ba647d384e78533ea29d46a566"} Mar 18 09:22:03 crc kubenswrapper[4917]: I0318 09:22:03.540974 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" event={"ID":"cf12c2b3-f4a3-4a79-8859-33391ef42687","Type":"ContainerStarted","Data":"abc2c3bb7af6cc6854439c4c1bf491eea93dc53531bd9c1ede42d263472373ae"} Mar 18 09:22:03 crc kubenswrapper[4917]: I0318 09:22:03.563829 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" podStartSLOduration=2.0054351 podStartE2EDuration="3.563809956s" podCreationTimestamp="2026-03-18 09:22:00 +0000 UTC" firstStartedPulling="2026-03-18 09:22:01.553327518 +0000 UTC 
m=+9306.494482242" lastFinishedPulling="2026-03-18 09:22:03.111702364 +0000 UTC m=+9308.052857098" observedRunningTime="2026-03-18 09:22:03.555945046 +0000 UTC m=+9308.497099770" watchObservedRunningTime="2026-03-18 09:22:03.563809956 +0000 UTC m=+9308.504964660" Mar 18 09:22:04 crc kubenswrapper[4917]: I0318 09:22:04.558511 4917 generic.go:334] "Generic (PLEG): container finished" podID="cf12c2b3-f4a3-4a79-8859-33391ef42687" containerID="abc2c3bb7af6cc6854439c4c1bf491eea93dc53531bd9c1ede42d263472373ae" exitCode=0 Mar 18 09:22:04 crc kubenswrapper[4917]: I0318 09:22:04.558633 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" event={"ID":"cf12c2b3-f4a3-4a79-8859-33391ef42687","Type":"ContainerDied","Data":"abc2c3bb7af6cc6854439c4c1bf491eea93dc53531bd9c1ede42d263472373ae"} Mar 18 09:22:05 crc kubenswrapper[4917]: I0318 09:22:05.984318 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" Mar 18 09:22:06 crc kubenswrapper[4917]: I0318 09:22:06.076705 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfnjp\" (UniqueName: \"kubernetes.io/projected/cf12c2b3-f4a3-4a79-8859-33391ef42687-kube-api-access-nfnjp\") pod \"cf12c2b3-f4a3-4a79-8859-33391ef42687\" (UID: \"cf12c2b3-f4a3-4a79-8859-33391ef42687\") " Mar 18 09:22:06 crc kubenswrapper[4917]: I0318 09:22:06.086219 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf12c2b3-f4a3-4a79-8859-33391ef42687-kube-api-access-nfnjp" (OuterVolumeSpecName: "kube-api-access-nfnjp") pod "cf12c2b3-f4a3-4a79-8859-33391ef42687" (UID: "cf12c2b3-f4a3-4a79-8859-33391ef42687"). InnerVolumeSpecName "kube-api-access-nfnjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:22:06 crc kubenswrapper[4917]: I0318 09:22:06.179453 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfnjp\" (UniqueName: \"kubernetes.io/projected/cf12c2b3-f4a3-4a79-8859-33391ef42687-kube-api-access-nfnjp\") on node \"crc\" DevicePath \"\"" Mar 18 09:22:06 crc kubenswrapper[4917]: I0318 09:22:06.584764 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" event={"ID":"cf12c2b3-f4a3-4a79-8859-33391ef42687","Type":"ContainerDied","Data":"cb80a445d80451d0c4da2da9a418742619e111ba647d384e78533ea29d46a566"} Mar 18 09:22:06 crc kubenswrapper[4917]: I0318 09:22:06.585149 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb80a445d80451d0c4da2da9a418742619e111ba647d384e78533ea29d46a566" Mar 18 09:22:06 crc kubenswrapper[4917]: I0318 09:22:06.585245 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563762-vv2kl" Mar 18 09:22:06 crc kubenswrapper[4917]: I0318 09:22:06.659712 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-9l56h"] Mar 18 09:22:06 crc kubenswrapper[4917]: I0318 09:22:06.672069 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563756-9l56h"] Mar 18 09:22:07 crc kubenswrapper[4917]: I0318 09:22:07.784712 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="224ca311-56b5-49d6-9363-83b4671e465a" path="/var/lib/kubelet/pods/224ca311-56b5-49d6-9363-83b4671e465a/volumes" Mar 18 09:22:17 crc kubenswrapper[4917]: I0318 09:22:17.707181 4917 scope.go:117] "RemoveContainer" containerID="6deceed5dc238cfac13b076513f9a9d1df47d2f87fd6dbc4c67a22144eef1e7f" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.679199 4917 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-9fplc/must-gather-2bgws"] Mar 18 09:23:00 crc kubenswrapper[4917]: E0318 09:23:00.680136 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf12c2b3-f4a3-4a79-8859-33391ef42687" containerName="oc" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.680148 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf12c2b3-f4a3-4a79-8859-33391ef42687" containerName="oc" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.680352 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf12c2b3-f4a3-4a79-8859-33391ef42687" containerName="oc" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.681345 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.688170 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9fplc"/"openshift-service-ca.crt" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.688410 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9fplc"/"default-dockercfg-wpdc5" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.688434 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9fplc"/"kube-root-ca.crt" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.705311 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fplc/must-gather-2bgws"] Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.857205 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmz8l\" (UniqueName: \"kubernetes.io/projected/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-kube-api-access-tmz8l\") pod \"must-gather-2bgws\" (UID: \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\") " pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:23:00 crc kubenswrapper[4917]: 
I0318 09:23:00.857318 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-must-gather-output\") pod \"must-gather-2bgws\" (UID: \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\") " pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.958992 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmz8l\" (UniqueName: \"kubernetes.io/projected/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-kube-api-access-tmz8l\") pod \"must-gather-2bgws\" (UID: \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\") " pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.959134 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-must-gather-output\") pod \"must-gather-2bgws\" (UID: \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\") " pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.961340 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-must-gather-output\") pod \"must-gather-2bgws\" (UID: \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\") " pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:23:00 crc kubenswrapper[4917]: I0318 09:23:00.983118 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmz8l\" (UniqueName: \"kubernetes.io/projected/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-kube-api-access-tmz8l\") pod \"must-gather-2bgws\" (UID: \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\") " pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:23:01 crc kubenswrapper[4917]: I0318 
09:23:01.006822 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:23:01 crc kubenswrapper[4917]: I0318 09:23:01.477811 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fplc/must-gather-2bgws"] Mar 18 09:23:02 crc kubenswrapper[4917]: I0318 09:23:02.242689 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/must-gather-2bgws" event={"ID":"f12266ce-503e-4e3d-bbe4-c9e365a82fa3","Type":"ContainerStarted","Data":"bc7f9c80aa16bd37a2962ea8bdcf2732128f3ba0db818108ae7340fb453bff02"} Mar 18 09:23:09 crc kubenswrapper[4917]: I0318 09:23:09.356811 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/must-gather-2bgws" event={"ID":"f12266ce-503e-4e3d-bbe4-c9e365a82fa3","Type":"ContainerStarted","Data":"8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a"} Mar 18 09:23:09 crc kubenswrapper[4917]: I0318 09:23:09.357659 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/must-gather-2bgws" event={"ID":"f12266ce-503e-4e3d-bbe4-c9e365a82fa3","Type":"ContainerStarted","Data":"db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84"} Mar 18 09:23:09 crc kubenswrapper[4917]: I0318 09:23:09.390816 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9fplc/must-gather-2bgws" podStartSLOduration=2.359149546 podStartE2EDuration="9.390794418s" podCreationTimestamp="2026-03-18 09:23:00 +0000 UTC" firstStartedPulling="2026-03-18 09:23:01.481331632 +0000 UTC m=+9366.422486346" lastFinishedPulling="2026-03-18 09:23:08.512976504 +0000 UTC m=+9373.454131218" observedRunningTime="2026-03-18 09:23:09.383759768 +0000 UTC m=+9374.324914482" watchObservedRunningTime="2026-03-18 09:23:09.390794418 +0000 UTC m=+9374.331949142" Mar 18 09:23:14 crc kubenswrapper[4917]: I0318 09:23:14.554381 4917 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-must-gather-9fplc/crc-debug-rddmz"] Mar 18 09:23:14 crc kubenswrapper[4917]: I0318 09:23:14.556186 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:23:14 crc kubenswrapper[4917]: I0318 09:23:14.664638 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cmlz\" (UniqueName: \"kubernetes.io/projected/f2b2bba2-8c50-47a4-bb71-162e34cb9132-kube-api-access-2cmlz\") pod \"crc-debug-rddmz\" (UID: \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\") " pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:23:14 crc kubenswrapper[4917]: I0318 09:23:14.664745 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2b2bba2-8c50-47a4-bb71-162e34cb9132-host\") pod \"crc-debug-rddmz\" (UID: \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\") " pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:23:14 crc kubenswrapper[4917]: I0318 09:23:14.766281 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cmlz\" (UniqueName: \"kubernetes.io/projected/f2b2bba2-8c50-47a4-bb71-162e34cb9132-kube-api-access-2cmlz\") pod \"crc-debug-rddmz\" (UID: \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\") " pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:23:14 crc kubenswrapper[4917]: I0318 09:23:14.766399 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2b2bba2-8c50-47a4-bb71-162e34cb9132-host\") pod \"crc-debug-rddmz\" (UID: \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\") " pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:23:14 crc kubenswrapper[4917]: I0318 09:23:14.766677 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/f2b2bba2-8c50-47a4-bb71-162e34cb9132-host\") pod \"crc-debug-rddmz\" (UID: \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\") " pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:23:15 crc kubenswrapper[4917]: I0318 09:23:15.043255 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cmlz\" (UniqueName: \"kubernetes.io/projected/f2b2bba2-8c50-47a4-bb71-162e34cb9132-kube-api-access-2cmlz\") pod \"crc-debug-rddmz\" (UID: \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\") " pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:23:15 crc kubenswrapper[4917]: I0318 09:23:15.173102 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:23:15 crc kubenswrapper[4917]: W0318 09:23:15.220919 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b2bba2_8c50_47a4_bb71_162e34cb9132.slice/crio-263adf33070cc693fddbcd3a1bded77c820c2dcb5b0ecc716e554e134dcbf2f7 WatchSource:0}: Error finding container 263adf33070cc693fddbcd3a1bded77c820c2dcb5b0ecc716e554e134dcbf2f7: Status 404 returned error can't find the container with id 263adf33070cc693fddbcd3a1bded77c820c2dcb5b0ecc716e554e134dcbf2f7 Mar 18 09:23:15 crc kubenswrapper[4917]: I0318 09:23:15.423128 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/crc-debug-rddmz" event={"ID":"f2b2bba2-8c50-47a4-bb71-162e34cb9132","Type":"ContainerStarted","Data":"263adf33070cc693fddbcd3a1bded77c820c2dcb5b0ecc716e554e134dcbf2f7"} Mar 18 09:23:26 crc kubenswrapper[4917]: I0318 09:23:26.558581 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/crc-debug-rddmz" event={"ID":"f2b2bba2-8c50-47a4-bb71-162e34cb9132","Type":"ContainerStarted","Data":"1236ed334d7d813ed7b2cbd0886ede949623d8214f4db6caa598073a013ea7cf"} Mar 18 09:23:26 crc kubenswrapper[4917]: I0318 
09:23:26.578816 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9fplc/crc-debug-rddmz" podStartSLOduration=1.49895302 podStartE2EDuration="12.578796838s" podCreationTimestamp="2026-03-18 09:23:14 +0000 UTC" firstStartedPulling="2026-03-18 09:23:15.227043278 +0000 UTC m=+9380.168198002" lastFinishedPulling="2026-03-18 09:23:26.306887106 +0000 UTC m=+9391.248041820" observedRunningTime="2026-03-18 09:23:26.573040839 +0000 UTC m=+9391.514195553" watchObservedRunningTime="2026-03-18 09:23:26.578796838 +0000 UTC m=+9391.519951552" Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.150293 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563764-lpjt2"] Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.152453 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-lpjt2" Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.155094 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.155316 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.155383 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.161866 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-lpjt2"] Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.280699 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmk7s\" (UniqueName: \"kubernetes.io/projected/a51b0918-fa96-4784-b6dc-be09150be166-kube-api-access-nmk7s\") pod \"auto-csr-approver-29563764-lpjt2\" (UID: 
\"a51b0918-fa96-4784-b6dc-be09150be166\") " pod="openshift-infra/auto-csr-approver-29563764-lpjt2" Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.393009 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmk7s\" (UniqueName: \"kubernetes.io/projected/a51b0918-fa96-4784-b6dc-be09150be166-kube-api-access-nmk7s\") pod \"auto-csr-approver-29563764-lpjt2\" (UID: \"a51b0918-fa96-4784-b6dc-be09150be166\") " pod="openshift-infra/auto-csr-approver-29563764-lpjt2" Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.417118 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmk7s\" (UniqueName: \"kubernetes.io/projected/a51b0918-fa96-4784-b6dc-be09150be166-kube-api-access-nmk7s\") pod \"auto-csr-approver-29563764-lpjt2\" (UID: \"a51b0918-fa96-4784-b6dc-be09150be166\") " pod="openshift-infra/auto-csr-approver-29563764-lpjt2" Mar 18 09:24:00 crc kubenswrapper[4917]: I0318 09:24:00.472127 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-lpjt2" Mar 18 09:24:01 crc kubenswrapper[4917]: I0318 09:24:01.045807 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-lpjt2"] Mar 18 09:24:01 crc kubenswrapper[4917]: I0318 09:24:01.891358 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-lpjt2" event={"ID":"a51b0918-fa96-4784-b6dc-be09150be166","Type":"ContainerStarted","Data":"b6b54b36d8e0ebe8b5756a9084f300939be2d9efb60b0df17b26204df9d629b3"} Mar 18 09:24:02 crc kubenswrapper[4917]: I0318 09:24:02.929053 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:24:02 crc kubenswrapper[4917]: I0318 09:24:02.929401 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:24:03 crc kubenswrapper[4917]: I0318 09:24:03.916454 4917 generic.go:334] "Generic (PLEG): container finished" podID="a51b0918-fa96-4784-b6dc-be09150be166" containerID="94ce77aef30befa2bb5b0cc4e4f16eac37ffdd7e3d1243bbc59fab16cb64b1bc" exitCode=0 Mar 18 09:24:03 crc kubenswrapper[4917]: I0318 09:24:03.916804 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-lpjt2" event={"ID":"a51b0918-fa96-4784-b6dc-be09150be166","Type":"ContainerDied","Data":"94ce77aef30befa2bb5b0cc4e4f16eac37ffdd7e3d1243bbc59fab16cb64b1bc"} Mar 18 09:24:05 crc kubenswrapper[4917]: I0318 09:24:05.273808 4917 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-lpjt2" Mar 18 09:24:05 crc kubenswrapper[4917]: I0318 09:24:05.394754 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmk7s\" (UniqueName: \"kubernetes.io/projected/a51b0918-fa96-4784-b6dc-be09150be166-kube-api-access-nmk7s\") pod \"a51b0918-fa96-4784-b6dc-be09150be166\" (UID: \"a51b0918-fa96-4784-b6dc-be09150be166\") " Mar 18 09:24:05 crc kubenswrapper[4917]: I0318 09:24:05.400605 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a51b0918-fa96-4784-b6dc-be09150be166-kube-api-access-nmk7s" (OuterVolumeSpecName: "kube-api-access-nmk7s") pod "a51b0918-fa96-4784-b6dc-be09150be166" (UID: "a51b0918-fa96-4784-b6dc-be09150be166"). InnerVolumeSpecName "kube-api-access-nmk7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:05 crc kubenswrapper[4917]: I0318 09:24:05.497100 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmk7s\" (UniqueName: \"kubernetes.io/projected/a51b0918-fa96-4784-b6dc-be09150be166-kube-api-access-nmk7s\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:05 crc kubenswrapper[4917]: I0318 09:24:05.934522 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563764-lpjt2" event={"ID":"a51b0918-fa96-4784-b6dc-be09150be166","Type":"ContainerDied","Data":"b6b54b36d8e0ebe8b5756a9084f300939be2d9efb60b0df17b26204df9d629b3"} Mar 18 09:24:05 crc kubenswrapper[4917]: I0318 09:24:05.934831 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6b54b36d8e0ebe8b5756a9084f300939be2d9efb60b0df17b26204df9d629b3" Mar 18 09:24:05 crc kubenswrapper[4917]: I0318 09:24:05.934633 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563764-lpjt2" Mar 18 09:24:06 crc kubenswrapper[4917]: I0318 09:24:06.356744 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-q56hs"] Mar 18 09:24:06 crc kubenswrapper[4917]: I0318 09:24:06.365537 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563758-q56hs"] Mar 18 09:24:07 crc kubenswrapper[4917]: I0318 09:24:07.782392 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a4852f-8b12-447d-93b6-c325d1c5109e" path="/var/lib/kubelet/pods/d1a4852f-8b12-447d-93b6-c325d1c5109e/volumes" Mar 18 09:24:14 crc kubenswrapper[4917]: I0318 09:24:14.019924 4917 generic.go:334] "Generic (PLEG): container finished" podID="f2b2bba2-8c50-47a4-bb71-162e34cb9132" containerID="1236ed334d7d813ed7b2cbd0886ede949623d8214f4db6caa598073a013ea7cf" exitCode=0 Mar 18 09:24:14 crc kubenswrapper[4917]: I0318 09:24:14.020022 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/crc-debug-rddmz" event={"ID":"f2b2bba2-8c50-47a4-bb71-162e34cb9132","Type":"ContainerDied","Data":"1236ed334d7d813ed7b2cbd0886ede949623d8214f4db6caa598073a013ea7cf"} Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.170365 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.204200 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9fplc/crc-debug-rddmz"] Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.213857 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9fplc/crc-debug-rddmz"] Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.288335 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2b2bba2-8c50-47a4-bb71-162e34cb9132-host" (OuterVolumeSpecName: "host") pod "f2b2bba2-8c50-47a4-bb71-162e34cb9132" (UID: "f2b2bba2-8c50-47a4-bb71-162e34cb9132"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.288949 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2b2bba2-8c50-47a4-bb71-162e34cb9132-host\") pod \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\" (UID: \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\") " Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.289073 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cmlz\" (UniqueName: \"kubernetes.io/projected/f2b2bba2-8c50-47a4-bb71-162e34cb9132-kube-api-access-2cmlz\") pod \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\" (UID: \"f2b2bba2-8c50-47a4-bb71-162e34cb9132\") " Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.289408 4917 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2b2bba2-8c50-47a4-bb71-162e34cb9132-host\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.304233 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b2bba2-8c50-47a4-bb71-162e34cb9132-kube-api-access-2cmlz" 
(OuterVolumeSpecName: "kube-api-access-2cmlz") pod "f2b2bba2-8c50-47a4-bb71-162e34cb9132" (UID: "f2b2bba2-8c50-47a4-bb71-162e34cb9132"). InnerVolumeSpecName "kube-api-access-2cmlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.392499 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cmlz\" (UniqueName: \"kubernetes.io/projected/f2b2bba2-8c50-47a4-bb71-162e34cb9132-kube-api-access-2cmlz\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:15 crc kubenswrapper[4917]: I0318 09:24:15.789168 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b2bba2-8c50-47a4-bb71-162e34cb9132" path="/var/lib/kubelet/pods/f2b2bba2-8c50-47a4-bb71-162e34cb9132/volumes" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.059802 4917 scope.go:117] "RemoveContainer" containerID="1236ed334d7d813ed7b2cbd0886ede949623d8214f4db6caa598073a013ea7cf" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.059970 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-rddmz" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.417138 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fplc/crc-debug-vw5j8"] Mar 18 09:24:16 crc kubenswrapper[4917]: E0318 09:24:16.418768 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b2bba2-8c50-47a4-bb71-162e34cb9132" containerName="container-00" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.418856 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b2bba2-8c50-47a4-bb71-162e34cb9132" containerName="container-00" Mar 18 09:24:16 crc kubenswrapper[4917]: E0318 09:24:16.418920 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a51b0918-fa96-4784-b6dc-be09150be166" containerName="oc" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.418975 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="a51b0918-fa96-4784-b6dc-be09150be166" containerName="oc" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.419220 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b2bba2-8c50-47a4-bb71-162e34cb9132" containerName="container-00" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.419302 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="a51b0918-fa96-4784-b6dc-be09150be166" containerName="oc" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.420063 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.519358 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adf00d6f-d576-42dc-a10c-91ab7d84eddd-host\") pod \"crc-debug-vw5j8\" (UID: \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\") " pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.519672 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpfq8\" (UniqueName: \"kubernetes.io/projected/adf00d6f-d576-42dc-a10c-91ab7d84eddd-kube-api-access-mpfq8\") pod \"crc-debug-vw5j8\" (UID: \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\") " pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.621374 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adf00d6f-d576-42dc-a10c-91ab7d84eddd-host\") pod \"crc-debug-vw5j8\" (UID: \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\") " pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.621429 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpfq8\" (UniqueName: \"kubernetes.io/projected/adf00d6f-d576-42dc-a10c-91ab7d84eddd-kube-api-access-mpfq8\") pod \"crc-debug-vw5j8\" (UID: \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\") " pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.621535 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adf00d6f-d576-42dc-a10c-91ab7d84eddd-host\") pod \"crc-debug-vw5j8\" (UID: \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\") " pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:16 crc 
kubenswrapper[4917]: I0318 09:24:16.649359 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpfq8\" (UniqueName: \"kubernetes.io/projected/adf00d6f-d576-42dc-a10c-91ab7d84eddd-kube-api-access-mpfq8\") pod \"crc-debug-vw5j8\" (UID: \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\") " pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:16 crc kubenswrapper[4917]: I0318 09:24:16.737260 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:17 crc kubenswrapper[4917]: I0318 09:24:17.080886 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/crc-debug-vw5j8" event={"ID":"adf00d6f-d576-42dc-a10c-91ab7d84eddd","Type":"ContainerStarted","Data":"a6803bcfa716f0dbdf37355179c5f631d9882ca46fe336417cab3995801de7c7"} Mar 18 09:24:17 crc kubenswrapper[4917]: I0318 09:24:17.081294 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/crc-debug-vw5j8" event={"ID":"adf00d6f-d576-42dc-a10c-91ab7d84eddd","Type":"ContainerStarted","Data":"ba6d770bd361c2666cb6b869b6d6d9675cea77c52dc39a7d4380c3ef8d66a108"} Mar 18 09:24:17 crc kubenswrapper[4917]: I0318 09:24:17.108298 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9fplc/crc-debug-vw5j8" podStartSLOduration=1.108276539 podStartE2EDuration="1.108276539s" podCreationTimestamp="2026-03-18 09:24:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:24:17.102638342 +0000 UTC m=+9442.043793076" watchObservedRunningTime="2026-03-18 09:24:17.108276539 +0000 UTC m=+9442.049431253" Mar 18 09:24:17 crc kubenswrapper[4917]: I0318 09:24:17.852705 4917 scope.go:117] "RemoveContainer" containerID="9eedb725cae0f961441ea3fb8e05828474a44b87e67341edb50400b8b0449055" Mar 18 09:24:18 crc kubenswrapper[4917]: I0318 
09:24:18.092170 4917 generic.go:334] "Generic (PLEG): container finished" podID="adf00d6f-d576-42dc-a10c-91ab7d84eddd" containerID="a6803bcfa716f0dbdf37355179c5f631d9882ca46fe336417cab3995801de7c7" exitCode=0 Mar 18 09:24:18 crc kubenswrapper[4917]: I0318 09:24:18.092225 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/crc-debug-vw5j8" event={"ID":"adf00d6f-d576-42dc-a10c-91ab7d84eddd","Type":"ContainerDied","Data":"a6803bcfa716f0dbdf37355179c5f631d9882ca46fe336417cab3995801de7c7"} Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.197173 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.291283 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpfq8\" (UniqueName: \"kubernetes.io/projected/adf00d6f-d576-42dc-a10c-91ab7d84eddd-kube-api-access-mpfq8\") pod \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\" (UID: \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\") " Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.291365 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adf00d6f-d576-42dc-a10c-91ab7d84eddd-host\") pod \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\" (UID: \"adf00d6f-d576-42dc-a10c-91ab7d84eddd\") " Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.291928 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adf00d6f-d576-42dc-a10c-91ab7d84eddd-host" (OuterVolumeSpecName: "host") pod "adf00d6f-d576-42dc-a10c-91ab7d84eddd" (UID: "adf00d6f-d576-42dc-a10c-91ab7d84eddd"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.297225 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf00d6f-d576-42dc-a10c-91ab7d84eddd-kube-api-access-mpfq8" (OuterVolumeSpecName: "kube-api-access-mpfq8") pod "adf00d6f-d576-42dc-a10c-91ab7d84eddd" (UID: "adf00d6f-d576-42dc-a10c-91ab7d84eddd"). InnerVolumeSpecName "kube-api-access-mpfq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.396316 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpfq8\" (UniqueName: \"kubernetes.io/projected/adf00d6f-d576-42dc-a10c-91ab7d84eddd-kube-api-access-mpfq8\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.396344 4917 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adf00d6f-d576-42dc-a10c-91ab7d84eddd-host\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.948658 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9fplc/crc-debug-vw5j8"] Mar 18 09:24:19 crc kubenswrapper[4917]: I0318 09:24:19.956834 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9fplc/crc-debug-vw5j8"] Mar 18 09:24:20 crc kubenswrapper[4917]: I0318 09:24:20.109516 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba6d770bd361c2666cb6b869b6d6d9675cea77c52dc39a7d4380c3ef8d66a108" Mar 18 09:24:20 crc kubenswrapper[4917]: I0318 09:24:20.109620 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-vw5j8" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.140527 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fplc/crc-debug-4vprm"] Mar 18 09:24:21 crc kubenswrapper[4917]: E0318 09:24:21.142041 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf00d6f-d576-42dc-a10c-91ab7d84eddd" containerName="container-00" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.142122 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf00d6f-d576-42dc-a10c-91ab7d84eddd" containerName="container-00" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.142384 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf00d6f-d576-42dc-a10c-91ab7d84eddd" containerName="container-00" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.143113 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.234281 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgdjn\" (UniqueName: \"kubernetes.io/projected/275b6fd9-fc6a-465d-b73e-61c2c83902d4-kube-api-access-cgdjn\") pod \"crc-debug-4vprm\" (UID: \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\") " pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.234876 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/275b6fd9-fc6a-465d-b73e-61c2c83902d4-host\") pod \"crc-debug-4vprm\" (UID: \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\") " pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.337316 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/275b6fd9-fc6a-465d-b73e-61c2c83902d4-host\") pod \"crc-debug-4vprm\" (UID: \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\") " pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.337466 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/275b6fd9-fc6a-465d-b73e-61c2c83902d4-host\") pod \"crc-debug-4vprm\" (UID: \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\") " pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.338493 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgdjn\" (UniqueName: \"kubernetes.io/projected/275b6fd9-fc6a-465d-b73e-61c2c83902d4-kube-api-access-cgdjn\") pod \"crc-debug-4vprm\" (UID: \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\") " pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.363185 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgdjn\" (UniqueName: \"kubernetes.io/projected/275b6fd9-fc6a-465d-b73e-61c2c83902d4-kube-api-access-cgdjn\") pod \"crc-debug-4vprm\" (UID: \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\") " pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.469362 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:21 crc kubenswrapper[4917]: W0318 09:24:21.507946 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod275b6fd9_fc6a_465d_b73e_61c2c83902d4.slice/crio-3ec756f428e2a760240c34b04d1896a39ed03aa7ec85f2c93d6a8d670d25bd6c WatchSource:0}: Error finding container 3ec756f428e2a760240c34b04d1896a39ed03aa7ec85f2c93d6a8d670d25bd6c: Status 404 returned error can't find the container with id 3ec756f428e2a760240c34b04d1896a39ed03aa7ec85f2c93d6a8d670d25bd6c Mar 18 09:24:21 crc kubenswrapper[4917]: I0318 09:24:21.800658 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adf00d6f-d576-42dc-a10c-91ab7d84eddd" path="/var/lib/kubelet/pods/adf00d6f-d576-42dc-a10c-91ab7d84eddd/volumes" Mar 18 09:24:22 crc kubenswrapper[4917]: I0318 09:24:22.130428 4917 generic.go:334] "Generic (PLEG): container finished" podID="275b6fd9-fc6a-465d-b73e-61c2c83902d4" containerID="d99b076e40800f78e21b17654b6faa45c8cb51bb4ef2f43d8509663847ce0b6d" exitCode=0 Mar 18 09:24:22 crc kubenswrapper[4917]: I0318 09:24:22.130488 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/crc-debug-4vprm" event={"ID":"275b6fd9-fc6a-465d-b73e-61c2c83902d4","Type":"ContainerDied","Data":"d99b076e40800f78e21b17654b6faa45c8cb51bb4ef2f43d8509663847ce0b6d"} Mar 18 09:24:22 crc kubenswrapper[4917]: I0318 09:24:22.130577 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/crc-debug-4vprm" event={"ID":"275b6fd9-fc6a-465d-b73e-61c2c83902d4","Type":"ContainerStarted","Data":"3ec756f428e2a760240c34b04d1896a39ed03aa7ec85f2c93d6a8d670d25bd6c"} Mar 18 09:24:22 crc kubenswrapper[4917]: I0318 09:24:22.189762 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9fplc/crc-debug-4vprm"] Mar 18 09:24:22 crc kubenswrapper[4917]: I0318 09:24:22.199290 4917 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9fplc/crc-debug-4vprm"] Mar 18 09:24:23 crc kubenswrapper[4917]: I0318 09:24:23.647382 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:23 crc kubenswrapper[4917]: I0318 09:24:23.800008 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/275b6fd9-fc6a-465d-b73e-61c2c83902d4-host\") pod \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\" (UID: \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\") " Mar 18 09:24:23 crc kubenswrapper[4917]: I0318 09:24:23.800108 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgdjn\" (UniqueName: \"kubernetes.io/projected/275b6fd9-fc6a-465d-b73e-61c2c83902d4-kube-api-access-cgdjn\") pod \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\" (UID: \"275b6fd9-fc6a-465d-b73e-61c2c83902d4\") " Mar 18 09:24:23 crc kubenswrapper[4917]: I0318 09:24:23.800158 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/275b6fd9-fc6a-465d-b73e-61c2c83902d4-host" (OuterVolumeSpecName: "host") pod "275b6fd9-fc6a-465d-b73e-61c2c83902d4" (UID: "275b6fd9-fc6a-465d-b73e-61c2c83902d4"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 18 09:24:23 crc kubenswrapper[4917]: I0318 09:24:23.800728 4917 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/275b6fd9-fc6a-465d-b73e-61c2c83902d4-host\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:23 crc kubenswrapper[4917]: I0318 09:24:23.806002 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/275b6fd9-fc6a-465d-b73e-61c2c83902d4-kube-api-access-cgdjn" (OuterVolumeSpecName: "kube-api-access-cgdjn") pod "275b6fd9-fc6a-465d-b73e-61c2c83902d4" (UID: "275b6fd9-fc6a-465d-b73e-61c2c83902d4"). InnerVolumeSpecName "kube-api-access-cgdjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:24:23 crc kubenswrapper[4917]: I0318 09:24:23.904076 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgdjn\" (UniqueName: \"kubernetes.io/projected/275b6fd9-fc6a-465d-b73e-61c2c83902d4-kube-api-access-cgdjn\") on node \"crc\" DevicePath \"\"" Mar 18 09:24:24 crc kubenswrapper[4917]: I0318 09:24:24.157426 4917 scope.go:117] "RemoveContainer" containerID="d99b076e40800f78e21b17654b6faa45c8cb51bb4ef2f43d8509663847ce0b6d" Mar 18 09:24:24 crc kubenswrapper[4917]: I0318 09:24:24.157685 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fplc/crc-debug-4vprm" Mar 18 09:24:25 crc kubenswrapper[4917]: I0318 09:24:25.786230 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="275b6fd9-fc6a-465d-b73e-61c2c83902d4" path="/var/lib/kubelet/pods/275b6fd9-fc6a-465d-b73e-61c2c83902d4/volumes" Mar 18 09:24:32 crc kubenswrapper[4917]: I0318 09:24:32.929409 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:24:32 crc kubenswrapper[4917]: I0318 09:24:32.929929 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:25:02 crc kubenswrapper[4917]: I0318 09:25:02.929765 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:25:02 crc kubenswrapper[4917]: I0318 09:25:02.930694 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:25:02 crc kubenswrapper[4917]: I0318 09:25:02.930780 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 09:25:02 crc kubenswrapper[4917]: I0318 09:25:02.932175 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"631374ea93931dfc55d694e6dd1da33a32d0d39afef7672b4338c0a7786c053e"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:25:02 crc kubenswrapper[4917]: I0318 09:25:02.932306 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://631374ea93931dfc55d694e6dd1da33a32d0d39afef7672b4338c0a7786c053e" gracePeriod=600 Mar 18 09:25:03 crc kubenswrapper[4917]: I0318 09:25:03.620545 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="631374ea93931dfc55d694e6dd1da33a32d0d39afef7672b4338c0a7786c053e" exitCode=0 Mar 18 09:25:03 crc kubenswrapper[4917]: I0318 09:25:03.621241 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"631374ea93931dfc55d694e6dd1da33a32d0d39afef7672b4338c0a7786c053e"} Mar 18 09:25:03 crc kubenswrapper[4917]: I0318 09:25:03.621302 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"} Mar 18 09:25:03 crc kubenswrapper[4917]: I0318 09:25:03.621339 4917 scope.go:117] "RemoveContainer" 
containerID="b5e5a17bc9466890fe198575b6de26c6520e6fa0e5035cad3d9c446b7997d1af" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.140283 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563766-zclgd"] Mar 18 09:26:00 crc kubenswrapper[4917]: E0318 09:26:00.141552 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="275b6fd9-fc6a-465d-b73e-61c2c83902d4" containerName="container-00" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.141577 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="275b6fd9-fc6a-465d-b73e-61c2c83902d4" containerName="container-00" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.141933 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="275b6fd9-fc6a-465d-b73e-61c2c83902d4" containerName="container-00" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.143029 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-zclgd" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.146247 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.146430 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.146660 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.148922 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-zclgd"] Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.186767 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88rj6\" (UniqueName: 
\"kubernetes.io/projected/db8e855c-339f-497a-999f-4c3ea56327cb-kube-api-access-88rj6\") pod \"auto-csr-approver-29563766-zclgd\" (UID: \"db8e855c-339f-497a-999f-4c3ea56327cb\") " pod="openshift-infra/auto-csr-approver-29563766-zclgd" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.288867 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88rj6\" (UniqueName: \"kubernetes.io/projected/db8e855c-339f-497a-999f-4c3ea56327cb-kube-api-access-88rj6\") pod \"auto-csr-approver-29563766-zclgd\" (UID: \"db8e855c-339f-497a-999f-4c3ea56327cb\") " pod="openshift-infra/auto-csr-approver-29563766-zclgd" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.325879 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88rj6\" (UniqueName: \"kubernetes.io/projected/db8e855c-339f-497a-999f-4c3ea56327cb-kube-api-access-88rj6\") pod \"auto-csr-approver-29563766-zclgd\" (UID: \"db8e855c-339f-497a-999f-4c3ea56327cb\") " pod="openshift-infra/auto-csr-approver-29563766-zclgd" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.462113 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-zclgd" Mar 18 09:26:00 crc kubenswrapper[4917]: I0318 09:26:00.984810 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-zclgd"] Mar 18 09:26:01 crc kubenswrapper[4917]: W0318 09:26:01.009533 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb8e855c_339f_497a_999f_4c3ea56327cb.slice/crio-91508c60ee9a89ffad87618c216c6b3f346f0f1c08de07556450ddf4cd7b61c9 WatchSource:0}: Error finding container 91508c60ee9a89ffad87618c216c6b3f346f0f1c08de07556450ddf4cd7b61c9: Status 404 returned error can't find the container with id 91508c60ee9a89ffad87618c216c6b3f346f0f1c08de07556450ddf4cd7b61c9 Mar 18 09:26:01 crc kubenswrapper[4917]: I0318 09:26:01.333378 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-zclgd" event={"ID":"db8e855c-339f-497a-999f-4c3ea56327cb","Type":"ContainerStarted","Data":"91508c60ee9a89ffad87618c216c6b3f346f0f1c08de07556450ddf4cd7b61c9"} Mar 18 09:26:03 crc kubenswrapper[4917]: I0318 09:26:03.356570 4917 generic.go:334] "Generic (PLEG): container finished" podID="db8e855c-339f-497a-999f-4c3ea56327cb" containerID="33b9f5ba629ab653693bfea20fb75364e2e5d97491f8438435ed075a9fb8e53a" exitCode=0 Mar 18 09:26:03 crc kubenswrapper[4917]: I0318 09:26:03.356652 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-zclgd" event={"ID":"db8e855c-339f-497a-999f-4c3ea56327cb","Type":"ContainerDied","Data":"33b9f5ba629ab653693bfea20fb75364e2e5d97491f8438435ed075a9fb8e53a"} Mar 18 09:26:04 crc kubenswrapper[4917]: I0318 09:26:04.744403 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-zclgd" Mar 18 09:26:04 crc kubenswrapper[4917]: I0318 09:26:04.799818 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88rj6\" (UniqueName: \"kubernetes.io/projected/db8e855c-339f-497a-999f-4c3ea56327cb-kube-api-access-88rj6\") pod \"db8e855c-339f-497a-999f-4c3ea56327cb\" (UID: \"db8e855c-339f-497a-999f-4c3ea56327cb\") " Mar 18 09:26:05 crc kubenswrapper[4917]: I0318 09:26:05.380092 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563766-zclgd" event={"ID":"db8e855c-339f-497a-999f-4c3ea56327cb","Type":"ContainerDied","Data":"91508c60ee9a89ffad87618c216c6b3f346f0f1c08de07556450ddf4cd7b61c9"} Mar 18 09:26:05 crc kubenswrapper[4917]: I0318 09:26:05.380479 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91508c60ee9a89ffad87618c216c6b3f346f0f1c08de07556450ddf4cd7b61c9" Mar 18 09:26:05 crc kubenswrapper[4917]: I0318 09:26:05.380176 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563766-zclgd" Mar 18 09:26:05 crc kubenswrapper[4917]: I0318 09:26:05.437075 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8e855c-339f-497a-999f-4c3ea56327cb-kube-api-access-88rj6" (OuterVolumeSpecName: "kube-api-access-88rj6") pod "db8e855c-339f-497a-999f-4c3ea56327cb" (UID: "db8e855c-339f-497a-999f-4c3ea56327cb"). InnerVolumeSpecName "kube-api-access-88rj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:26:05 crc kubenswrapper[4917]: I0318 09:26:05.519297 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88rj6\" (UniqueName: \"kubernetes.io/projected/db8e855c-339f-497a-999f-4c3ea56327cb-kube-api-access-88rj6\") on node \"crc\" DevicePath \"\"" Mar 18 09:26:05 crc kubenswrapper[4917]: I0318 09:26:05.814527 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-pv28h"] Mar 18 09:26:05 crc kubenswrapper[4917]: I0318 09:26:05.832234 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563760-pv28h"] Mar 18 09:26:07 crc kubenswrapper[4917]: I0318 09:26:07.783749 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c05360f-a8d4-471a-9b9b-b3a57e4c4d12" path="/var/lib/kubelet/pods/7c05360f-a8d4-471a-9b9b-b3a57e4c4d12/volumes" Mar 18 09:26:18 crc kubenswrapper[4917]: I0318 09:26:17.999676 4917 scope.go:117] "RemoveContainer" containerID="9d8529bcc6697844ee778295a7be014d6fdd18c5d243bbe6d3f4b044b29191fb" Mar 18 09:26:22 crc kubenswrapper[4917]: I0318 09:26:22.244633 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_e280f1c8-4761-422b-a9c2-5c429de52eef/init-config-reloader/0.log" Mar 18 09:26:22 crc kubenswrapper[4917]: I0318 09:26:22.488641 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_e280f1c8-4761-422b-a9c2-5c429de52eef/init-config-reloader/0.log" Mar 18 09:26:22 crc kubenswrapper[4917]: I0318 09:26:22.527003 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_e280f1c8-4761-422b-a9c2-5c429de52eef/alertmanager/0.log" Mar 18 09:26:22 crc kubenswrapper[4917]: I0318 09:26:22.545755 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_e280f1c8-4761-422b-a9c2-5c429de52eef/config-reloader/0.log" Mar 18 09:26:22 crc kubenswrapper[4917]: I0318 09:26:22.757690 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_47cdaf9a-44d8-4517-9e86-2e855a5dbfb4/aodh-api/0.log" Mar 18 09:26:22 crc kubenswrapper[4917]: I0318 09:26:22.780488 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_47cdaf9a-44d8-4517-9e86-2e855a5dbfb4/aodh-evaluator/0.log" Mar 18 09:26:22 crc kubenswrapper[4917]: I0318 09:26:22.948894 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_47cdaf9a-44d8-4517-9e86-2e855a5dbfb4/aodh-listener/0.log" Mar 18 09:26:22 crc kubenswrapper[4917]: I0318 09:26:22.955306 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_47cdaf9a-44d8-4517-9e86-2e855a5dbfb4/aodh-notifier/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.040308 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84574ccfdb-jwjkm_b26bf0a9-c189-4146-aee9-2d2513251872/barbican-api/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.167458 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-84574ccfdb-jwjkm_b26bf0a9-c189-4146-aee9-2d2513251872/barbican-api-log/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.223632 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-547d8cd7db-x2gql_2b9513fe-7b49-4571-b3fa-71c00adf8dd6/barbican-keystone-listener/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.458472 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-d7b5bfd65-9fl8g_c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917/barbican-worker/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.511030 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-d7b5bfd65-9fl8g_c35fbeb6-0c9b-4f8e-87a7-232cbbaaf917/barbican-worker-log/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.585729 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-547d8cd7db-x2gql_2b9513fe-7b49-4571-b3fa-71c00adf8dd6/barbican-keystone-listener-log/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.734114 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-gzhp8_637c5a5e-39a1-498f-b95f-863f64074f0b/bootstrap-openstack-openstack-cell1/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.865434 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_93f15a4a-a697-4645-b0fb-1954e6f028c2/ceilometer-central-agent/0.log" Mar 18 09:26:23 crc kubenswrapper[4917]: I0318 09:26:23.939325 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_93f15a4a-a697-4645-b0fb-1954e6f028c2/proxy-httpd/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.008466 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_93f15a4a-a697-4645-b0fb-1954e6f028c2/sg-core/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.010805 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_93f15a4a-a697-4645-b0fb-1954e6f028c2/ceilometer-notification-agent/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.224355 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_94257751-2950-43fe-acb4-f96f1a8e5264/cinder-api/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.242916 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_94257751-2950-43fe-acb4-f96f1a8e5264/cinder-api-log/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.357747 4917 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb5a322e-ac24-41a9-a411-1ba1e05738e5/cinder-scheduler/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.480790 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cb5a322e-ac24-41a9-a411-1ba1e05738e5/probe/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.504766 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-hngpl_517c4a9b-15b8-42fd-8c75-2e4487db81ab/configure-network-openstack-openstack-cell1/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.721225 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-hgp6d_a7a34bdf-e873-4224-aff7-f81e8d05c6d4/configure-os-openstack-openstack-cell1/0.log" Mar 18 09:26:24 crc kubenswrapper[4917]: I0318 09:26:24.829363 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-68c69f87f-zcd25_3a7032b3-8006-4159-ad39-67fd51cbda21/init/0.log" Mar 18 09:26:25 crc kubenswrapper[4917]: I0318 09:26:25.025547 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-68c69f87f-zcd25_3a7032b3-8006-4159-ad39-67fd51cbda21/init/0.log" Mar 18 09:26:25 crc kubenswrapper[4917]: I0318 09:26:25.059905 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-68c69f87f-zcd25_3a7032b3-8006-4159-ad39-67fd51cbda21/dnsmasq-dns/0.log" Mar 18 09:26:25 crc kubenswrapper[4917]: I0318 09:26:25.130503 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-96ck2_56acaff3-9fc1-4c18-b49d-f2a025d5804b/download-cache-openstack-openstack-cell1/0.log" Mar 18 09:26:25 crc kubenswrapper[4917]: I0318 09:26:25.503466 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_944c92dd-53dd-4c82-9dd1-b96bfaff4130/glance-httpd/0.log" Mar 18 09:26:25 crc kubenswrapper[4917]: I0318 09:26:25.524286 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_944c92dd-53dd-4c82-9dd1-b96bfaff4130/glance-log/0.log" Mar 18 09:26:26 crc kubenswrapper[4917]: I0318 09:26:26.068209 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1bc2d62e-408e-48bb-9fce-1901055ecbd0/glance-log/0.log" Mar 18 09:26:26 crc kubenswrapper[4917]: I0318 09:26:26.150056 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_1bc2d62e-408e-48bb-9fce-1901055ecbd0/glance-httpd/0.log" Mar 18 09:26:26 crc kubenswrapper[4917]: I0318 09:26:26.521268 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-9f889575f-j9mf4_4b41925b-da5b-4ad8-b314-56b2fb282f95/heat-api/0.log" Mar 18 09:26:26 crc kubenswrapper[4917]: I0318 09:26:26.641273 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-9f4b4874b-nfbm6_7efdb9e3-6dd2-4f00-a37b-c85fa4b2de8c/heat-engine/0.log" Mar 18 09:26:26 crc kubenswrapper[4917]: I0318 09:26:26.756267 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7cdb7dbfb5-59str_c6bad2ad-7a90-4798-863d-3b81e21103bc/heat-cfnapi/0.log" Mar 18 09:26:26 crc kubenswrapper[4917]: I0318 09:26:26.890985 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5995799978-72tkb_2cd988f0-ae23-4f45-8247-29abf7b143c4/horizon/0.log" Mar 18 09:26:27 crc kubenswrapper[4917]: I0318 09:26:27.000073 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-4ppg5_f93f1f68-8a97-4e89-9d2b-b1583f046a2d/install-certs-openstack-openstack-cell1/0.log" Mar 18 09:26:27 crc kubenswrapper[4917]: I0318 09:26:27.101009 4917 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-tvtpj_d4d1cb7a-8c3a-4932-a94a-03038c30b8e5/install-os-openstack-openstack-cell1/0.log" Mar 18 09:26:27 crc kubenswrapper[4917]: I0318 09:26:27.302767 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5995799978-72tkb_2cd988f0-ae23-4f45-8247-29abf7b143c4/horizon-log/0.log" Mar 18 09:26:27 crc kubenswrapper[4917]: I0318 09:26:27.370682 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29563741-qfnz6_1a111eec-671e-4a20-9e9a-5ce32fd77146/keystone-cron/0.log" Mar 18 09:26:27 crc kubenswrapper[4917]: I0318 09:26:27.950958 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7cd97484f6-89g6g_be44e918-695b-4764-b584-9698eeabb807/keystone-api/0.log" Mar 18 09:26:27 crc kubenswrapper[4917]: I0318 09:26:27.971728 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-f7mjl_b993b322-7111-4544-9def-988363263914/libvirt-openstack-openstack-cell1/0.log" Mar 18 09:26:27 crc kubenswrapper[4917]: I0318 09:26:27.982076 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6639b737-2219-4fdb-9a93-4e0155460477/kube-state-metrics/0.log" Mar 18 09:26:28 crc kubenswrapper[4917]: I0318 09:26:28.461594 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-ttm74_e81deaf1-264a-4b25-b851-b95a06067a3a/neutron-dhcp-openstack-openstack-cell1/0.log" Mar 18 09:26:28 crc kubenswrapper[4917]: I0318 09:26:28.497012 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d49c699bc-wmgbl_776717e0-ca51-4856-abc7-f1b03d0d312b/neutron-httpd/0.log" Mar 18 09:26:28 crc kubenswrapper[4917]: I0318 09:26:28.734468 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-6c4fv_c3deb9d3-e372-47e2-9096-2e30c4aa547f/neutron-metadata-openstack-openstack-cell1/0.log" Mar 18 09:26:28 crc kubenswrapper[4917]: I0318 09:26:28.775617 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6d49c699bc-wmgbl_776717e0-ca51-4856-abc7-f1b03d0d312b/neutron-api/0.log" Mar 18 09:26:28 crc kubenswrapper[4917]: I0318 09:26:28.970444 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-c2fcp_4d740554-2999-4fcd-89ec-f5377c3db109/neutron-sriov-openstack-openstack-cell1/0.log" Mar 18 09:26:29 crc kubenswrapper[4917]: I0318 09:26:29.646332 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_5a8b2c27-8097-4e77-a187-bcb38a11b60f/nova-cell0-conductor-conductor/0.log" Mar 18 09:26:29 crc kubenswrapper[4917]: I0318 09:26:29.668859 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b0b28978-b862-4cb1-9af3-bf8e71479244/nova-api-log/0.log" Mar 18 09:26:29 crc kubenswrapper[4917]: I0318 09:26:29.674983 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b0b28978-b862-4cb1-9af3-bf8e71479244/nova-api-api/0.log" Mar 18 09:26:29 crc kubenswrapper[4917]: I0318 09:26:29.997114 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_944ee733-175d-44d6-bd03-1f55c6282343/nova-cell1-novncproxy-novncproxy/0.log" Mar 18 09:26:30 crc kubenswrapper[4917]: I0318 09:26:30.009316 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_4faa5863-119c-460a-8ba2-e02c385319a9/nova-cell1-conductor-conductor/0.log" Mar 18 09:26:30 crc kubenswrapper[4917]: I0318 09:26:30.181678 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellmxd2w_68b25068-0099-413b-8a35-d218abba4a8c/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Mar 18 09:26:30 crc kubenswrapper[4917]: I0318 09:26:30.336746 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-v42ph_65333a03-4e64-4807-bae2-cb0b1d517e08/nova-cell1-openstack-openstack-cell1/0.log" Mar 18 09:26:30 crc kubenswrapper[4917]: I0318 09:26:30.481111 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f6a66cf9-c416-4b82-99f9-7f389cb14e79/nova-metadata-log/0.log" Mar 18 09:26:30 crc kubenswrapper[4917]: I0318 09:26:30.791020 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_3b3ccdd1-1e4c-4671-b8cd-c950889b24be/nova-scheduler-scheduler/0.log" Mar 18 09:26:30 crc kubenswrapper[4917]: I0318 09:26:30.793752 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f6a66cf9-c416-4b82-99f9-7f389cb14e79/nova-metadata-metadata/0.log" Mar 18 09:26:30 crc kubenswrapper[4917]: I0318 09:26:30.834982 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ff386b88-0ec0-4ff3-8579-84af54562ab6/mysql-bootstrap/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.144598 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ff386b88-0ec0-4ff3-8579-84af54562ab6/galera/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.161630 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_ff386b88-0ec0-4ff3-8579-84af54562ab6/mysql-bootstrap/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.179070 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b0524b90-39ac-4532-820a-23f804a96420/mysql-bootstrap/0.log" Mar 18 
09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.321198 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b0524b90-39ac-4532-820a-23f804a96420/mysql-bootstrap/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.336522 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b0524b90-39ac-4532-820a-23f804a96420/galera/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.434647 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f4465c4d-dda1-44ec-84d0-5a060ec3453f/openstackclient/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.622835 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac882469-bc1e-4d1f-a8cb-68e67ed26912/openstack-network-exporter/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.707906 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_ac882469-bc1e-4d1f-a8cb-68e67ed26912/ovn-northd/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.856119 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-7ff47_717aeb26-970e-480b-95d1-d514151ca1ac/ovn-openstack-openstack-cell1/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.974663 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9023a312-69d0-4d07-bd00-aadc2d360175/openstack-network-exporter/0.log" Mar 18 09:26:31 crc kubenswrapper[4917]: I0318 09:26:31.987971 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9023a312-69d0-4d07-bd00-aadc2d360175/ovsdbserver-nb/0.log" Mar 18 09:26:32 crc kubenswrapper[4917]: I0318 09:26:32.154738 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_648fc31e-cfb7-4852-89ab-88296601d71d/openstack-network-exporter/0.log" Mar 18 09:26:32 crc 
kubenswrapper[4917]: I0318 09:26:32.275624 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_648fc31e-cfb7-4852-89ab-88296601d71d/ovsdbserver-nb/0.log" Mar 18 09:26:32 crc kubenswrapper[4917]: I0318 09:26:32.335145 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_e25a8b13-2060-4331-ae83-11d45548dce0/openstack-network-exporter/0.log" Mar 18 09:26:32 crc kubenswrapper[4917]: I0318 09:26:32.462579 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_e25a8b13-2060-4331-ae83-11d45548dce0/ovsdbserver-nb/0.log" Mar 18 09:26:32 crc kubenswrapper[4917]: I0318 09:26:32.522295 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cd05943e-fff4-460d-b6a1-14635410027a/openstack-network-exporter/0.log" Mar 18 09:26:32 crc kubenswrapper[4917]: I0318 09:26:32.561361 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cd05943e-fff4-460d-b6a1-14635410027a/ovsdbserver-sb/0.log" Mar 18 09:26:32 crc kubenswrapper[4917]: I0318 09:26:32.772981 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_5e3b81ab-abf1-4b33-93c5-b4a86b14ab62/openstack-network-exporter/0.log" Mar 18 09:26:32 crc kubenswrapper[4917]: I0318 09:26:32.866784 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_5e3b81ab-abf1-4b33-93c5-b4a86b14ab62/ovsdbserver-sb/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.173642 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_255dc086-6870-42ef-9774-18e47840139d/ovsdbserver-sb/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.226414 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_255dc086-6870-42ef-9774-18e47840139d/openstack-network-exporter/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 
09:26:33.408626 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d866b6d76-b4474_907d7650-17ad-415e-b034-7545f9a0df95/placement-api/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.473948 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-ckhcgl_44b6d485-9af3-465d-82a5-076b82497187/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.534896 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-d866b6d76-b4474_907d7650-17ad-415e-b034-7545f9a0df95/placement-log/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.689367 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fd1a219f-72b5-4574-8f02-9a956ee8bb56/init-config-reloader/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.913737 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fd1a219f-72b5-4574-8f02-9a956ee8bb56/config-reloader/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.951716 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fd1a219f-72b5-4574-8f02-9a956ee8bb56/thanos-sidecar/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.963105 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fd1a219f-72b5-4574-8f02-9a956ee8bb56/init-config-reloader/0.log" Mar 18 09:26:33 crc kubenswrapper[4917]: I0318 09:26:33.996172 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_fd1a219f-72b5-4574-8f02-9a956ee8bb56/prometheus/0.log" Mar 18 09:26:34 crc kubenswrapper[4917]: I0318 09:26:34.162937 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72/setup-container/0.log" Mar 18 09:26:34 crc kubenswrapper[4917]: I0318 09:26:34.372185 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72/setup-container/0.log" Mar 18 09:26:34 crc kubenswrapper[4917]: I0318 09:26:34.437695 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0f5f6d6e-ea66-4dc9-a78a-d59c5dbdbb72/rabbitmq/0.log" Mar 18 09:26:34 crc kubenswrapper[4917]: I0318 09:26:34.483259 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_669c480a-39bf-4e91-985b-528c87fa0129/setup-container/0.log" Mar 18 09:26:35 crc kubenswrapper[4917]: I0318 09:26:35.076110 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_669c480a-39bf-4e91-985b-528c87fa0129/setup-container/0.log" Mar 18 09:26:35 crc kubenswrapper[4917]: I0318 09:26:35.180918 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_669c480a-39bf-4e91-985b-528c87fa0129/rabbitmq/0.log" Mar 18 09:26:35 crc kubenswrapper[4917]: I0318 09:26:35.208140 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-hj7wk_21e0212b-6666-4baf-bcb7-ef705369af7f/reboot-os-openstack-openstack-cell1/0.log" Mar 18 09:26:35 crc kubenswrapper[4917]: I0318 09:26:35.439456 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-lmvxk_57890f30-e41c-4670-8b1a-baf27e024f02/run-os-openstack-openstack-cell1/0.log" Mar 18 09:26:35 crc kubenswrapper[4917]: I0318 09:26:35.479294 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-5bnqr_4c797acd-959e-4642-a4d5-037291d1f2ce/ssh-known-hosts-openstack/0.log" Mar 18 09:26:35 crc kubenswrapper[4917]: I0318 
09:26:35.766474 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-747bdb4bf8-skf9g_7eb5b6e2-0173-4d18-a025-57faa95077af/proxy-server/0.log" Mar 18 09:26:35 crc kubenswrapper[4917]: I0318 09:26:35.925898 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-65679_d7067b7e-c682-4627-be7d-f86139040b78/swift-ring-rebalance/0.log" Mar 18 09:26:35 crc kubenswrapper[4917]: I0318 09:26:35.956317 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-747bdb4bf8-skf9g_7eb5b6e2-0173-4d18-a025-57faa95077af/proxy-httpd/0.log" Mar 18 09:26:36 crc kubenswrapper[4917]: I0318 09:26:36.145101 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-48kxh_cc4c5b61-1b45-4d87-aa89-88daf0227751/telemetry-openstack-openstack-cell1/0.log" Mar 18 09:26:36 crc kubenswrapper[4917]: I0318 09:26:36.276597 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_aea4965f-d2fe-4941-8ff8-1cf5cf9cd588/tempest-tests-tempest-tests-runner/0.log" Mar 18 09:26:36 crc kubenswrapper[4917]: I0318 09:26:36.456918 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_3c577fff-d7f1-43e9-a6c9-1dd565be5f74/test-operator-logs-container/0.log" Mar 18 09:26:36 crc kubenswrapper[4917]: I0318 09:26:36.821481 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-khdcj_afc5f89c-2aa9-4a99-8fda-794ab379f98a/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Mar 18 09:26:37 crc kubenswrapper[4917]: I0318 09:26:37.066834 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-xhxnc_634f496b-b740-4ed3-9dac-45ec0afdbb16/validate-network-openstack-openstack-cell1/0.log" Mar 18 09:26:50 crc kubenswrapper[4917]: I0318 
09:26:50.780525 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_694c424e-6894-48b0-9724-22d72b167a8c/memcached/0.log" Mar 18 09:27:08 crc kubenswrapper[4917]: I0318 09:27:08.610572 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-9xcfc_e340972f-e354-42b8-98f0-7188fe46ae69/manager/0.log" Mar 18 09:27:08 crc kubenswrapper[4917]: I0318 09:27:08.832064 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv_305d3d2a-a9ee-443c-a91a-8c72ec108b60/util/0.log" Mar 18 09:27:09 crc kubenswrapper[4917]: I0318 09:27:09.043237 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv_305d3d2a-a9ee-443c-a91a-8c72ec108b60/pull/0.log" Mar 18 09:27:09 crc kubenswrapper[4917]: I0318 09:27:09.102965 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv_305d3d2a-a9ee-443c-a91a-8c72ec108b60/util/0.log" Mar 18 09:27:09 crc kubenswrapper[4917]: I0318 09:27:09.148949 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv_305d3d2a-a9ee-443c-a91a-8c72ec108b60/pull/0.log" Mar 18 09:27:09 crc kubenswrapper[4917]: I0318 09:27:09.365455 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv_305d3d2a-a9ee-443c-a91a-8c72ec108b60/pull/0.log" Mar 18 09:27:09 crc kubenswrapper[4917]: I0318 09:27:09.410134 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv_305d3d2a-a9ee-443c-a91a-8c72ec108b60/util/0.log" Mar 18 09:27:09 crc 
kubenswrapper[4917]: I0318 09:27:09.417156 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_dcc7ee2d329e979470464bdf20d796f32cb792a2a57193a7e00beef3a8snwxv_305d3d2a-a9ee-443c-a91a-8c72ec108b60/extract/0.log" Mar 18 09:27:09 crc kubenswrapper[4917]: I0318 09:27:09.636255 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-ptnzg_f0153e44-31ff-4d0f-8592-d6171f714f97/manager/0.log" Mar 18 09:27:09 crc kubenswrapper[4917]: I0318 09:27:09.945127 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-d2748_46d6297c-23d8-40d4-8238-c4b5c4c8669c/manager/0.log" Mar 18 09:27:10 crc kubenswrapper[4917]: I0318 09:27:10.008805 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-5d6d5_617f6235-ba96-4558-8bc4-bef2488095d2/manager/0.log" Mar 18 09:27:10 crc kubenswrapper[4917]: I0318 09:27:10.170332 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-ghz2b_83b25f75-498a-4c21-9fd4-222f866b7dec/manager/0.log" Mar 18 09:27:10 crc kubenswrapper[4917]: I0318 09:27:10.499970 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-mbcrf_52be5f50-56f7-4258-863d-998730dd87b7/manager/0.log" Mar 18 09:27:10 crc kubenswrapper[4917]: I0318 09:27:10.921920 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-dt54q_7b8cbed9-9d0f-48dd-a497-910e2e1036ad/manager/0.log" Mar 18 09:27:11 crc kubenswrapper[4917]: I0318 09:27:11.100877 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-wjfbl_74a539c5-379b-44f2-ac68-c91ba6a7bafa/manager/0.log" Mar 18 09:27:11 crc kubenswrapper[4917]: I0318 09:27:11.121486 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-h845j_155909c6-db8f-43ca-88e1-4757d8583af3/manager/0.log" Mar 18 09:27:11 crc kubenswrapper[4917]: I0318 09:27:11.381526 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-krlq9_e807d4e5-b371-4e8c-ad6b-d0a3dc68e495/manager/0.log" Mar 18 09:27:11 crc kubenswrapper[4917]: I0318 09:27:11.595131 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-blj7z_02fe4313-7ce9-497e-a01d-b69b6ed0faa5/manager/0.log" Mar 18 09:27:11 crc kubenswrapper[4917]: I0318 09:27:11.815852 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-72hk7_d6ab874a-dbdb-4a4d-bcf0-366d44849574/manager/0.log" Mar 18 09:27:11 crc kubenswrapper[4917]: I0318 09:27:11.879729 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-f8nxt_b29b952f-ff1c-4734-a43b-4b836d090108/manager/0.log" Mar 18 09:27:11 crc kubenswrapper[4917]: I0318 09:27:11.906064 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-jh9lg_bcaaf338-4e05-4578-ab2c-55a4a6909f0d/manager/0.log" Mar 18 09:27:12 crc kubenswrapper[4917]: I0318 09:27:12.048410 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c68874588-kjm55_4e8c2c8e-89ce-487f-b6f9-21c3af395094/manager/0.log" Mar 18 09:27:12 crc kubenswrapper[4917]: I0318 09:27:12.239161 4917 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-57d974f4f8-dztqn_fc407819-4730-4ca0-9ccb-dcf8ba968fa9/operator/0.log" Mar 18 09:27:12 crc kubenswrapper[4917]: I0318 09:27:12.543211 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mtzqz_ad827dce-95e9-4fc0-bcc8-14788aba8766/registry-server/0.log" Mar 18 09:27:12 crc kubenswrapper[4917]: I0318 09:27:12.624724 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-n4sbn_a51336ad-c2bd-4c62-80f1-45fc36883ead/manager/0.log" Mar 18 09:27:12 crc kubenswrapper[4917]: I0318 09:27:12.685552 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-hhtxp_58ee89c8-9799-45bc-8851-9f0309c9fe9c/manager/0.log" Mar 18 09:27:12 crc kubenswrapper[4917]: I0318 09:27:12.828259 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pmp5r_b1879bdf-322b-447c-964f-18eeda782b83/operator/0.log" Mar 18 09:27:12 crc kubenswrapper[4917]: I0318 09:27:12.963997 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-6gvpd_cf290619-453a-4fe0-802d-8a079ad9268f/manager/0.log" Mar 18 09:27:13 crc kubenswrapper[4917]: I0318 09:27:13.196211 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-pbdwd_33ac06ad-c83e-4881-85f4-c13e31f72ac9/manager/0.log" Mar 18 09:27:13 crc kubenswrapper[4917]: I0318 09:27:13.285570 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-b8bsp_d6e0ddcc-f5b6-4201-8261-2f2b975a6532/manager/0.log" Mar 18 09:27:13 crc kubenswrapper[4917]: I0318 09:27:13.369243 4917 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-fkdn9_31f7a213-0bfe-4a21-9f97-684d862587fe/manager/0.log" Mar 18 09:27:14 crc kubenswrapper[4917]: I0318 09:27:14.578902 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5b7b5fbd97-fbnbz_93bce59e-ffb2-4962-9e6d-79d909c26899/manager/0.log" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.491314 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rgt7l"] Mar 18 09:27:19 crc kubenswrapper[4917]: E0318 09:27:19.492473 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db8e855c-339f-497a-999f-4c3ea56327cb" containerName="oc" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.492493 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8e855c-339f-497a-999f-4c3ea56327cb" containerName="oc" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.492829 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="db8e855c-339f-497a-999f-4c3ea56327cb" containerName="oc" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.497469 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.510463 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rgt7l"] Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.515931 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-catalog-content\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.516128 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m7c9\" (UniqueName: \"kubernetes.io/projected/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-kube-api-access-4m7c9\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.516416 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-utilities\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.622979 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m7c9\" (UniqueName: \"kubernetes.io/projected/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-kube-api-access-4m7c9\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.623107 4917 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-utilities\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.623173 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-catalog-content\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.623810 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-catalog-content\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.624038 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-utilities\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.646562 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m7c9\" (UniqueName: \"kubernetes.io/projected/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-kube-api-access-4m7c9\") pod \"certified-operators-rgt7l\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:19 crc kubenswrapper[4917]: I0318 09:27:19.823171 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:20 crc kubenswrapper[4917]: I0318 09:27:20.352441 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rgt7l"] Mar 18 09:27:21 crc kubenswrapper[4917]: I0318 09:27:21.146495 4917 generic.go:334] "Generic (PLEG): container finished" podID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerID="8c95e7ca12c6237d0fffd67e0e2919cc2e1eb7d118f830e307b30380c7574c27" exitCode=0 Mar 18 09:27:21 crc kubenswrapper[4917]: I0318 09:27:21.146546 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgt7l" event={"ID":"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2","Type":"ContainerDied","Data":"8c95e7ca12c6237d0fffd67e0e2919cc2e1eb7d118f830e307b30380c7574c27"} Mar 18 09:27:21 crc kubenswrapper[4917]: I0318 09:27:21.146896 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgt7l" event={"ID":"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2","Type":"ContainerStarted","Data":"658579948d4d8b5d78b3ed67e853083ab284fbfc19a3c23c8dc1705b41dd48b0"} Mar 18 09:27:21 crc kubenswrapper[4917]: I0318 09:27:21.148366 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 18 09:27:22 crc kubenswrapper[4917]: I0318 09:27:22.158356 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgt7l" event={"ID":"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2","Type":"ContainerStarted","Data":"2a28ae9470976e7420f7711bff5c9f8ec075b111a6034e2c10db21887e4e6083"} Mar 18 09:27:23 crc kubenswrapper[4917]: I0318 09:27:23.170674 4917 generic.go:334] "Generic (PLEG): container finished" podID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerID="2a28ae9470976e7420f7711bff5c9f8ec075b111a6034e2c10db21887e4e6083" exitCode=0 Mar 18 09:27:23 crc kubenswrapper[4917]: I0318 09:27:23.170759 4917 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-rgt7l" event={"ID":"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2","Type":"ContainerDied","Data":"2a28ae9470976e7420f7711bff5c9f8ec075b111a6034e2c10db21887e4e6083"} Mar 18 09:27:24 crc kubenswrapper[4917]: I0318 09:27:24.184354 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgt7l" event={"ID":"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2","Type":"ContainerStarted","Data":"c117658138c85024024fd5377f28aa41fd6fba592edc2ab58ef51a58911b8eca"} Mar 18 09:27:24 crc kubenswrapper[4917]: I0318 09:27:24.209348 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rgt7l" podStartSLOduration=2.81705144 podStartE2EDuration="5.209326783s" podCreationTimestamp="2026-03-18 09:27:19 +0000 UTC" firstStartedPulling="2026-03-18 09:27:21.148096765 +0000 UTC m=+9626.089251479" lastFinishedPulling="2026-03-18 09:27:23.540372108 +0000 UTC m=+9628.481526822" observedRunningTime="2026-03-18 09:27:24.1984762 +0000 UTC m=+9629.139630924" watchObservedRunningTime="2026-03-18 09:27:24.209326783 +0000 UTC m=+9629.150481497" Mar 18 09:27:29 crc kubenswrapper[4917]: I0318 09:27:29.823675 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:29 crc kubenswrapper[4917]: I0318 09:27:29.824035 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:29 crc kubenswrapper[4917]: I0318 09:27:29.879907 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:30 crc kubenswrapper[4917]: I0318 09:27:30.317906 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:30 crc kubenswrapper[4917]: I0318 09:27:30.373169 
4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rgt7l"] Mar 18 09:27:32 crc kubenswrapper[4917]: I0318 09:27:32.264608 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rgt7l" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerName="registry-server" containerID="cri-o://c117658138c85024024fd5377f28aa41fd6fba592edc2ab58ef51a58911b8eca" gracePeriod=2 Mar 18 09:27:32 crc kubenswrapper[4917]: I0318 09:27:32.928854 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:27:32 crc kubenswrapper[4917]: I0318 09:27:32.929318 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.279664 4917 generic.go:334] "Generic (PLEG): container finished" podID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerID="c117658138c85024024fd5377f28aa41fd6fba592edc2ab58ef51a58911b8eca" exitCode=0 Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.279722 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgt7l" event={"ID":"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2","Type":"ContainerDied","Data":"c117658138c85024024fd5377f28aa41fd6fba592edc2ab58ef51a58911b8eca"} Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.774658 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.936754 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-catalog-content\") pod \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.937194 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m7c9\" (UniqueName: \"kubernetes.io/projected/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-kube-api-access-4m7c9\") pod \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.937342 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-utilities\") pod \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\" (UID: \"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2\") " Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.938089 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-utilities" (OuterVolumeSpecName: "utilities") pod "e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" (UID: "e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.940077 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.953137 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-kube-api-access-4m7c9" (OuterVolumeSpecName: "kube-api-access-4m7c9") pod "e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" (UID: "e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2"). InnerVolumeSpecName "kube-api-access-4m7c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:27:33 crc kubenswrapper[4917]: I0318 09:27:33.986298 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" (UID: "e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.041487 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m7c9\" (UniqueName: \"kubernetes.io/projected/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-kube-api-access-4m7c9\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.041521 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.289793 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rgt7l" event={"ID":"e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2","Type":"ContainerDied","Data":"658579948d4d8b5d78b3ed67e853083ab284fbfc19a3c23c8dc1705b41dd48b0"} Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.289866 4917 scope.go:117] "RemoveContainer" containerID="c117658138c85024024fd5377f28aa41fd6fba592edc2ab58ef51a58911b8eca" Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.289866 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rgt7l" Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.313236 4917 scope.go:117] "RemoveContainer" containerID="2a28ae9470976e7420f7711bff5c9f8ec075b111a6034e2c10db21887e4e6083" Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.338167 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rgt7l"] Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.354358 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rgt7l"] Mar 18 09:27:34 crc kubenswrapper[4917]: I0318 09:27:34.367296 4917 scope.go:117] "RemoveContainer" containerID="8c95e7ca12c6237d0fffd67e0e2919cc2e1eb7d118f830e307b30380c7574c27" Mar 18 09:27:35 crc kubenswrapper[4917]: I0318 09:27:35.792507 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" path="/var/lib/kubelet/pods/e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2/volumes" Mar 18 09:27:36 crc kubenswrapper[4917]: I0318 09:27:36.108128 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jrjf8_5beb9e93-3da2-4bc9-b40a-24406435d739/control-plane-machine-set-operator/0.log" Mar 18 09:27:36 crc kubenswrapper[4917]: I0318 09:27:36.228560 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h5mvm_9d08595e-c5f0-49cd-bc6c-5e248bcc76e7/kube-rbac-proxy/0.log" Mar 18 09:27:36 crc kubenswrapper[4917]: I0318 09:27:36.339464 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-h5mvm_9d08595e-c5f0-49cd-bc6c-5e248bcc76e7/machine-api-operator/0.log" Mar 18 09:27:50 crc kubenswrapper[4917]: I0318 09:27:50.213619 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-545d4d4674-hf4fm_24c2a3d3-a81f-4920-bef3-7def59acacec/cert-manager-controller/0.log" Mar 18 09:27:50 crc kubenswrapper[4917]: I0318 09:27:50.417268 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-wkfj5_1690baaf-577c-4ce7-afe8-480821bb1419/cert-manager-cainjector/0.log" Mar 18 09:27:50 crc kubenswrapper[4917]: I0318 09:27:50.425076 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-h8dkf_487f44b8-8709-4c21-a458-5bf381a76858/cert-manager-webhook/0.log" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.142665 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563768-sb4nb"] Mar 18 09:28:00 crc kubenswrapper[4917]: E0318 09:28:00.143789 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerName="registry-server" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.143810 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerName="registry-server" Mar 18 09:28:00 crc kubenswrapper[4917]: E0318 09:28:00.143847 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerName="extract-utilities" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.143856 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerName="extract-utilities" Mar 18 09:28:00 crc kubenswrapper[4917]: E0318 09:28:00.143884 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerName="extract-content" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.143893 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerName="extract-content" Mar 18 
09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.144144 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="e760af4f-2f20-4c4d-a57a-e4f70e3a6cb2" containerName="registry-server" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.145062 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-sb4nb" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.148301 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.148321 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.149210 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.162207 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-sb4nb"] Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.245939 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5tzw\" (UniqueName: \"kubernetes.io/projected/29f6a538-34da-4ac9-86bd-58f3eb8675a4-kube-api-access-x5tzw\") pod \"auto-csr-approver-29563768-sb4nb\" (UID: \"29f6a538-34da-4ac9-86bd-58f3eb8675a4\") " pod="openshift-infra/auto-csr-approver-29563768-sb4nb" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.347739 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5tzw\" (UniqueName: \"kubernetes.io/projected/29f6a538-34da-4ac9-86bd-58f3eb8675a4-kube-api-access-x5tzw\") pod \"auto-csr-approver-29563768-sb4nb\" (UID: \"29f6a538-34da-4ac9-86bd-58f3eb8675a4\") " pod="openshift-infra/auto-csr-approver-29563768-sb4nb" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 
09:28:00.531475 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5tzw\" (UniqueName: \"kubernetes.io/projected/29f6a538-34da-4ac9-86bd-58f3eb8675a4-kube-api-access-x5tzw\") pod \"auto-csr-approver-29563768-sb4nb\" (UID: \"29f6a538-34da-4ac9-86bd-58f3eb8675a4\") " pod="openshift-infra/auto-csr-approver-29563768-sb4nb" Mar 18 09:28:00 crc kubenswrapper[4917]: I0318 09:28:00.766162 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-sb4nb" Mar 18 09:28:01 crc kubenswrapper[4917]: I0318 09:28:01.259338 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-sb4nb"] Mar 18 09:28:01 crc kubenswrapper[4917]: I0318 09:28:01.588304 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-sb4nb" event={"ID":"29f6a538-34da-4ac9-86bd-58f3eb8675a4","Type":"ContainerStarted","Data":"35defafc4ad19463c88c705d07ed884dcb753b7fb0fa888f91eabed339c437ac"} Mar 18 09:28:02 crc kubenswrapper[4917]: I0318 09:28:02.929579 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:28:02 crc kubenswrapper[4917]: I0318 09:28:02.929915 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:28:03 crc kubenswrapper[4917]: I0318 09:28:03.539143 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-sn24n_c796ff9e-9db2-437c-92f6-fecb8ff9df67/nmstate-console-plugin/0.log" Mar 18 09:28:03 crc kubenswrapper[4917]: I0318 09:28:03.611630 4917 generic.go:334] "Generic (PLEG): container finished" podID="29f6a538-34da-4ac9-86bd-58f3eb8675a4" containerID="4c49d69369f1bf1f0501e8e84bc1bdbc0ec34214906ef64c871923e3463e0b98" exitCode=0 Mar 18 09:28:03 crc kubenswrapper[4917]: I0318 09:28:03.611674 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-sb4nb" event={"ID":"29f6a538-34da-4ac9-86bd-58f3eb8675a4","Type":"ContainerDied","Data":"4c49d69369f1bf1f0501e8e84bc1bdbc0ec34214906ef64c871923e3463e0b98"} Mar 18 09:28:03 crc kubenswrapper[4917]: I0318 09:28:03.721730 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-c9pr9_89bb20ed-e5eb-4dd2-8b45-95ca7bcffe97/nmstate-handler/0.log" Mar 18 09:28:03 crc kubenswrapper[4917]: I0318 09:28:03.818562 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qqdsm_d92256d3-2d78-4985-944e-273ad62ce8f2/kube-rbac-proxy/0.log" Mar 18 09:28:03 crc kubenswrapper[4917]: I0318 09:28:03.852595 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-qqdsm_d92256d3-2d78-4985-944e-273ad62ce8f2/nmstate-metrics/0.log" Mar 18 09:28:03 crc kubenswrapper[4917]: I0318 09:28:03.900628 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-4446x_5a0173d1-0d7d-4b19-b01f-b5f6d8a543eb/nmstate-operator/0.log" Mar 18 09:28:04 crc kubenswrapper[4917]: I0318 09:28:04.024006 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-ldchq_cf995f1c-4cea-4dd8-9283-161f3e9b3a4d/nmstate-webhook/0.log" Mar 18 09:28:04 crc kubenswrapper[4917]: I0318 09:28:04.970003 4917 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-sb4nb" Mar 18 09:28:05 crc kubenswrapper[4917]: I0318 09:28:05.069497 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5tzw\" (UniqueName: \"kubernetes.io/projected/29f6a538-34da-4ac9-86bd-58f3eb8675a4-kube-api-access-x5tzw\") pod \"29f6a538-34da-4ac9-86bd-58f3eb8675a4\" (UID: \"29f6a538-34da-4ac9-86bd-58f3eb8675a4\") " Mar 18 09:28:05 crc kubenswrapper[4917]: I0318 09:28:05.077818 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f6a538-34da-4ac9-86bd-58f3eb8675a4-kube-api-access-x5tzw" (OuterVolumeSpecName: "kube-api-access-x5tzw") pod "29f6a538-34da-4ac9-86bd-58f3eb8675a4" (UID: "29f6a538-34da-4ac9-86bd-58f3eb8675a4"). InnerVolumeSpecName "kube-api-access-x5tzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:28:05 crc kubenswrapper[4917]: I0318 09:28:05.172016 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5tzw\" (UniqueName: \"kubernetes.io/projected/29f6a538-34da-4ac9-86bd-58f3eb8675a4-kube-api-access-x5tzw\") on node \"crc\" DevicePath \"\"" Mar 18 09:28:05 crc kubenswrapper[4917]: I0318 09:28:05.633449 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563768-sb4nb" event={"ID":"29f6a538-34da-4ac9-86bd-58f3eb8675a4","Type":"ContainerDied","Data":"35defafc4ad19463c88c705d07ed884dcb753b7fb0fa888f91eabed339c437ac"} Mar 18 09:28:05 crc kubenswrapper[4917]: I0318 09:28:05.633911 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35defafc4ad19463c88c705d07ed884dcb753b7fb0fa888f91eabed339c437ac" Mar 18 09:28:05 crc kubenswrapper[4917]: I0318 09:28:05.633497 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563768-sb4nb" Mar 18 09:28:06 crc kubenswrapper[4917]: I0318 09:28:06.038326 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-vv2kl"] Mar 18 09:28:06 crc kubenswrapper[4917]: I0318 09:28:06.046890 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563762-vv2kl"] Mar 18 09:28:07 crc kubenswrapper[4917]: I0318 09:28:07.794609 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf12c2b3-f4a3-4a79-8859-33391ef42687" path="/var/lib/kubelet/pods/cf12c2b3-f4a3-4a79-8859-33391ef42687/volumes" Mar 18 09:28:18 crc kubenswrapper[4917]: I0318 09:28:18.138250 4917 scope.go:117] "RemoveContainer" containerID="abc2c3bb7af6cc6854439c4c1bf491eea93dc53531bd9c1ede42d263472373ae" Mar 18 09:28:19 crc kubenswrapper[4917]: I0318 09:28:19.378947 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-7rjvc_1ff89bec-5b99-41f8-8859-90b9fac30a81/prometheus-operator/0.log" Mar 18 09:28:19 crc kubenswrapper[4917]: I0318 09:28:19.625028 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk_0041ff4d-5053-427a-aaac-dadb33cd80c8/prometheus-operator-admission-webhook/0.log" Mar 18 09:28:19 crc kubenswrapper[4917]: I0318 09:28:19.626699 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb_a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6/prometheus-operator-admission-webhook/0.log" Mar 18 09:28:19 crc kubenswrapper[4917]: I0318 09:28:19.815506 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-xhwk7_e2c168dd-a6d0-40e6-832b-e5658df4e05d/operator/0.log" Mar 18 09:28:19 crc kubenswrapper[4917]: I0318 09:28:19.820673 4917 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-7479c95c5-vfhcm_dd85f6bd-3b28-4ae4-bc48-7733ccd8546e/perses-operator/0.log" Mar 18 09:28:32 crc kubenswrapper[4917]: I0318 09:28:32.929422 4917 patch_prober.go:28] interesting pod/machine-config-daemon-xp5xk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 18 09:28:32 crc kubenswrapper[4917]: I0318 09:28:32.929880 4917 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 18 09:28:32 crc kubenswrapper[4917]: I0318 09:28:32.929926 4917 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" Mar 18 09:28:32 crc kubenswrapper[4917]: I0318 09:28:32.930536 4917 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"} pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 18 09:28:32 crc kubenswrapper[4917]: I0318 09:28:32.930603 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerName="machine-config-daemon" containerID="cri-o://cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" gracePeriod=600 Mar 18 09:28:33 crc kubenswrapper[4917]: E0318 
09:28:33.052221 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:28:33 crc kubenswrapper[4917]: I0318 09:28:33.934073 4917 generic.go:334] "Generic (PLEG): container finished" podID="cc04c58e-83bd-4c0c-b58a-da7dea820272" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" exitCode=0 Mar 18 09:28:33 crc kubenswrapper[4917]: I0318 09:28:33.934134 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerDied","Data":"cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"} Mar 18 09:28:33 crc kubenswrapper[4917]: I0318 09:28:33.934627 4917 scope.go:117] "RemoveContainer" containerID="631374ea93931dfc55d694e6dd1da33a32d0d39afef7672b4338c0a7786c053e" Mar 18 09:28:33 crc kubenswrapper[4917]: I0318 09:28:33.935681 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:28:33 crc kubenswrapper[4917]: E0318 09:28:33.936348 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:28:36 crc kubenswrapper[4917]: I0318 09:28:36.177412 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zscqs_9500ccc2-23e4-4cad-97db-6a0fac025f5e/kube-rbac-proxy/0.log" Mar 18 09:28:36 crc kubenswrapper[4917]: I0318 09:28:36.476597 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-frr-files/0.log" Mar 18 09:28:36 crc kubenswrapper[4917]: I0318 09:28:36.640813 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-metrics/0.log" Mar 18 09:28:36 crc kubenswrapper[4917]: I0318 09:28:36.659970 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-zscqs_9500ccc2-23e4-4cad-97db-6a0fac025f5e/controller/0.log" Mar 18 09:28:36 crc kubenswrapper[4917]: I0318 09:28:36.676449 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-frr-files/0.log" Mar 18 09:28:36 crc kubenswrapper[4917]: I0318 09:28:36.683072 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-reloader/0.log" Mar 18 09:28:36 crc kubenswrapper[4917]: I0318 09:28:36.802196 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-reloader/0.log" Mar 18 09:28:36 crc kubenswrapper[4917]: I0318 09:28:36.983181 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-frr-files/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.012800 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-reloader/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.016425 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-metrics/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.048231 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-metrics/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.213079 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-reloader/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.216129 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-metrics/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.233997 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/cp-frr-files/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.240685 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/controller/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.370903 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/frr-metrics/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.423941 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/kube-rbac-proxy/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.461736 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/kube-rbac-proxy-frr/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.614555 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/reloader/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.706245 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-gc44k_ba0f2128-2563-4331-9c56-c7ea7a46b0d5/frr-k8s-webhook-server/0.log" Mar 18 09:28:37 crc kubenswrapper[4917]: I0318 09:28:37.892494 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-74bdc75999-hb7kz_ae747d59-ac7b-464b-9cb5-859d07e265a8/manager/0.log" Mar 18 09:28:38 crc kubenswrapper[4917]: I0318 09:28:38.114494 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-84fb45cdd8-gjn5b_b5699c2f-0727-4d0e-b131-6966e1bc8126/webhook-server/0.log" Mar 18 09:28:38 crc kubenswrapper[4917]: I0318 09:28:38.232226 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j9g22_bc1ceb3d-1108-434e-892d-e08b164c8937/kube-rbac-proxy/0.log" Mar 18 09:28:38 crc kubenswrapper[4917]: I0318 09:28:38.904750 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-j9g22_bc1ceb3d-1108-434e-892d-e08b164c8937/speaker/0.log" Mar 18 09:28:40 crc kubenswrapper[4917]: I0318 09:28:40.788930 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-24wx8_4f428b53-24d1-4ec3-8d38-271e43a0e71d/frr/0.log" Mar 18 09:28:48 crc kubenswrapper[4917]: I0318 09:28:48.773136 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:28:48 crc kubenswrapper[4917]: E0318 09:28:48.774139 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:28:55 crc kubenswrapper[4917]: I0318 09:28:55.263046 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm_04e01d3a-3bf7-4904-9bd7-131a3f9ca046/util/0.log" Mar 18 09:28:55 crc kubenswrapper[4917]: I0318 09:28:55.449714 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm_04e01d3a-3bf7-4904-9bd7-131a3f9ca046/pull/0.log" Mar 18 09:28:55 crc kubenswrapper[4917]: I0318 09:28:55.469404 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm_04e01d3a-3bf7-4904-9bd7-131a3f9ca046/util/0.log" Mar 18 09:28:55 crc kubenswrapper[4917]: I0318 09:28:55.503266 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm_04e01d3a-3bf7-4904-9bd7-131a3f9ca046/pull/0.log" Mar 18 09:28:55 crc kubenswrapper[4917]: I0318 09:28:55.648183 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm_04e01d3a-3bf7-4904-9bd7-131a3f9ca046/extract/0.log" Mar 18 09:28:55 crc kubenswrapper[4917]: I0318 09:28:55.659671 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm_04e01d3a-3bf7-4904-9bd7-131a3f9ca046/pull/0.log" Mar 18 09:28:55 crc kubenswrapper[4917]: I0318 09:28:55.691849 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874dhxlm_04e01d3a-3bf7-4904-9bd7-131a3f9ca046/util/0.log" Mar 18 09:28:56 crc kubenswrapper[4917]: I0318 09:28:56.193404 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2_d1e77201-ed7f-4151-bf4c-b81a3fbfda8b/util/0.log" Mar 18 09:28:56 crc kubenswrapper[4917]: I0318 09:28:56.822981 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2_d1e77201-ed7f-4151-bf4c-b81a3fbfda8b/util/0.log" Mar 18 09:28:56 crc kubenswrapper[4917]: I0318 09:28:56.968618 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2_d1e77201-ed7f-4151-bf4c-b81a3fbfda8b/pull/0.log" Mar 18 09:28:56 crc kubenswrapper[4917]: I0318 09:28:56.974216 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2_d1e77201-ed7f-4151-bf4c-b81a3fbfda8b/pull/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.185455 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2_d1e77201-ed7f-4151-bf4c-b81a3fbfda8b/util/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.222316 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2_d1e77201-ed7f-4151-bf4c-b81a3fbfda8b/pull/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.266321 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xndh2_d1e77201-ed7f-4151-bf4c-b81a3fbfda8b/extract/0.log" Mar 18 
09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.390209 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp_8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a/util/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.584718 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp_8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a/util/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.674234 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp_8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a/pull/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.724121 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp_8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a/pull/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.838175 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp_8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a/util/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.897410 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp_8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a/pull/0.log" Mar 18 09:28:57 crc kubenswrapper[4917]: I0318 09:28:57.946757 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5xnvkp_8f91b9a4-bcdc-4f7f-a5e9-b6a3706f0a0a/extract/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.043119 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79_01b048b4-f839-47f4-b90c-ed166746415a/util/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.250613 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79_01b048b4-f839-47f4-b90c-ed166746415a/pull/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.270987 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79_01b048b4-f839-47f4-b90c-ed166746415a/util/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.271832 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79_01b048b4-f839-47f4-b90c-ed166746415a/pull/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.420917 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79_01b048b4-f839-47f4-b90c-ed166746415a/util/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.460121 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79_01b048b4-f839-47f4-b90c-ed166746415a/extract/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.486157 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726g8r79_01b048b4-f839-47f4-b90c-ed166746415a/pull/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.615360 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nmvwc_7685910a-28c7-4d3a-83c4-22be2b2a3bf8/extract-utilities/0.log" Mar 18 09:28:58 crc 
kubenswrapper[4917]: I0318 09:28:58.818467 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nmvwc_7685910a-28c7-4d3a-83c4-22be2b2a3bf8/extract-utilities/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.824739 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nmvwc_7685910a-28c7-4d3a-83c4-22be2b2a3bf8/extract-content/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.845444 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nmvwc_7685910a-28c7-4d3a-83c4-22be2b2a3bf8/extract-content/0.log" Mar 18 09:28:58 crc kubenswrapper[4917]: I0318 09:28:58.980666 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nmvwc_7685910a-28c7-4d3a-83c4-22be2b2a3bf8/extract-utilities/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.019140 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nmvwc_7685910a-28c7-4d3a-83c4-22be2b2a3bf8/extract-content/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.092558 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrnfk_11b8a666-98f1-4d91-aae4-259d185a772a/extract-utilities/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.312728 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrnfk_11b8a666-98f1-4d91-aae4-259d185a772a/extract-content/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.356563 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrnfk_11b8a666-98f1-4d91-aae4-259d185a772a/extract-utilities/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.410855 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-lrnfk_11b8a666-98f1-4d91-aae4-259d185a772a/extract-content/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.603066 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrnfk_11b8a666-98f1-4d91-aae4-259d185a772a/extract-content/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.622037 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrnfk_11b8a666-98f1-4d91-aae4-259d185a772a/extract-utilities/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.824690 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nmvwc_7685910a-28c7-4d3a-83c4-22be2b2a3bf8/registry-server/0.log" Mar 18 09:28:59 crc kubenswrapper[4917]: I0318 09:28:59.856043 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rflln_eddc4b08-6464-49cf-9b06-a482cbcbef5d/marketplace-operator/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.026980 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vhrw_9e197559-b623-42e3-bf29-dcf0c20a779d/extract-utilities/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.099397 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vhrw_9e197559-b623-42e3-bf29-dcf0c20a779d/extract-utilities/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.154750 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vhrw_9e197559-b623-42e3-bf29-dcf0c20a779d/extract-content/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.195299 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vhrw_9e197559-b623-42e3-bf29-dcf0c20a779d/extract-content/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.360985 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vhrw_9e197559-b623-42e3-bf29-dcf0c20a779d/extract-utilities/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.409564 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vhrw_9e197559-b623-42e3-bf29-dcf0c20a779d/extract-content/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.539507 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-lrnfk_11b8a666-98f1-4d91-aae4-259d185a772a/registry-server/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.741694 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9vhrw_9e197559-b623-42e3-bf29-dcf0c20a779d/registry-server/0.log" Mar 18 09:29:00 crc kubenswrapper[4917]: I0318 09:29:00.844545 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6p978_dcba268d-27fe-4d66-863a-b94762a886b4/extract-utilities/0.log" Mar 18 09:29:01 crc kubenswrapper[4917]: I0318 09:29:01.010750 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6p978_dcba268d-27fe-4d66-863a-b94762a886b4/extract-utilities/0.log" Mar 18 09:29:01 crc kubenswrapper[4917]: I0318 09:29:01.029358 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6p978_dcba268d-27fe-4d66-863a-b94762a886b4/extract-content/0.log" Mar 18 09:29:01 crc kubenswrapper[4917]: I0318 09:29:01.034561 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6p978_dcba268d-27fe-4d66-863a-b94762a886b4/extract-content/0.log" 
Mar 18 09:29:01 crc kubenswrapper[4917]: I0318 09:29:01.156472 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6p978_dcba268d-27fe-4d66-863a-b94762a886b4/extract-utilities/0.log" Mar 18 09:29:01 crc kubenswrapper[4917]: I0318 09:29:01.191001 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6p978_dcba268d-27fe-4d66-863a-b94762a886b4/extract-content/0.log" Mar 18 09:29:02 crc kubenswrapper[4917]: I0318 09:29:02.376773 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6p978_dcba268d-27fe-4d66-863a-b94762a886b4/registry-server/0.log" Mar 18 09:29:02 crc kubenswrapper[4917]: I0318 09:29:02.774277 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:29:02 crc kubenswrapper[4917]: E0318 09:29:02.774490 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:29:13 crc kubenswrapper[4917]: I0318 09:29:13.774360 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:29:13 crc kubenswrapper[4917]: E0318 09:29:13.775267 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:29:17 crc kubenswrapper[4917]: I0318 09:29:17.137317 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58d8fc49c4-fkgsk_0041ff4d-5053-427a-aaac-dadb33cd80c8/prometheus-operator-admission-webhook/0.log" Mar 18 09:29:17 crc kubenswrapper[4917]: I0318 09:29:17.151755 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-7rjvc_1ff89bec-5b99-41f8-8859-90b9fac30a81/prometheus-operator/0.log" Mar 18 09:29:17 crc kubenswrapper[4917]: I0318 09:29:17.216678 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-58d8fc49c4-nnzjb_a3476bdf-7c3c-43d4-acb4-9fb6ca9d51d6/prometheus-operator-admission-webhook/0.log" Mar 18 09:29:17 crc kubenswrapper[4917]: I0318 09:29:17.360300 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-xhwk7_e2c168dd-a6d0-40e6-832b-e5658df4e05d/operator/0.log" Mar 18 09:29:17 crc kubenswrapper[4917]: I0318 09:29:17.386723 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-7479c95c5-vfhcm_dd85f6bd-3b28-4ae4-bc48-7733ccd8546e/perses-operator/0.log" Mar 18 09:29:26 crc kubenswrapper[4917]: I0318 09:29:26.772816 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:29:26 crc kubenswrapper[4917]: E0318 09:29:26.773730 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" 
podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:29:40 crc kubenswrapper[4917]: I0318 09:29:40.772553 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:29:40 crc kubenswrapper[4917]: E0318 09:29:40.773271 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:29:53 crc kubenswrapper[4917]: I0318 09:29:53.776077 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:29:53 crc kubenswrapper[4917]: E0318 09:29:53.777570 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.750533 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hbgbb"] Mar 18 09:29:58 crc kubenswrapper[4917]: E0318 09:29:58.751856 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29f6a538-34da-4ac9-86bd-58f3eb8675a4" containerName="oc" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.751880 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f6a538-34da-4ac9-86bd-58f3eb8675a4" containerName="oc" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.752223 4917 
memory_manager.go:354] "RemoveStaleState removing state" podUID="29f6a538-34da-4ac9-86bd-58f3eb8675a4" containerName="oc" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.754456 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.766925 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbgbb"] Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.834761 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-catalog-content\") pod \"redhat-marketplace-hbgbb\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.835423 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-utilities\") pod \"redhat-marketplace-hbgbb\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.835706 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6n2q\" (UniqueName: \"kubernetes.io/projected/3ae9e5ba-61a3-4b07-9c44-7493152530e3-kube-api-access-q6n2q\") pod \"redhat-marketplace-hbgbb\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.937577 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-utilities\") pod 
\"redhat-marketplace-hbgbb\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.937729 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6n2q\" (UniqueName: \"kubernetes.io/projected/3ae9e5ba-61a3-4b07-9c44-7493152530e3-kube-api-access-q6n2q\") pod \"redhat-marketplace-hbgbb\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.937825 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-catalog-content\") pod \"redhat-marketplace-hbgbb\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.938907 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-catalog-content\") pod \"redhat-marketplace-hbgbb\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.939441 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-utilities\") pod \"redhat-marketplace-hbgbb\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:58 crc kubenswrapper[4917]: I0318 09:29:58.985856 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6n2q\" (UniqueName: \"kubernetes.io/projected/3ae9e5ba-61a3-4b07-9c44-7493152530e3-kube-api-access-q6n2q\") pod \"redhat-marketplace-hbgbb\" (UID: 
\"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:59 crc kubenswrapper[4917]: I0318 09:29:59.087431 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:29:59 crc kubenswrapper[4917]: I0318 09:29:59.428026 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbgbb"] Mar 18 09:29:59 crc kubenswrapper[4917]: I0318 09:29:59.867627 4917 generic.go:334] "Generic (PLEG): container finished" podID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerID="e265f19717943274dfc97b567124f4e2ca0906fbb6721f720ff5a21bca99b626" exitCode=0 Mar 18 09:29:59 crc kubenswrapper[4917]: I0318 09:29:59.867673 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbgbb" event={"ID":"3ae9e5ba-61a3-4b07-9c44-7493152530e3","Type":"ContainerDied","Data":"e265f19717943274dfc97b567124f4e2ca0906fbb6721f720ff5a21bca99b626"} Mar 18 09:29:59 crc kubenswrapper[4917]: I0318 09:29:59.867697 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbgbb" event={"ID":"3ae9e5ba-61a3-4b07-9c44-7493152530e3","Type":"ContainerStarted","Data":"0abfce0e22a9a074b03c555720d1f48c205f449b59b66771d2cde6149c599235"} Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.141180 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563770-xdmwt"] Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.143190 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-xdmwt" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.145528 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.145972 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.145991 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.149895 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2"] Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.151597 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.161361 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.162380 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.185683 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-xdmwt"] Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.208313 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2"] Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.268822 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w4ft\" (UniqueName: 
\"kubernetes.io/projected/1824c38d-7556-43e7-9dff-49a0757d8f8e-kube-api-access-9w4ft\") pod \"auto-csr-approver-29563770-xdmwt\" (UID: \"1824c38d-7556-43e7-9dff-49a0757d8f8e\") " pod="openshift-infra/auto-csr-approver-29563770-xdmwt" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.268902 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-config-volume\") pod \"collect-profiles-29563770-5j5v2\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.269061 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-secret-volume\") pod \"collect-profiles-29563770-5j5v2\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.269085 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl55z\" (UniqueName: \"kubernetes.io/projected/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-kube-api-access-sl55z\") pod \"collect-profiles-29563770-5j5v2\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.370978 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-secret-volume\") pod \"collect-profiles-29563770-5j5v2\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 
crc kubenswrapper[4917]: I0318 09:30:00.371015 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl55z\" (UniqueName: \"kubernetes.io/projected/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-kube-api-access-sl55z\") pod \"collect-profiles-29563770-5j5v2\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.371103 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w4ft\" (UniqueName: \"kubernetes.io/projected/1824c38d-7556-43e7-9dff-49a0757d8f8e-kube-api-access-9w4ft\") pod \"auto-csr-approver-29563770-xdmwt\" (UID: \"1824c38d-7556-43e7-9dff-49a0757d8f8e\") " pod="openshift-infra/auto-csr-approver-29563770-xdmwt" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.371136 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-config-volume\") pod \"collect-profiles-29563770-5j5v2\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.373090 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-config-volume\") pod \"collect-profiles-29563770-5j5v2\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.632543 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-secret-volume\") pod \"collect-profiles-29563770-5j5v2\" (UID: 
\"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.634657 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w4ft\" (UniqueName: \"kubernetes.io/projected/1824c38d-7556-43e7-9dff-49a0757d8f8e-kube-api-access-9w4ft\") pod \"auto-csr-approver-29563770-xdmwt\" (UID: \"1824c38d-7556-43e7-9dff-49a0757d8f8e\") " pod="openshift-infra/auto-csr-approver-29563770-xdmwt" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.641107 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl55z\" (UniqueName: \"kubernetes.io/projected/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-kube-api-access-sl55z\") pod \"collect-profiles-29563770-5j5v2\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.766464 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-xdmwt" Mar 18 09:30:00 crc kubenswrapper[4917]: I0318 09:30:00.793287 4917 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:01 crc kubenswrapper[4917]: I0318 09:30:01.266874 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563770-xdmwt"] Mar 18 09:30:01 crc kubenswrapper[4917]: I0318 09:30:01.341250 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2"] Mar 18 09:30:01 crc kubenswrapper[4917]: W0318 09:30:01.343537 4917 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b4eb574_94a5_46a8_b9c2_fd7bae63694d.slice/crio-b57e538372bf213b18015fb17fee78b74ec42eb7f823dcaf509f30140b108031 WatchSource:0}: Error finding container b57e538372bf213b18015fb17fee78b74ec42eb7f823dcaf509f30140b108031: Status 404 returned error can't find the container with id b57e538372bf213b18015fb17fee78b74ec42eb7f823dcaf509f30140b108031 Mar 18 09:30:01 crc kubenswrapper[4917]: I0318 09:30:01.899114 4917 generic.go:334] "Generic (PLEG): container finished" podID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerID="accaaa22058310dbc717b6f59f31c8474253a5eb29ddf6c34d41350eb05f8ac8" exitCode=0 Mar 18 09:30:01 crc kubenswrapper[4917]: I0318 09:30:01.899176 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbgbb" event={"ID":"3ae9e5ba-61a3-4b07-9c44-7493152530e3","Type":"ContainerDied","Data":"accaaa22058310dbc717b6f59f31c8474253a5eb29ddf6c34d41350eb05f8ac8"} Mar 18 09:30:01 crc kubenswrapper[4917]: I0318 09:30:01.901515 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-xdmwt" event={"ID":"1824c38d-7556-43e7-9dff-49a0757d8f8e","Type":"ContainerStarted","Data":"6d275f48700519c296d6d53f2171d3119728cab4bb16e968707cbe332322afb1"} Mar 18 09:30:01 crc kubenswrapper[4917]: I0318 09:30:01.905949 4917 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" event={"ID":"3b4eb574-94a5-46a8-b9c2-fd7bae63694d","Type":"ContainerStarted","Data":"a0b768043d1087e6b0085f678a840ecf9a13a8b63e15b1057ee0e309734ee245"} Mar 18 09:30:01 crc kubenswrapper[4917]: I0318 09:30:01.906005 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" event={"ID":"3b4eb574-94a5-46a8-b9c2-fd7bae63694d","Type":"ContainerStarted","Data":"b57e538372bf213b18015fb17fee78b74ec42eb7f823dcaf509f30140b108031"} Mar 18 09:30:01 crc kubenswrapper[4917]: I0318 09:30:01.959165 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" podStartSLOduration=1.959145731 podStartE2EDuration="1.959145731s" podCreationTimestamp="2026-03-18 09:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-18 09:30:01.951119476 +0000 UTC m=+9786.892274190" watchObservedRunningTime="2026-03-18 09:30:01.959145731 +0000 UTC m=+9786.900300445" Mar 18 09:30:02 crc kubenswrapper[4917]: I0318 09:30:02.933430 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbgbb" event={"ID":"3ae9e5ba-61a3-4b07-9c44-7493152530e3","Type":"ContainerStarted","Data":"a61b9a2804ef683a82da76f2a97dde4a5ce6744b4c723caa4b160cd9f304e4da"} Mar 18 09:30:02 crc kubenswrapper[4917]: I0318 09:30:02.936975 4917 generic.go:334] "Generic (PLEG): container finished" podID="3b4eb574-94a5-46a8-b9c2-fd7bae63694d" containerID="a0b768043d1087e6b0085f678a840ecf9a13a8b63e15b1057ee0e309734ee245" exitCode=0 Mar 18 09:30:02 crc kubenswrapper[4917]: I0318 09:30:02.937019 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" 
event={"ID":"3b4eb574-94a5-46a8-b9c2-fd7bae63694d","Type":"ContainerDied","Data":"a0b768043d1087e6b0085f678a840ecf9a13a8b63e15b1057ee0e309734ee245"} Mar 18 09:30:02 crc kubenswrapper[4917]: I0318 09:30:02.952414 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hbgbb" podStartSLOduration=2.304309062 podStartE2EDuration="4.952393652s" podCreationTimestamp="2026-03-18 09:29:58 +0000 UTC" firstStartedPulling="2026-03-18 09:29:59.869326565 +0000 UTC m=+9784.810481279" lastFinishedPulling="2026-03-18 09:30:02.517411115 +0000 UTC m=+9787.458565869" observedRunningTime="2026-03-18 09:30:02.94984301 +0000 UTC m=+9787.890997724" watchObservedRunningTime="2026-03-18 09:30:02.952393652 +0000 UTC m=+9787.893548386" Mar 18 09:30:03 crc kubenswrapper[4917]: I0318 09:30:03.976546 4917 generic.go:334] "Generic (PLEG): container finished" podID="1824c38d-7556-43e7-9dff-49a0757d8f8e" containerID="7a9d79c991a6b55f57b22061cb43280db11378b7e6ffcd61b90a752a53fb7565" exitCode=0 Mar 18 09:30:03 crc kubenswrapper[4917]: I0318 09:30:03.976785 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-xdmwt" event={"ID":"1824c38d-7556-43e7-9dff-49a0757d8f8e","Type":"ContainerDied","Data":"7a9d79c991a6b55f57b22061cb43280db11378b7e6ffcd61b90a752a53fb7565"} Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.390349 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.468035 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-config-volume\") pod \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.468116 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-secret-volume\") pod \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.468170 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sl55z\" (UniqueName: \"kubernetes.io/projected/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-kube-api-access-sl55z\") pod \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\" (UID: \"3b4eb574-94a5-46a8-b9c2-fd7bae63694d\") " Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.468898 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-config-volume" (OuterVolumeSpecName: "config-volume") pod "3b4eb574-94a5-46a8-b9c2-fd7bae63694d" (UID: "3b4eb574-94a5-46a8-b9c2-fd7bae63694d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.469202 4917 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.479912 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3b4eb574-94a5-46a8-b9c2-fd7bae63694d" (UID: "3b4eb574-94a5-46a8-b9c2-fd7bae63694d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.479927 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-kube-api-access-sl55z" (OuterVolumeSpecName: "kube-api-access-sl55z") pod "3b4eb574-94a5-46a8-b9c2-fd7bae63694d" (UID: "3b4eb574-94a5-46a8-b9c2-fd7bae63694d"). InnerVolumeSpecName "kube-api-access-sl55z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.571433 4917 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.571492 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sl55z\" (UniqueName: \"kubernetes.io/projected/3b4eb574-94a5-46a8-b9c2-fd7bae63694d-kube-api-access-sl55z\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.772474 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:30:04 crc kubenswrapper[4917]: E0318 09:30:04.772930 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.987247 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" event={"ID":"3b4eb574-94a5-46a8-b9c2-fd7bae63694d","Type":"ContainerDied","Data":"b57e538372bf213b18015fb17fee78b74ec42eb7f823dcaf509f30140b108031"} Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.987312 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b57e538372bf213b18015fb17fee78b74ec42eb7f823dcaf509f30140b108031" Mar 18 09:30:04 crc kubenswrapper[4917]: I0318 09:30:04.987314 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29563770-5j5v2" Mar 18 09:30:05 crc kubenswrapper[4917]: I0318 09:30:05.276822 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-xdmwt" Mar 18 09:30:05 crc kubenswrapper[4917]: I0318 09:30:05.384314 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w4ft\" (UniqueName: \"kubernetes.io/projected/1824c38d-7556-43e7-9dff-49a0757d8f8e-kube-api-access-9w4ft\") pod \"1824c38d-7556-43e7-9dff-49a0757d8f8e\" (UID: \"1824c38d-7556-43e7-9dff-49a0757d8f8e\") " Mar 18 09:30:05 crc kubenswrapper[4917]: I0318 09:30:05.392150 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1824c38d-7556-43e7-9dff-49a0757d8f8e-kube-api-access-9w4ft" (OuterVolumeSpecName: "kube-api-access-9w4ft") pod "1824c38d-7556-43e7-9dff-49a0757d8f8e" (UID: "1824c38d-7556-43e7-9dff-49a0757d8f8e"). InnerVolumeSpecName "kube-api-access-9w4ft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:30:05 crc kubenswrapper[4917]: I0318 09:30:05.460324 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb"] Mar 18 09:30:05 crc kubenswrapper[4917]: I0318 09:30:05.468989 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29563725-dprsb"] Mar 18 09:30:05 crc kubenswrapper[4917]: I0318 09:30:05.487108 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w4ft\" (UniqueName: \"kubernetes.io/projected/1824c38d-7556-43e7-9dff-49a0757d8f8e-kube-api-access-9w4ft\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:05 crc kubenswrapper[4917]: I0318 09:30:05.793304 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd3de79-3341-4256-bc5f-ca8adb6fce8e" path="/var/lib/kubelet/pods/dcd3de79-3341-4256-bc5f-ca8adb6fce8e/volumes" Mar 18 09:30:06 crc kubenswrapper[4917]: I0318 09:30:06.000717 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563770-xdmwt" event={"ID":"1824c38d-7556-43e7-9dff-49a0757d8f8e","Type":"ContainerDied","Data":"6d275f48700519c296d6d53f2171d3119728cab4bb16e968707cbe332322afb1"} Mar 18 09:30:06 crc kubenswrapper[4917]: I0318 09:30:06.000773 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29563770-xdmwt" Mar 18 09:30:06 crc kubenswrapper[4917]: I0318 09:30:06.000784 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d275f48700519c296d6d53f2171d3119728cab4bb16e968707cbe332322afb1" Mar 18 09:30:06 crc kubenswrapper[4917]: I0318 09:30:06.353394 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-lpjt2"] Mar 18 09:30:06 crc kubenswrapper[4917]: I0318 09:30:06.366698 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563764-lpjt2"] Mar 18 09:30:07 crc kubenswrapper[4917]: I0318 09:30:07.787851 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a51b0918-fa96-4784-b6dc-be09150be166" path="/var/lib/kubelet/pods/a51b0918-fa96-4784-b6dc-be09150be166/volumes" Mar 18 09:30:09 crc kubenswrapper[4917]: I0318 09:30:09.088624 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:30:09 crc kubenswrapper[4917]: I0318 09:30:09.089112 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:30:09 crc kubenswrapper[4917]: I0318 09:30:09.180430 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:30:10 crc kubenswrapper[4917]: I0318 09:30:10.140443 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:30:10 crc kubenswrapper[4917]: I0318 09:30:10.216570 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbgbb"] Mar 18 09:30:12 crc kubenswrapper[4917]: I0318 09:30:12.079735 4917 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-hbgbb" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerName="registry-server" containerID="cri-o://a61b9a2804ef683a82da76f2a97dde4a5ce6744b4c723caa4b160cd9f304e4da" gracePeriod=2 Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.091058 4917 generic.go:334] "Generic (PLEG): container finished" podID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerID="a61b9a2804ef683a82da76f2a97dde4a5ce6744b4c723caa4b160cd9f304e4da" exitCode=0 Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.091445 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbgbb" event={"ID":"3ae9e5ba-61a3-4b07-9c44-7493152530e3","Type":"ContainerDied","Data":"a61b9a2804ef683a82da76f2a97dde4a5ce6744b4c723caa4b160cd9f304e4da"} Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.091481 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hbgbb" event={"ID":"3ae9e5ba-61a3-4b07-9c44-7493152530e3","Type":"ContainerDied","Data":"0abfce0e22a9a074b03c555720d1f48c205f449b59b66771d2cde6149c599235"} Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.091496 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0abfce0e22a9a074b03c555720d1f48c205f449b59b66771d2cde6149c599235" Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.614330 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.787668 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-catalog-content\") pod \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.787767 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6n2q\" (UniqueName: \"kubernetes.io/projected/3ae9e5ba-61a3-4b07-9c44-7493152530e3-kube-api-access-q6n2q\") pod \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.787831 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-utilities\") pod \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\" (UID: \"3ae9e5ba-61a3-4b07-9c44-7493152530e3\") " Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.790443 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-utilities" (OuterVolumeSpecName: "utilities") pod "3ae9e5ba-61a3-4b07-9c44-7493152530e3" (UID: "3ae9e5ba-61a3-4b07-9c44-7493152530e3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.797522 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae9e5ba-61a3-4b07-9c44-7493152530e3-kube-api-access-q6n2q" (OuterVolumeSpecName: "kube-api-access-q6n2q") pod "3ae9e5ba-61a3-4b07-9c44-7493152530e3" (UID: "3ae9e5ba-61a3-4b07-9c44-7493152530e3"). InnerVolumeSpecName "kube-api-access-q6n2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.829196 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ae9e5ba-61a3-4b07-9c44-7493152530e3" (UID: "3ae9e5ba-61a3-4b07-9c44-7493152530e3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.897726 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6n2q\" (UniqueName: \"kubernetes.io/projected/3ae9e5ba-61a3-4b07-9c44-7493152530e3-kube-api-access-q6n2q\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.897776 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:13 crc kubenswrapper[4917]: I0318 09:30:13.897789 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae9e5ba-61a3-4b07-9c44-7493152530e3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:14 crc kubenswrapper[4917]: I0318 09:30:14.099232 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hbgbb" Mar 18 09:30:14 crc kubenswrapper[4917]: I0318 09:30:14.156480 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbgbb"] Mar 18 09:30:14 crc kubenswrapper[4917]: I0318 09:30:14.171174 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hbgbb"] Mar 18 09:30:15 crc kubenswrapper[4917]: I0318 09:30:15.813010 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" path="/var/lib/kubelet/pods/3ae9e5ba-61a3-4b07-9c44-7493152530e3/volumes" Mar 18 09:30:17 crc kubenswrapper[4917]: I0318 09:30:17.773254 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:30:17 crc kubenswrapper[4917]: E0318 09:30:17.773878 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:30:18 crc kubenswrapper[4917]: I0318 09:30:18.248037 4917 scope.go:117] "RemoveContainer" containerID="5737c8f5cb3c5cff4cfc88731716a806c078fb16466e2cb37711448304edf5b7" Mar 18 09:30:18 crc kubenswrapper[4917]: I0318 09:30:18.305554 4917 scope.go:117] "RemoveContainer" containerID="94ce77aef30befa2bb5b0cc4e4f16eac37ffdd7e3d1243bbc59fab16cb64b1bc" Mar 18 09:30:18 crc kubenswrapper[4917]: I0318 09:30:18.382116 4917 scope.go:117] "RemoveContainer" containerID="a6803bcfa716f0dbdf37355179c5f631d9882ca46fe336417cab3995801de7c7" Mar 18 09:30:28 crc kubenswrapper[4917]: I0318 09:30:28.773192 4917 scope.go:117] "RemoveContainer" 
containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:30:28 crc kubenswrapper[4917]: E0318 09:30:28.774118 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.775316 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:30:40 crc kubenswrapper[4917]: E0318 09:30:40.776989 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.833499 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wzv2n"] Mar 18 09:30:40 crc kubenswrapper[4917]: E0318 09:30:40.834218 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1824c38d-7556-43e7-9dff-49a0757d8f8e" containerName="oc" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.834251 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="1824c38d-7556-43e7-9dff-49a0757d8f8e" containerName="oc" Mar 18 09:30:40 crc kubenswrapper[4917]: E0318 09:30:40.834298 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerName="extract-utilities" Mar 18 
09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.834312 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerName="extract-utilities" Mar 18 09:30:40 crc kubenswrapper[4917]: E0318 09:30:40.834336 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerName="extract-content" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.834348 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerName="extract-content" Mar 18 09:30:40 crc kubenswrapper[4917]: E0318 09:30:40.834368 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerName="registry-server" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.834380 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerName="registry-server" Mar 18 09:30:40 crc kubenswrapper[4917]: E0318 09:30:40.834443 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b4eb574-94a5-46a8-b9c2-fd7bae63694d" containerName="collect-profiles" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.834457 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b4eb574-94a5-46a8-b9c2-fd7bae63694d" containerName="collect-profiles" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.834971 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae9e5ba-61a3-4b07-9c44-7493152530e3" containerName="registry-server" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.835029 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b4eb574-94a5-46a8-b9c2-fd7bae63694d" containerName="collect-profiles" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.835057 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="1824c38d-7556-43e7-9dff-49a0757d8f8e" containerName="oc" Mar 18 09:30:40 crc 
kubenswrapper[4917]: I0318 09:30:40.838479 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:40 crc kubenswrapper[4917]: I0318 09:30:40.848490 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzv2n"] Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.030253 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fndbm\" (UniqueName: \"kubernetes.io/projected/78b14379-e44d-4384-9874-95baa7fdbda3-kube-api-access-fndbm\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.030334 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-utilities\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.030567 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-catalog-content\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.132832 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fndbm\" (UniqueName: \"kubernetes.io/projected/78b14379-e44d-4384-9874-95baa7fdbda3-kube-api-access-fndbm\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " 
pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.132894 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-utilities\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.132946 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-catalog-content\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.133402 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-catalog-content\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.133548 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-utilities\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.149868 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fndbm\" (UniqueName: \"kubernetes.io/projected/78b14379-e44d-4384-9874-95baa7fdbda3-kube-api-access-fndbm\") pod \"community-operators-wzv2n\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " 
pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.247245 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:41 crc kubenswrapper[4917]: I0318 09:30:41.794562 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzv2n"] Mar 18 09:30:43 crc kubenswrapper[4917]: I0318 09:30:43.504560 4917 generic.go:334] "Generic (PLEG): container finished" podID="78b14379-e44d-4384-9874-95baa7fdbda3" containerID="163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792" exitCode=0 Mar 18 09:30:43 crc kubenswrapper[4917]: I0318 09:30:43.504724 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzv2n" event={"ID":"78b14379-e44d-4384-9874-95baa7fdbda3","Type":"ContainerDied","Data":"163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792"} Mar 18 09:30:43 crc kubenswrapper[4917]: I0318 09:30:43.505086 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzv2n" event={"ID":"78b14379-e44d-4384-9874-95baa7fdbda3","Type":"ContainerStarted","Data":"f80e041b204a962e4b1711007b8ec3c861f7af5327a2637211dbdf4859a62651"} Mar 18 09:30:45 crc kubenswrapper[4917]: I0318 09:30:45.533651 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzv2n" event={"ID":"78b14379-e44d-4384-9874-95baa7fdbda3","Type":"ContainerStarted","Data":"533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0"} Mar 18 09:30:46 crc kubenswrapper[4917]: I0318 09:30:46.563102 4917 generic.go:334] "Generic (PLEG): container finished" podID="78b14379-e44d-4384-9874-95baa7fdbda3" containerID="533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0" exitCode=0 Mar 18 09:30:46 crc kubenswrapper[4917]: I0318 09:30:46.563321 4917 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-wzv2n" event={"ID":"78b14379-e44d-4384-9874-95baa7fdbda3","Type":"ContainerDied","Data":"533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0"} Mar 18 09:30:47 crc kubenswrapper[4917]: I0318 09:30:47.578668 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzv2n" event={"ID":"78b14379-e44d-4384-9874-95baa7fdbda3","Type":"ContainerStarted","Data":"94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034"} Mar 18 09:30:47 crc kubenswrapper[4917]: I0318 09:30:47.612513 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wzv2n" podStartSLOduration=3.922767459 podStartE2EDuration="7.612485672s" podCreationTimestamp="2026-03-18 09:30:40 +0000 UTC" firstStartedPulling="2026-03-18 09:30:43.506649158 +0000 UTC m=+9828.447803892" lastFinishedPulling="2026-03-18 09:30:47.196367351 +0000 UTC m=+9832.137522105" observedRunningTime="2026-03-18 09:30:47.600134322 +0000 UTC m=+9832.541289076" watchObservedRunningTime="2026-03-18 09:30:47.612485672 +0000 UTC m=+9832.553640416" Mar 18 09:30:51 crc kubenswrapper[4917]: I0318 09:30:51.247521 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:51 crc kubenswrapper[4917]: I0318 09:30:51.249650 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:51 crc kubenswrapper[4917]: I0318 09:30:51.322945 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:52 crc kubenswrapper[4917]: I0318 09:30:52.712961 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:53 crc kubenswrapper[4917]: I0318 
09:30:53.597975 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzv2n"] Mar 18 09:30:54 crc kubenswrapper[4917]: I0318 09:30:54.679981 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wzv2n" podUID="78b14379-e44d-4384-9874-95baa7fdbda3" containerName="registry-server" containerID="cri-o://94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034" gracePeriod=2 Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.223802 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.383606 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-catalog-content\") pod \"78b14379-e44d-4384-9874-95baa7fdbda3\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.383839 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-utilities\") pod \"78b14379-e44d-4384-9874-95baa7fdbda3\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.383934 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fndbm\" (UniqueName: \"kubernetes.io/projected/78b14379-e44d-4384-9874-95baa7fdbda3-kube-api-access-fndbm\") pod \"78b14379-e44d-4384-9874-95baa7fdbda3\" (UID: \"78b14379-e44d-4384-9874-95baa7fdbda3\") " Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.385548 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-utilities" (OuterVolumeSpecName: 
"utilities") pod "78b14379-e44d-4384-9874-95baa7fdbda3" (UID: "78b14379-e44d-4384-9874-95baa7fdbda3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.403771 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b14379-e44d-4384-9874-95baa7fdbda3-kube-api-access-fndbm" (OuterVolumeSpecName: "kube-api-access-fndbm") pod "78b14379-e44d-4384-9874-95baa7fdbda3" (UID: "78b14379-e44d-4384-9874-95baa7fdbda3"). InnerVolumeSpecName "kube-api-access-fndbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.452161 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78b14379-e44d-4384-9874-95baa7fdbda3" (UID: "78b14379-e44d-4384-9874-95baa7fdbda3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.486119 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fndbm\" (UniqueName: \"kubernetes.io/projected/78b14379-e44d-4384-9874-95baa7fdbda3-kube-api-access-fndbm\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.486156 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.486167 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78b14379-e44d-4384-9874-95baa7fdbda3-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.727501 4917 generic.go:334] "Generic (PLEG): container finished" podID="78b14379-e44d-4384-9874-95baa7fdbda3" containerID="94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034" exitCode=0 Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.727779 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzv2n" event={"ID":"78b14379-e44d-4384-9874-95baa7fdbda3","Type":"ContainerDied","Data":"94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034"} Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.727807 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzv2n" event={"ID":"78b14379-e44d-4384-9874-95baa7fdbda3","Type":"ContainerDied","Data":"f80e041b204a962e4b1711007b8ec3c861f7af5327a2637211dbdf4859a62651"} Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.727845 4917 scope.go:117] "RemoveContainer" containerID="94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 
09:30:55.728047 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzv2n" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.795769 4917 scope.go:117] "RemoveContainer" containerID="533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.798701 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.798723 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzv2n"] Mar 18 09:30:55 crc kubenswrapper[4917]: E0318 09:30:55.799622 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.800105 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wzv2n"] Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.839060 4917 scope.go:117] "RemoveContainer" containerID="163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.888076 4917 scope.go:117] "RemoveContainer" containerID="94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034" Mar 18 09:30:55 crc kubenswrapper[4917]: E0318 09:30:55.888946 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034\": container with ID starting with 
94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034 not found: ID does not exist" containerID="94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.888993 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034"} err="failed to get container status \"94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034\": rpc error: code = NotFound desc = could not find container \"94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034\": container with ID starting with 94b851f5d0b682af7297167b59f6846fd91a256f7e6398427b6233ae3f1dc034 not found: ID does not exist" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.889026 4917 scope.go:117] "RemoveContainer" containerID="533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0" Mar 18 09:30:55 crc kubenswrapper[4917]: E0318 09:30:55.889472 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0\": container with ID starting with 533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0 not found: ID does not exist" containerID="533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.889505 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0"} err="failed to get container status \"533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0\": rpc error: code = NotFound desc = could not find container \"533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0\": container with ID starting with 533f4fd0c3027310c8795c74cf8d9226d0d004555698a9be68e9474d91f627b0 not found: ID does not 
exist" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.889521 4917 scope.go:117] "RemoveContainer" containerID="163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792" Mar 18 09:30:55 crc kubenswrapper[4917]: E0318 09:30:55.889864 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792\": container with ID starting with 163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792 not found: ID does not exist" containerID="163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792" Mar 18 09:30:55 crc kubenswrapper[4917]: I0318 09:30:55.889889 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792"} err="failed to get container status \"163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792\": rpc error: code = NotFound desc = could not find container \"163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792\": container with ID starting with 163ea8aa086db113d3df3eebad46b706931b732fb31a1916e80f534ee3f29792 not found: ID does not exist" Mar 18 09:30:57 crc kubenswrapper[4917]: I0318 09:30:57.786442 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b14379-e44d-4384-9874-95baa7fdbda3" path="/var/lib/kubelet/pods/78b14379-e44d-4384-9874-95baa7fdbda3/volumes" Mar 18 09:31:06 crc kubenswrapper[4917]: I0318 09:31:06.773412 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:31:06 crc kubenswrapper[4917]: E0318 09:31:06.774258 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:31:18 crc kubenswrapper[4917]: I0318 09:31:18.773223 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:31:18 crc kubenswrapper[4917]: E0318 09:31:18.774258 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.117737 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2d9bv"] Mar 18 09:31:28 crc kubenswrapper[4917]: E0318 09:31:28.118725 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b14379-e44d-4384-9874-95baa7fdbda3" containerName="extract-utilities" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.118741 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b14379-e44d-4384-9874-95baa7fdbda3" containerName="extract-utilities" Mar 18 09:31:28 crc kubenswrapper[4917]: E0318 09:31:28.118760 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b14379-e44d-4384-9874-95baa7fdbda3" containerName="extract-content" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.118767 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b14379-e44d-4384-9874-95baa7fdbda3" containerName="extract-content" Mar 18 09:31:28 crc kubenswrapper[4917]: E0318 09:31:28.118798 4917 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78b14379-e44d-4384-9874-95baa7fdbda3" containerName="registry-server" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.118818 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b14379-e44d-4384-9874-95baa7fdbda3" containerName="registry-server" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.119051 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b14379-e44d-4384-9874-95baa7fdbda3" containerName="registry-server" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.120998 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.141087 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2d9bv"] Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.252445 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwktk\" (UniqueName: \"kubernetes.io/projected/cce9b40f-ea1e-4887-bf52-65ca42fd2567-kube-api-access-dwktk\") pod \"redhat-operators-2d9bv\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.252950 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-utilities\") pod \"redhat-operators-2d9bv\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.253084 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-catalog-content\") pod \"redhat-operators-2d9bv\" (UID: 
\"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.355532 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwktk\" (UniqueName: \"kubernetes.io/projected/cce9b40f-ea1e-4887-bf52-65ca42fd2567-kube-api-access-dwktk\") pod \"redhat-operators-2d9bv\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.355903 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-utilities\") pod \"redhat-operators-2d9bv\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.356046 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-catalog-content\") pod \"redhat-operators-2d9bv\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.356326 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-utilities\") pod \"redhat-operators-2d9bv\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.356442 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-catalog-content\") pod \"redhat-operators-2d9bv\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " 
pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.382444 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwktk\" (UniqueName: \"kubernetes.io/projected/cce9b40f-ea1e-4887-bf52-65ca42fd2567-kube-api-access-dwktk\") pod \"redhat-operators-2d9bv\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.460784 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:28 crc kubenswrapper[4917]: I0318 09:31:28.959007 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2d9bv"] Mar 18 09:31:29 crc kubenswrapper[4917]: I0318 09:31:29.145071 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d9bv" event={"ID":"cce9b40f-ea1e-4887-bf52-65ca42fd2567","Type":"ContainerStarted","Data":"da4a08cebce66739f9583780f876b906a6cf11122285d05dc74b0db3209be167"} Mar 18 09:31:29 crc kubenswrapper[4917]: I0318 09:31:29.147933 4917 generic.go:334] "Generic (PLEG): container finished" podID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerID="db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84" exitCode=0 Mar 18 09:31:29 crc kubenswrapper[4917]: I0318 09:31:29.148023 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fplc/must-gather-2bgws" event={"ID":"f12266ce-503e-4e3d-bbe4-c9e365a82fa3","Type":"ContainerDied","Data":"db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84"} Mar 18 09:31:29 crc kubenswrapper[4917]: I0318 09:31:29.148769 4917 scope.go:117] "RemoveContainer" containerID="db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84" Mar 18 09:31:29 crc kubenswrapper[4917]: I0318 09:31:29.664738 4917 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-9fplc_must-gather-2bgws_f12266ce-503e-4e3d-bbe4-c9e365a82fa3/gather/0.log" Mar 18 09:31:30 crc kubenswrapper[4917]: I0318 09:31:30.161359 4917 generic.go:334] "Generic (PLEG): container finished" podID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerID="2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311" exitCode=0 Mar 18 09:31:30 crc kubenswrapper[4917]: I0318 09:31:30.161402 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d9bv" event={"ID":"cce9b40f-ea1e-4887-bf52-65ca42fd2567","Type":"ContainerDied","Data":"2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311"} Mar 18 09:31:31 crc kubenswrapper[4917]: I0318 09:31:31.774776 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:31:31 crc kubenswrapper[4917]: E0318 09:31:31.778682 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:31:32 crc kubenswrapper[4917]: I0318 09:31:32.182172 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d9bv" event={"ID":"cce9b40f-ea1e-4887-bf52-65ca42fd2567","Type":"ContainerStarted","Data":"f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5"} Mar 18 09:31:34 crc kubenswrapper[4917]: I0318 09:31:34.203760 4917 generic.go:334] "Generic (PLEG): container finished" podID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerID="f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5" exitCode=0 Mar 18 09:31:34 crc kubenswrapper[4917]: I0318 
09:31:34.203869 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d9bv" event={"ID":"cce9b40f-ea1e-4887-bf52-65ca42fd2567","Type":"ContainerDied","Data":"f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5"} Mar 18 09:31:35 crc kubenswrapper[4917]: I0318 09:31:35.215637 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d9bv" event={"ID":"cce9b40f-ea1e-4887-bf52-65ca42fd2567","Type":"ContainerStarted","Data":"38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13"} Mar 18 09:31:35 crc kubenswrapper[4917]: I0318 09:31:35.237191 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2d9bv" podStartSLOduration=2.569635919 podStartE2EDuration="7.237173849s" podCreationTimestamp="2026-03-18 09:31:28 +0000 UTC" firstStartedPulling="2026-03-18 09:31:30.165029947 +0000 UTC m=+9875.106184661" lastFinishedPulling="2026-03-18 09:31:34.832567837 +0000 UTC m=+9879.773722591" observedRunningTime="2026-03-18 09:31:35.233211853 +0000 UTC m=+9880.174366577" watchObservedRunningTime="2026-03-18 09:31:35.237173849 +0000 UTC m=+9880.178328573" Mar 18 09:31:38 crc kubenswrapper[4917]: I0318 09:31:38.462043 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:38 crc kubenswrapper[4917]: I0318 09:31:38.463376 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:39 crc kubenswrapper[4917]: I0318 09:31:39.277506 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9fplc/must-gather-2bgws"] Mar 18 09:31:39 crc kubenswrapper[4917]: I0318 09:31:39.277849 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9fplc/must-gather-2bgws" 
podUID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerName="copy" containerID="cri-o://8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a" gracePeriod=2 Mar 18 09:31:39 crc kubenswrapper[4917]: I0318 09:31:39.286496 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9fplc/must-gather-2bgws"] Mar 18 09:31:39 crc kubenswrapper[4917]: I0318 09:31:39.544120 4917 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2d9bv" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="registry-server" probeResult="failure" output=< Mar 18 09:31:39 crc kubenswrapper[4917]: timeout: failed to connect service ":50051" within 1s Mar 18 09:31:39 crc kubenswrapper[4917]: > Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.265105 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9fplc_must-gather-2bgws_f12266ce-503e-4e3d-bbe4-c9e365a82fa3/copy/0.log" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.265118 4917 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9fplc_must-gather-2bgws_f12266ce-503e-4e3d-bbe4-c9e365a82fa3/copy/0.log" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.265730 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.265896 4917 generic.go:334] "Generic (PLEG): container finished" podID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerID="8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a" exitCode=143 Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.265965 4917 scope.go:117] "RemoveContainer" containerID="8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.294468 4917 scope.go:117] "RemoveContainer" containerID="db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.379910 4917 scope.go:117] "RemoveContainer" containerID="8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a" Mar 18 09:31:40 crc kubenswrapper[4917]: E0318 09:31:40.380284 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a\": container with ID starting with 8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a not found: ID does not exist" containerID="8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.380322 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a"} err="failed to get container status \"8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a\": rpc error: code = NotFound desc = could not find container \"8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a\": container with ID starting with 8f1e61577226860cf992b1e232ba337bd65d558697835e14bbaf0f86ffdeff3a not found: ID does not exist" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.380348 4917 
scope.go:117] "RemoveContainer" containerID="db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84" Mar 18 09:31:40 crc kubenswrapper[4917]: E0318 09:31:40.380616 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84\": container with ID starting with db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84 not found: ID does not exist" containerID="db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.380638 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84"} err="failed to get container status \"db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84\": rpc error: code = NotFound desc = could not find container \"db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84\": container with ID starting with db1b9dc1b375588604d3ff92d351209bca2e8d949838a9f382aa5f0a32205c84 not found: ID does not exist" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.447109 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmz8l\" (UniqueName: \"kubernetes.io/projected/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-kube-api-access-tmz8l\") pod \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\" (UID: \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\") " Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.447208 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-must-gather-output\") pod \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\" (UID: \"f12266ce-503e-4e3d-bbe4-c9e365a82fa3\") " Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.455156 4917 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-kube-api-access-tmz8l" (OuterVolumeSpecName: "kube-api-access-tmz8l") pod "f12266ce-503e-4e3d-bbe4-c9e365a82fa3" (UID: "f12266ce-503e-4e3d-bbe4-c9e365a82fa3"). InnerVolumeSpecName "kube-api-access-tmz8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.549505 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmz8l\" (UniqueName: \"kubernetes.io/projected/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-kube-api-access-tmz8l\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.658915 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f12266ce-503e-4e3d-bbe4-c9e365a82fa3" (UID: "f12266ce-503e-4e3d-bbe4-c9e365a82fa3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:31:40 crc kubenswrapper[4917]: I0318 09:31:40.753017 4917 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f12266ce-503e-4e3d-bbe4-c9e365a82fa3-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:41 crc kubenswrapper[4917]: I0318 09:31:41.275403 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fplc/must-gather-2bgws" Mar 18 09:31:41 crc kubenswrapper[4917]: I0318 09:31:41.785890 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" path="/var/lib/kubelet/pods/f12266ce-503e-4e3d-bbe4-c9e365a82fa3/volumes" Mar 18 09:31:42 crc kubenswrapper[4917]: I0318 09:31:42.773320 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb" Mar 18 09:31:42 crc kubenswrapper[4917]: E0318 09:31:42.773990 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272" Mar 18 09:31:48 crc kubenswrapper[4917]: I0318 09:31:48.543337 4917 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:48 crc kubenswrapper[4917]: I0318 09:31:48.629329 4917 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:48 crc kubenswrapper[4917]: I0318 09:31:48.788186 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2d9bv"] Mar 18 09:31:50 crc kubenswrapper[4917]: I0318 09:31:50.369994 4917 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2d9bv" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="registry-server" containerID="cri-o://38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13" gracePeriod=2 Mar 18 09:31:50 crc kubenswrapper[4917]: I0318 09:31:50.837227 4917 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2d9bv" Mar 18 09:31:50 crc kubenswrapper[4917]: I0318 09:31:50.996700 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-utilities\") pod \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " Mar 18 09:31:50 crc kubenswrapper[4917]: I0318 09:31:50.997178 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-catalog-content\") pod \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " Mar 18 09:31:50 crc kubenswrapper[4917]: I0318 09:31:50.997381 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwktk\" (UniqueName: \"kubernetes.io/projected/cce9b40f-ea1e-4887-bf52-65ca42fd2567-kube-api-access-dwktk\") pod \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\" (UID: \"cce9b40f-ea1e-4887-bf52-65ca42fd2567\") " Mar 18 09:31:50 crc kubenswrapper[4917]: I0318 09:31:50.998514 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-utilities" (OuterVolumeSpecName: "utilities") pod "cce9b40f-ea1e-4887-bf52-65ca42fd2567" (UID: "cce9b40f-ea1e-4887-bf52-65ca42fd2567"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.005833 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce9b40f-ea1e-4887-bf52-65ca42fd2567-kube-api-access-dwktk" (OuterVolumeSpecName: "kube-api-access-dwktk") pod "cce9b40f-ea1e-4887-bf52-65ca42fd2567" (UID: "cce9b40f-ea1e-4887-bf52-65ca42fd2567"). 
InnerVolumeSpecName "kube-api-access-dwktk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.101097 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwktk\" (UniqueName: \"kubernetes.io/projected/cce9b40f-ea1e-4887-bf52-65ca42fd2567-kube-api-access-dwktk\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.101137 4917 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-utilities\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.158864 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cce9b40f-ea1e-4887-bf52-65ca42fd2567" (UID: "cce9b40f-ea1e-4887-bf52-65ca42fd2567"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.202759 4917 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cce9b40f-ea1e-4887-bf52-65ca42fd2567-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.393070 4917 generic.go:334] "Generic (PLEG): container finished" podID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerID="38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13" exitCode=0 Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.393118 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d9bv" event={"ID":"cce9b40f-ea1e-4887-bf52-65ca42fd2567","Type":"ContainerDied","Data":"38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13"} Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.393146 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2d9bv" event={"ID":"cce9b40f-ea1e-4887-bf52-65ca42fd2567","Type":"ContainerDied","Data":"da4a08cebce66739f9583780f876b906a6cf11122285d05dc74b0db3209be167"} Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.393163 4917 scope.go:117] "RemoveContainer" containerID="38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13" Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.395032 4917 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2d9bv"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.417961 4917 scope.go:117] "RemoveContainer" containerID="f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.442780 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2d9bv"]
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.453906 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2d9bv"]
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.480031 4917 scope.go:117] "RemoveContainer" containerID="2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.513904 4917 scope.go:117] "RemoveContainer" containerID="38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13"
Mar 18 09:31:51 crc kubenswrapper[4917]: E0318 09:31:51.514491 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13\": container with ID starting with 38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13 not found: ID does not exist" containerID="38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.514556 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13"} err="failed to get container status \"38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13\": rpc error: code = NotFound desc = could not find container \"38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13\": container with ID starting with 38a77ca6c1d9d0b1afbc6a93e7327d3e31ff69caa6add1cfad0e3299c2ac2a13 not found: ID does not exist"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.514727 4917 scope.go:117] "RemoveContainer" containerID="f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5"
Mar 18 09:31:51 crc kubenswrapper[4917]: E0318 09:31:51.515174 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5\": container with ID starting with f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5 not found: ID does not exist" containerID="f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.515236 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5"} err="failed to get container status \"f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5\": rpc error: code = NotFound desc = could not find container \"f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5\": container with ID starting with f8d2dd278eb6d09f6d7717f779627c6f78e6afeb00407a5249245d0128b00bf5 not found: ID does not exist"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.515272 4917 scope.go:117] "RemoveContainer" containerID="2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311"
Mar 18 09:31:51 crc kubenswrapper[4917]: E0318 09:31:51.515562 4917 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311\": container with ID starting with 2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311 not found: ID does not exist" containerID="2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.515615 4917 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311"} err="failed to get container status \"2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311\": rpc error: code = NotFound desc = could not find container \"2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311\": container with ID starting with 2ebf22703f2a0d19ee518c4e4cb20b6325c9b08d8ea5ac0efdf039d269054311 not found: ID does not exist"
Mar 18 09:31:51 crc kubenswrapper[4917]: I0318 09:31:51.782854 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" path="/var/lib/kubelet/pods/cce9b40f-ea1e-4887-bf52-65ca42fd2567/volumes"
Mar 18 09:31:55 crc kubenswrapper[4917]: I0318 09:31:55.789219 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:31:55 crc kubenswrapper[4917]: E0318 09:31:55.789898 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.145653 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563772-ff8xc"]
Mar 18 09:32:00 crc kubenswrapper[4917]: E0318 09:32:00.146568 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerName="gather"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.146600 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerName="gather"
Mar 18 09:32:00 crc kubenswrapper[4917]: E0318 09:32:00.146617 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerName="copy"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.146641 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerName="copy"
Mar 18 09:32:00 crc kubenswrapper[4917]: E0318 09:32:00.146661 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="extract-utilities"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.146670 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="extract-utilities"
Mar 18 09:32:00 crc kubenswrapper[4917]: E0318 09:32:00.146716 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="extract-content"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.146733 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="extract-content"
Mar 18 09:32:00 crc kubenswrapper[4917]: E0318 09:32:00.146751 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="registry-server"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.146760 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="registry-server"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.146983 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce9b40f-ea1e-4887-bf52-65ca42fd2567" containerName="registry-server"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.147001 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerName="copy"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.147034 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12266ce-503e-4e3d-bbe4-c9e365a82fa3" containerName="gather"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.147826 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-ff8xc"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.151336 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.151548 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.151683 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.158060 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-ff8xc"]
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.306553 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w6tg\" (UniqueName: \"kubernetes.io/projected/d9f7c6ce-46b5-46aa-99c5-23372215fcb5-kube-api-access-9w6tg\") pod \"auto-csr-approver-29563772-ff8xc\" (UID: \"d9f7c6ce-46b5-46aa-99c5-23372215fcb5\") " pod="openshift-infra/auto-csr-approver-29563772-ff8xc"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.407883 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w6tg\" (UniqueName: \"kubernetes.io/projected/d9f7c6ce-46b5-46aa-99c5-23372215fcb5-kube-api-access-9w6tg\") pod \"auto-csr-approver-29563772-ff8xc\" (UID: \"d9f7c6ce-46b5-46aa-99c5-23372215fcb5\") " pod="openshift-infra/auto-csr-approver-29563772-ff8xc"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.439444 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w6tg\" (UniqueName: \"kubernetes.io/projected/d9f7c6ce-46b5-46aa-99c5-23372215fcb5-kube-api-access-9w6tg\") pod \"auto-csr-approver-29563772-ff8xc\" (UID: \"d9f7c6ce-46b5-46aa-99c5-23372215fcb5\") " pod="openshift-infra/auto-csr-approver-29563772-ff8xc"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.486778 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-ff8xc"
Mar 18 09:32:00 crc kubenswrapper[4917]: I0318 09:32:00.986230 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563772-ff8xc"]
Mar 18 09:32:01 crc kubenswrapper[4917]: I0318 09:32:01.533250 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-ff8xc" event={"ID":"d9f7c6ce-46b5-46aa-99c5-23372215fcb5","Type":"ContainerStarted","Data":"8066e6a9d1efd51ebe845d47edff62d1eb5dca1bb3c99087058290e5b34583d4"}
Mar 18 09:32:02 crc kubenswrapper[4917]: I0318 09:32:02.549200 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-ff8xc" event={"ID":"d9f7c6ce-46b5-46aa-99c5-23372215fcb5","Type":"ContainerStarted","Data":"7ddb42c101302454119077f60d8039bfe44cbd85f641e5dcc73934dd07ff17a2"}
Mar 18 09:32:02 crc kubenswrapper[4917]: I0318 09:32:02.572734 4917 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29563772-ff8xc" podStartSLOduration=1.5541125980000001 podStartE2EDuration="2.572713194s" podCreationTimestamp="2026-03-18 09:32:00 +0000 UTC" firstStartedPulling="2026-03-18 09:32:00.999452232 +0000 UTC m=+9905.940606966" lastFinishedPulling="2026-03-18 09:32:02.018052848 +0000 UTC m=+9906.959207562" observedRunningTime="2026-03-18 09:32:02.567780014 +0000 UTC m=+9907.508934768" watchObservedRunningTime="2026-03-18 09:32:02.572713194 +0000 UTC m=+9907.513867918"
Mar 18 09:32:03 crc kubenswrapper[4917]: I0318 09:32:03.560317 4917 generic.go:334] "Generic (PLEG): container finished" podID="d9f7c6ce-46b5-46aa-99c5-23372215fcb5" containerID="7ddb42c101302454119077f60d8039bfe44cbd85f641e5dcc73934dd07ff17a2" exitCode=0
Mar 18 09:32:03 crc kubenswrapper[4917]: I0318 09:32:03.560367 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-ff8xc" event={"ID":"d9f7c6ce-46b5-46aa-99c5-23372215fcb5","Type":"ContainerDied","Data":"7ddb42c101302454119077f60d8039bfe44cbd85f641e5dcc73934dd07ff17a2"}
Mar 18 09:32:04 crc kubenswrapper[4917]: I0318 09:32:04.925858 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-ff8xc"
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.117231 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w6tg\" (UniqueName: \"kubernetes.io/projected/d9f7c6ce-46b5-46aa-99c5-23372215fcb5-kube-api-access-9w6tg\") pod \"d9f7c6ce-46b5-46aa-99c5-23372215fcb5\" (UID: \"d9f7c6ce-46b5-46aa-99c5-23372215fcb5\") "
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.128759 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f7c6ce-46b5-46aa-99c5-23372215fcb5-kube-api-access-9w6tg" (OuterVolumeSpecName: "kube-api-access-9w6tg") pod "d9f7c6ce-46b5-46aa-99c5-23372215fcb5" (UID: "d9f7c6ce-46b5-46aa-99c5-23372215fcb5"). InnerVolumeSpecName "kube-api-access-9w6tg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.221886 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w6tg\" (UniqueName: \"kubernetes.io/projected/d9f7c6ce-46b5-46aa-99c5-23372215fcb5-kube-api-access-9w6tg\") on node \"crc\" DevicePath \"\""
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.583839 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563772-ff8xc" event={"ID":"d9f7c6ce-46b5-46aa-99c5-23372215fcb5","Type":"ContainerDied","Data":"8066e6a9d1efd51ebe845d47edff62d1eb5dca1bb3c99087058290e5b34583d4"}
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.583876 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8066e6a9d1efd51ebe845d47edff62d1eb5dca1bb3c99087058290e5b34583d4"
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.583923 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563772-ff8xc"
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.642037 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-zclgd"]
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.657058 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563766-zclgd"]
Mar 18 09:32:05 crc kubenswrapper[4917]: I0318 09:32:05.797724 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8e855c-339f-497a-999f-4c3ea56327cb" path="/var/lib/kubelet/pods/db8e855c-339f-497a-999f-4c3ea56327cb/volumes"
Mar 18 09:32:08 crc kubenswrapper[4917]: I0318 09:32:08.772458 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:32:08 crc kubenswrapper[4917]: E0318 09:32:08.773552 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272"
Mar 18 09:32:18 crc kubenswrapper[4917]: I0318 09:32:18.525329 4917 scope.go:117] "RemoveContainer" containerID="33b9f5ba629ab653693bfea20fb75364e2e5d97491f8438435ed075a9fb8e53a"
Mar 18 09:32:19 crc kubenswrapper[4917]: I0318 09:32:19.772653 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:32:19 crc kubenswrapper[4917]: E0318 09:32:19.773190 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272"
Mar 18 09:32:31 crc kubenswrapper[4917]: I0318 09:32:31.774818 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:32:31 crc kubenswrapper[4917]: E0318 09:32:31.775655 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272"
Mar 18 09:32:45 crc kubenswrapper[4917]: I0318 09:32:45.785365 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:32:45 crc kubenswrapper[4917]: E0318 09:32:45.789430 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272"
Mar 18 09:32:59 crc kubenswrapper[4917]: I0318 09:32:59.774789 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:32:59 crc kubenswrapper[4917]: E0318 09:32:59.776368 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272"
Mar 18 09:33:11 crc kubenswrapper[4917]: I0318 09:33:11.773627 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:33:11 crc kubenswrapper[4917]: E0318 09:33:11.774890 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272"
Mar 18 09:33:22 crc kubenswrapper[4917]: I0318 09:33:22.772783 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:33:22 crc kubenswrapper[4917]: E0318 09:33:22.774020 4917 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xp5xk_openshift-machine-config-operator(cc04c58e-83bd-4c0c-b58a-da7dea820272)\"" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" podUID="cc04c58e-83bd-4c0c-b58a-da7dea820272"
Mar 18 09:33:34 crc kubenswrapper[4917]: I0318 09:33:34.773942 4917 scope.go:117] "RemoveContainer" containerID="cb06fc19d8d9faa7b293e89f98b509c91a7d57c5fa31b296d2b641e5ff0a0ebb"
Mar 18 09:33:35 crc kubenswrapper[4917]: I0318 09:33:35.686036 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xp5xk" event={"ID":"cc04c58e-83bd-4c0c-b58a-da7dea820272","Type":"ContainerStarted","Data":"d4e6faba9d6ffaea5e0891bfda6b0a24af9b0834cefb8b49de278d8c3580e05e"}
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.138847 4917 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29563774-5x7lw"]
Mar 18 09:34:00 crc kubenswrapper[4917]: E0318 09:34:00.139957 4917 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f7c6ce-46b5-46aa-99c5-23372215fcb5" containerName="oc"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.139975 4917 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f7c6ce-46b5-46aa-99c5-23372215fcb5" containerName="oc"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.140236 4917 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f7c6ce-46b5-46aa-99c5-23372215fcb5" containerName="oc"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.141109 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-5x7lw"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.143116 4917 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-6qjhf"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.143822 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.143822 4917 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.160766 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-5x7lw"]
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.298687 4917 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96zkc\" (UniqueName: \"kubernetes.io/projected/29f70947-dd62-4913-aa2a-d8fcef5168b2-kube-api-access-96zkc\") pod \"auto-csr-approver-29563774-5x7lw\" (UID: \"29f70947-dd62-4913-aa2a-d8fcef5168b2\") " pod="openshift-infra/auto-csr-approver-29563774-5x7lw"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.400718 4917 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96zkc\" (UniqueName: \"kubernetes.io/projected/29f70947-dd62-4913-aa2a-d8fcef5168b2-kube-api-access-96zkc\") pod \"auto-csr-approver-29563774-5x7lw\" (UID: \"29f70947-dd62-4913-aa2a-d8fcef5168b2\") " pod="openshift-infra/auto-csr-approver-29563774-5x7lw"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.424653 4917 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96zkc\" (UniqueName: \"kubernetes.io/projected/29f70947-dd62-4913-aa2a-d8fcef5168b2-kube-api-access-96zkc\") pod \"auto-csr-approver-29563774-5x7lw\" (UID: \"29f70947-dd62-4913-aa2a-d8fcef5168b2\") " pod="openshift-infra/auto-csr-approver-29563774-5x7lw"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.458513 4917 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-5x7lw"
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.907484 4917 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29563774-5x7lw"]
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.922142 4917 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 18 09:34:00 crc kubenswrapper[4917]: I0318 09:34:00.961389 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-5x7lw" event={"ID":"29f70947-dd62-4913-aa2a-d8fcef5168b2","Type":"ContainerStarted","Data":"270752031e384f74f675cac4331d6f507aa65a2923a44f57a0c2149a97b08897"}
Mar 18 09:34:02 crc kubenswrapper[4917]: I0318 09:34:02.982295 4917 generic.go:334] "Generic (PLEG): container finished" podID="29f70947-dd62-4913-aa2a-d8fcef5168b2" containerID="de965e79008b4197f5a3ccbd2f9165336ce731c0ba98522960824f8dba02e873" exitCode=0
Mar 18 09:34:02 crc kubenswrapper[4917]: I0318 09:34:02.982813 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-5x7lw" event={"ID":"29f70947-dd62-4913-aa2a-d8fcef5168b2","Type":"ContainerDied","Data":"de965e79008b4197f5a3ccbd2f9165336ce731c0ba98522960824f8dba02e873"}
Mar 18 09:34:04 crc kubenswrapper[4917]: I0318 09:34:04.404948 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-5x7lw"
Mar 18 09:34:04 crc kubenswrapper[4917]: I0318 09:34:04.509073 4917 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96zkc\" (UniqueName: \"kubernetes.io/projected/29f70947-dd62-4913-aa2a-d8fcef5168b2-kube-api-access-96zkc\") pod \"29f70947-dd62-4913-aa2a-d8fcef5168b2\" (UID: \"29f70947-dd62-4913-aa2a-d8fcef5168b2\") "
Mar 18 09:34:04 crc kubenswrapper[4917]: I0318 09:34:04.514291 4917 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f70947-dd62-4913-aa2a-d8fcef5168b2-kube-api-access-96zkc" (OuterVolumeSpecName: "kube-api-access-96zkc") pod "29f70947-dd62-4913-aa2a-d8fcef5168b2" (UID: "29f70947-dd62-4913-aa2a-d8fcef5168b2"). InnerVolumeSpecName "kube-api-access-96zkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 18 09:34:04 crc kubenswrapper[4917]: I0318 09:34:04.611452 4917 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96zkc\" (UniqueName: \"kubernetes.io/projected/29f70947-dd62-4913-aa2a-d8fcef5168b2-kube-api-access-96zkc\") on node \"crc\" DevicePath \"\""
Mar 18 09:34:05 crc kubenswrapper[4917]: I0318 09:34:05.011611 4917 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29563774-5x7lw" event={"ID":"29f70947-dd62-4913-aa2a-d8fcef5168b2","Type":"ContainerDied","Data":"270752031e384f74f675cac4331d6f507aa65a2923a44f57a0c2149a97b08897"}
Mar 18 09:34:05 crc kubenswrapper[4917]: I0318 09:34:05.011668 4917 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="270752031e384f74f675cac4331d6f507aa65a2923a44f57a0c2149a97b08897"
Mar 18 09:34:05 crc kubenswrapper[4917]: I0318 09:34:05.011778 4917 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29563774-5x7lw"
Mar 18 09:34:05 crc kubenswrapper[4917]: I0318 09:34:05.494459 4917 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-sb4nb"]
Mar 18 09:34:05 crc kubenswrapper[4917]: I0318 09:34:05.506781 4917 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29563768-sb4nb"]
Mar 18 09:34:05 crc kubenswrapper[4917]: I0318 09:34:05.789684 4917 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f6a538-34da-4ac9-86bd-58f3eb8675a4" path="/var/lib/kubelet/pods/29f6a538-34da-4ac9-86bd-58f3eb8675a4/volumes"
Mar 18 09:34:18 crc kubenswrapper[4917]: I0318 09:34:18.702351 4917 scope.go:117] "RemoveContainer" containerID="4c49d69369f1bf1f0501e8e84bc1bdbc0ec34214906ef64c871923e3463e0b98"